Q Project Launch – Brings schools and research evidence closer

The Q Project is looking at how we can improve education and what role research can play in that process.
Author
E4L

On Monday 26 November, Evidence for Learning attended the launch of the Q Project, a new five-year Monash University project to help schools use research evidence to improve teaching and learning, made possible by a $6.3 million grant from the Paul Ramsay Foundation. Below, we provide edited transcripts from the project lead and a project partner, both of whom also have close connections to Evidence for Learning’s work.

Blog • 5 minutes

Associate Professor Mark Rickinson

Q Project Director – Monash University Faculty of Education

Research Use & Evaluation Committee member – Evidence for Learning

I want to start by thanking you all for being here.

One of the main aspirations of the Q Project is to bring together practitioners, policy-makers, researchers, research brokers, and other players in order to spark new conversations about the role and use of research evidence in education. So thank you for helping us take one step towards that aspiration today.

You’ve now heard a little about what the Q Project is, so I want to explain why we think it is important.

From a policy perspective, recent years have seen increasing recognition of the importance of evidence in education. The recent Gonski 2.0 report, for example, makes clear that ‘to sustain continuous improvement, Australian schools need access to valid and reliable evidence’.

From the perspective of practice, you don’t need to spend long with school leaders like Pitsa Binion and school teachers like Ollie Lovell to realise that using research evidence well in the day-to-day life of a school is far from straightforward.

The Q Project will respond to this by approaching evidence use as a professional learning challenge (not an information transfer challenge), and then working with educators to open up and unpack what is involved in using evidence well in different school contexts.

Finally, from the perspective of research, you might be surprised to hear that there have been hardly any recent empirical studies of research use in Australian schools. In other words, we know very little about whether and how Australian educators are using research evidence.

The Q Project will therefore be addressing a significant gap in our current understanding, but importantly, it will be doing this in a distinctive way.

A number of partner organisations were critical to making this project possible:

  • Matthew Deeble, Danielle Toon and Tanya Vaughan at Evidence for Learning
  • Liam Smith and colleagues at BehaviourWorks Australia
  • Simon Kent, Bruce Armstrong, Stephen Fraser and others at the Victorian Department of Education and Training, along with their equivalents in South Australia (Martin Westwell), Queensland (Deb Kember and Ang Ferguson) and New South Wales (Jenny Donovan and Rob Stevens)
  • Neil Barker, Matt Tibble and Chris Dickinson at the Bastow Institute of Educational Leadership
  • Shani Prendergast at Catholic Education Melbourne
  • Clinton Milroy and Christion Meyer at the Australian Institute for Teaching and School Leadership (AITSL)
  • A number of school principals and teachers.

Ollie Lovell

Head of Senior Maths – Sunshine College

Founder – Education Research Reading Room (ERRR Podcast)

I’d like to start today by painting a picture for you. A picture of a passionate, yet busy, school leader, trying to address a challenge within her school. Let’s call her Marnee.

Marnee has identified that the literacy outcomes within her school are lower than they should be, and that action needs to be taken to remedy this. She’s approached by an organisation, let’s call them ‘literacy winners’, and after a phone call, a meeting, and a visit to another school at which another principal speaks passionately about the program, she decides to give it a go.

Fast forward two years and, as Marnee looks back over the process, she wonders if she made the right decision. Was ‘literacy winners’ the right approach for her students, and how does she know whether to stick with it or to change tack?

Considering this scenario, which is very similar to situations I’ve seen several times during my time in schools, I feel we can ask ourselves a similar question: ‘How do we know whether or not Marnee made a good choice?’

If we break it down further into what we could call the various phases of the decision-making process (the first phone call, the first meeting, the school visit, the program implementation, and now the monitoring of the program), what would constitute ‘Quality evidence use’ at each of these key decision points along the way?

  • What kind of questions should Marnee have asked during that first phone call?
  • What kind of evidence should she have asked for during that first meeting?
  • Which metrics should she be considering when evaluating the program in her own context?
  • And on what grounds should she base the decision whether to stick with, or pivot away from, the ‘literacy winners’ approach?

Through working in schools, reading widely, and interviewing education thought leaders for the Education Research Reading Room podcast, it has become clear to me that:

Quality evidence use in education is different from evidence use in medicine and other fields. In medicine, the goals are fairly well defined: doctors have traditionally tried to keep people alive and free from pain. It isn’t quite as clear cut in education. Ask 100 educators what the purpose of education is and you’ll get 100 different answers. This isn’t something to lament; in fact, it is something to celebrate. But it is also something we need to actively acknowledge and tackle head on.

As a profession, we’ve spent much time identifying what is and isn’t good evidence. We’ve argued about the relevance of randomised controlled trials to education, and I personally have spent a lot of time over the past year or so trying to work out whether ranking aggregated effect sizes makes any sense at all. But these debates, about what we could call defining ‘Quality evidence’, are only a small part of the ‘Quality evidence use’ picture.

To put this another way, and to quote Adrian Simpson,

This more complex set of questions addresses not just the reductionist cause-and-effect relationships that much scientific research rightly strives to determine, but also the multiple, and sometimes contradictory, values, goals and opportunity costs that educators have the joy of contending with each day.

The Q Project aims to capture what such quality evidence use looks like in practice, and to develop professional learning that empowers educators, educators like Marnee, to ask better questions at the key decision points throughout their journeys.

Giving educators the tools to be more critical consumers of research will help the profession to guard against the fads, swindlers, and dead ends that have plagued us to date, and to embrace the approaches that will more robustly support student learning into the future. And this can only be a good thing.

Watch a video about the Q Project

The Q Project – Quality Use of Evidence Driving Quality Education

Further Information

Interested schools and organisations can contact edu-qproject.enquiries@monash.edu to find out how to get involved in the Q Project.