ResearchEd14: The EEF… with great budgets comes great responsibility.

“If ResearchEd had existed in 2011, we would have launched the toolkit here.”

So said Dr. Lee Elliot Major in the opening session of this year’s conference. The aims of both organisations are similar – although, as more than one speaker pointed out, the budgets certainly aren’t – working out what works in schools and making sure practitioners’ decisions are based on evidence. It should be noted that ‘what works’, in this context, relates to improved academic attainment. Whether or not this is a valid outcome, or one that is shared by all practitioners, is explored by Robert Peal here, but for the purposes of this post I’m going to assume that everyone is on board with undertaking research to see what best raises academic attainment. 

What I want to argue is this: if the EEF, through large research projects, identify an incredibly effective practice in schools, do they have a moral imperative to prescribe it? I would have thought it uncontroversial to suggest that the answer is in the affirmative, and at last week’s conference I asked Dr Major:

“As your work continues, and your evidence base increases, you will presumably be able to more reliably assert that one practice or another (for example peer tutoring) is the most effective in helping students make progress. Won’t the EEF become more prescriptive, almost quasi-policymakers, to ensure schools use such practices?”

“No,” Dr. Major replied emphatically. “Our role is to undertake research and present the results. We would never say to a school you need to do this or that. That’s not our role.”


Prescription is bundled up with the ‘bad old days’. But it isn’t the prescriptive part that was the problem; it’s that the prescription was uninformed.

This seemed a measured enough response, and I think that I may have been the only person in the audience disappointed with it. After all, we don’t want to go back to the ‘bad old days’ of national prescription. And besides, it’s not in the EEF’s mandate to tell teachers how to teach.

But the goal of ResearchEd, it seems to me, is to ensure school leaders and teachers are making decisions based on evidence. Having only entered the profession recently, I was delighted to discover the toolkit. I didn’t need to weigh up conflicting advice from experienced teachers (“You need to put your children in ability sets” / “Mixed-ability groups are the best way”). I had a shortcut to what actually works in the classroom. But what if my senior leadership team disagree with my evidence-based approach?

It is possible that the evidence will be ignored simply because it conflicts with the ideologies or philosophies of senior leadership teams. Consider, for example, Ignaz Semmelweis, who discovered that hand-washing in hospitals dramatically reduced mortality rates. He collected data and published his findings, but it wasn’t until years after his death that the practice was fully adopted. If, following this analogy, the EEF are Semmelweis, is it acceptable for them to refuse to prescribe hand-washing to schools, stating that they have published the data and it is up to schools to decide what to do with it?

I don’t think that it is.



  1. The EEF cannot be prescriptive in this way – the toolkit is simply one, albeit large, analysis of various meta-analyses of a particular range of educational interventions. Its methodology and its scope are finite – it doesn’t take in anything like all the relevant research – and that’s not a criticism; its boundaries need to be clear in order for the metric overall to remain rigorous. As they described in the talk on Saturday, they are engaged in multiple small-scale follow-up studies in schools – this is because, as much as we would wish it to, educational interventions do not follow the strict medical health model – even much healthcare doesn’t follow a strict medical health model! The toolkit is not, and could never be, designed to be prescriptive in the way you describe, because teaching and learning are so massively reliant on context. There would never be a level of certainty about the value of one particular intervention high enough to merit a moral imperative to implement it. The toolkit can give you a starting point, but I think this is why it’s so vital for teachers to be active and critical *consumers* of research, so they can contextualise this starting point with other relevant research (perhaps single experimental studies or case studies that didn’t feed into the toolkit’s initial analysis) and their own professional experience.

    1. Thanks for your comment,

      You’re right, teaching and learning are heavily dependent on context. This is the reason that making general claims on the back of research into specific interventions (in specific contexts) is difficult. But I don’t think that it’s impossible, and the meta-analyses included in the EEF’s breakdown are one way of mitigating these issues (to an extent).

      Perhaps the EEF’s findings are not, at the present moment, underpinned by a sufficiently generalisable evidence base. But supposedly this is what they are moving towards. If not, then huge amounts of money are being spent to simply celebrate successful work in one school which is then attributed to a broad intervention name (e.g. feedback).

      As the evidence base increases, I argue, the EEF should be able to identify with increasing confidence the specific mechanics of what makes x practice or intervention work. Again, if this isn’t a prospective goal, then what on earth is all of the money being spent on?

      NICE works to assess and evaluate the efficacy of particular practices and medications, and issues guidance on the back of this. It isn’t perfect, but it means that the medical profession is informed by evidence and consistent throughout the country. Doctors are still able to exercise professional judgement, but if they go against the research, they may well find themselves having to argue why.

      Research in education is messy, but if we concede that this messiness and context dependency makes (even partially) conclusive research impossible, then why are we funding an organisation trying to achieve just that?
