Research Literacy: Literacy Research

Four examples of how research on literacy is conducted and presented.

This post contains the key ideas and materials from my presentation at ResearchEd 2015.   It seems to me that a high proportion of discussion about research in education doesn’t actually make reference to specific research evidence or trials.  I’ve decided that every time I give a talk at ResearchEd, I will look at the details behind some studies to see what they tell us.  This is partly to learn about the subject matter but also to explore the business of conducting educational research and our capacity to engage with it and trust the findings.

Last year, I whizzed through several studies of different kinds – as featured in this Do Your Homework post.  This year I focused on literacy catch-up schemes as this is an area that we’re investigating at Highbury Grove.

The slides are here – although, as ever, I don’t know how coherent they are without the commentary:

My initial Google search led me directly to this 2012 DFE document:

There is a table in the DFE publication that ranks interventions by their effect size.  It's significant that most of the top-scoring studies feature 1:1 schemes – i.e. where students are taught individually.  That's a recurring finding throughout my explorations.  Way out in front is a study called Literacy Acceleration in Cornwall, with an effect size of 1.14.  I wanted to find out more, so I followed the citation: Brooks (2007)
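For readers less familiar with the measure: the effect sizes quoted in tables like this are standardised mean differences (Cohen's d) – the gap between the intervention and control group means, divided by the pooled standard deviation. A minimal sketch of the calculation, using made-up reading-test scores rather than data from any of these trials:

```python
# Effect size as a standardised mean difference (Cohen's d):
# (intervention mean - control mean) / pooled standard deviation.
from statistics import mean, stdev

def cohens_d(treatment, control):
    """Standardised mean difference between two samples of scores."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = stdev(treatment), stdev(control)
    # Pooled standard deviation across both groups
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled_sd

# Illustrative scores only – not from any study mentioned here:
treated = [14, 16, 13, 17, 15]
untreated = [12, 14, 11, 15, 13]
print(round(cohens_d(treated, untreated), 2))  # → 1.26
```

An effect size above 1 means the average student in the intervention group outscored roughly five-sixths of the control group – which is why 1.14 looks so striking, and why the sample size behind it matters so much.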

This turns out to be the work of a researcher called Greg Brooks, who is basically the leading expert in the field, responsible for major reviews of UK literacy studies. Here is the link to the pdf.

In this publication Greg gives comprehensive details of all the studies, their methodologies and their effect sizes.  He also uses a scale called Ratio Gain (RG), which is a nicely intuitive measure: if students make 12 months' progress on average between reading-age tests taken 6 months apart, the RG is 2.  Significantly, Greg's reports are highly objective and he does not engage in any ranking; the studies are listed in alphabetical order.  The page for Literacy Acceleration is included in the slides. Sure enough, the effect size is 1.14.  However, looking at the detail, it seems that the study was conducted by a PhD student (Lingard) in 1994 with just 26 students in one school.  In fact, the same student conducted other trials of the same process that scored 0.37 and 0.23; these are also included in the DFE list that I started with.  So… it turns out that the top-ranked literacy intervention in the DFE list is a very small study, over 20 years old, on a scale that any teacher could conduct in their own classroom – the sort of scale that is often dismissed as action research that doesn't really count!  The DFE official who wrote the publication had simply lifted the effect sizes and produced a ranked table without even trying to explore the substance behind the trials.  Teachers beware! We can't simply trust DFE publications… they're too lazy to do the work needed to inform our decisions.
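The Ratio Gain arithmetic is simple enough to sketch in a couple of lines (the function name here is mine, for illustration – not Brooks's notation):

```python
# Ratio Gain (RG): months of reading-age gain divided by the number of
# chronological months between the two test points.
def ratio_gain(gain_months: float, elapsed_months: float) -> float:
    return gain_months / elapsed_months

# 12 months' progress measured over 6 months gives RG = 2:
print(ratio_gain(12, 6))  # → 2.0
# An RG of 1 means students gained only as fast as time passed –
# i.e. they made normal progress but did not catch up at all:
print(ratio_gain(14, 14))  # → 1.0
```

The intuition is that any RG above 1 means struggling readers are closing the gap on their peers, which is why the measure travels so well across studies of different lengths.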

I then noted that Greg Brooks had updated the 2007 survey in 2013. Here is the link – and, if you're interested in educational research, I recommend reading this. It's a superb piece of work.

In this update, surprise surprise, Literacy Acceleration has been removed.  It didn't meet Greg's criteria (shown in my slides) on several fronts, including the fact that the scheme has ceased to exist and the sample was below 30 students.  The slides show some of the details of the secondary intervention studies.  It is striking to me how few studies have ever been done on a large scale: most involve fewer than 100 students; some span only a few months; it's rare for a study to run over a year.  That shows how little emphasis we place on these studies in general; some quite famous and widely used schemes don't seem to have been subjected to high-level studies at all.  Reading through all the studies, it seems that the best evidence comes from 1:1 approaches where students are taught to read directly – expensive perhaps, but effective.  Greg Brooks draws some helpful overall conclusions, which are in the slides and reproduced here:



The first one is obvious but actually a very powerful statement: ordinary teaching (no treatment) does not help children with literacy difficulties to catch up.  In other words, we have to do something!

I was interested to note that, despite being one of the most widely used schemes, Accelerated Reader does not feature in the 2013 publication. It was dropped from the 2007 version because the studies didn't meet the criteria: most were simply too small or showed results that weren't strong enough to be included.  We have used Accelerated Reader for a few years, in common with hundreds of UK schools, and our staff have had various concerns about it: there are too many variables to control; too many students don't make much progress; the IT needs limit how much access any one student gets; and, significantly, it works best where we have TAs who can effectively give 1:1 support!  I was delighted to learn that the EEF undertook a study, published only this February:

Again, it is well worth reading simply as an example of educational research.  It involved four schools and around 380 students in a formal RCT supervised by EEF researchers.  The report explores the effectiveness of Accelerated Reader but also makes numerous references to the process of conducting the trial itself.  It illustrates just how complex this can be.  The key findings are included in the slides – both in data terms and the researchers' overall comments.  In essence, the trial did show a positive effect, but only in what I would consider optimal conditions.  Many of our concerns are reflected in the comments in the report.  This study does not shift my view that Accelerated Reader isn't sufficiently targeted or effective – mainly because it is designed to incentivise reading, not to teach reading: a crucial difference.  We've decided to phase it out.

My presentation concluded by looking at a programme we've decided to explore from this year onwards: Thinking Reading.  Although I didn't mention this in my presentation, I think the original recommendation came from David Didau, who ran a literacy session with our Learning Support team last term.  He wasn't giving it a hard sell – he simply suggested it might be worth a look.  The scheme involves TAs being trained to deliver a specific 1:1 programme that teaches reading and spelling systematically, gathering assessment information every session: progress is tracked very closely.  Thinking Reading is included in Greg Brooks's 2013 review. He suggests that their RG of 5, delivered over an unusually long period of 14 months with 44 students, represents an impressive outcome – albeit based on data provided by the programme itself (in common with many other studies). Some individual results are very significant – see the slides.  It is the fact that it can be tested for impact very directly that has sold it to me.  I'll know if it works.  Also, I'm convinced that investing in intensive support (3 x 30 minutes per week) for the most needy students has far greater hope of delivering value than spreading a programme like Accelerated Reader more thinly across more students; too many slip through.

The presentation ended in a fairly bizarre manner as it turned out that Dianne and James Murphy who run Thinking Reading were both in the session! They didn’t know I’d be talking about their scheme… It was great to talk afterwards and discuss things in more detail.

I ended the session by encouraging more people to accept invitations to get involved with formal trials.  We’ll be contributing our Thinking Reading data to the overall evaluation of the scheme. We could have been doing the same with Accelerated Reader. Really, for too long, we’ve been guessing our way through and we need to be much more systematic in the way we evaluate initiatives like this – both at school level and nationally.

James Murphy by the door.


Job done.

UPDATE April 2016: With thanks to Dianne Murphy (@ThinkReadTweet from Thinking Reading), here is the very latest report from Prof Brooks:

Click for full pdf. 


  1. Interesting post. Does this also remind us that the role parents and guardians play in 1-2-1 reading with children is invaluable? Can we make bedtime stories the law???


    • Absolutely agree on parental role in literacy – and as a dad with Yr 6/8 sons, who was only recently immersed in the education thing I am racked with guilt.

      If you haven’t, do watch this powerful film from Save the Children’s Read On Get On campaign:

      And send the link home to every parent or show it on parent evenings.

      Or if you’re feeling very clever… put this image on a newsletter home to parents:

      When viewed in the Aurasma app on iPhone/iPad or Android the video will play on the page using Augmented Reality.

      Just run this link twice – once to install the app, and second time to follow the channel.

      You may even find the children go home and show the film to parents for you! 😉


  2. An interesting post, thanks. I always find your blog informative and entertaining. I have used Greg Brooks’s excellent summary of research findings on what works for Literacy Difficulties for a long time and am passionate about using high quality, evidence based, and rigorously evaluated interventions to support learners. So, I was interested to find out more about Thinking Reading and decided to have a look at the website. Their site is full of interesting links to research and although it contains several case studies (which give no idea about the context of the pupils, the schools they were in, the number of lessons and how they were delivered or how the progress is measured), I can’t seem to find any systematic research trials which show impact of their approach. Am I missing something?


  3. A really engaging and interesting talk, thanks Tom – you’re doing a great job helping Tom Bennett on his crusade to ‘debunk bad research’. I was at the event on Saturday specifically looking for literacy stories and yours stood out a mile, as did a great talk by @katie_s_ashford

    One thing that struck me from your talk was the proliferation of 1:1 interventions and their effectiveness, which is probably not too surprising given the luxury and cost of such focussed effort by trained Teaching Assistants. As you said yourself, with the very high rate of pupil premium in your school (I think you said over 70%) you are able to fund this, and do – it being so important to raise the literacy of struggling readers so they can access the wider curriculum.

    Many schools fall between a rock and a hard place in that their families may be in work and not eligible for PP yet face similar literacy challenges, indeed I was in South Wales today where this is very much the case and schools simply can’t afford 1:1 interventions.

    Commercial break coming up here now (which I hope is appropriate given the interest we have received) but I would like to mention the literacy intervention launched back in January by a company I am working with called ReadingWise that has been shortlisted for BETT and ERA SEN awards.

    Sadly we missed the boat for the reports you reference here but ReadingWise English has been shown to raise average reading age by 9.7 months after just 20 hours.

    The key thing is that it offers training and tools to allow TAs to oversee and support learners using it, in groups of ten learners to one TA.

    We have secondary schools using it, usually in Year 7 for transition and catch-up, but here’s a film we released recently showing it in use at a primary school in Southampton that tells us they have seen a reading-age increase of a year and nine months after just 20 hours.

    We’d obviously be happy to show it to anyone who either can’t afford 1:1 interventions or for whom these or other approaches simply haven’t worked, as was the case at Calmore Junior School, featured in the above film.

    We offer online demonstrations as well as F2F overviews and TA training sessions as shown in this blog post from a school in Cambridge recently:


    • Thanks a lot for this. It sounds very interesting. I guess it’s pretty obvious that 1:1 scores comparatively highly but value for money is also an issue and I’ll be paying attention to that. Keep us posted with trials and case studies. Tom

