The folly of narrow newspaper league tables.

As a long-standing reader of Ben Goldacre’s ‘Bad Science’ column, I was under the impression that the Guardian was a champion of the idea that it is important to seek the truth through a rigorous examination of evidence. I am therefore rather disappointed by the approach they have taken in reporting achievement at A level. In asking only for the % of grades at A*-A, they are fuelling the idea that achievement can be captured in a single figure. Of course, for all schools, it is much more complicated than that. Do they assume that readers are not interested in the full story, or that they might not understand the detail? Or is it that they are rather bored by messy detail and really just want to keep it simple for their clever graphics and maps? The Times at least has two figures, but gives no indication of how many A levels each student has taken; moreover, they go straight in with a top-to-bottom ranking. It is interesting, seductive even, looking down the list to see where schools are placed. But it is all folly! Don’t get sucked in…

As a Head, I need to look at several data points before I know how well my own school has done. This is complex enough, and it certainly isn’t meaningful to compare my school with any other unless all the same data points are available. It is a dangerous game to start ranking schools in any case, but doing it based on %A*-A or %A*-B alone is not just narrow – it is actually meaningless. We put out a press release to give an idea of the stories we have to tell; the newspaper data requests don’t go anywhere near to capturing this.

At KEGS, the figures we look at are:

1) the average total points per student. This is the only figure that takes account of every single grade achieved by every student. Of course, an advantage is gained by taking additional qualifications but, at KEGS, we think breadth is important. It is better to get AAAB than AAA; certainly AAABC, and arguably even AABB, represent better educational outcomes than AAA (see the sketch after this list). Without the points total, you are just reinforcing the key weakness of the A level system – that it is narrow. Schools that focus on only three A levels are more likely to get a high score in terms of %A*-A – but does that represent the best possible education? We’d say no.

2) the % of grades at A*-B. This gives a good indication of the quality of grades overall. B grades are important: BBB gets students into good universities, and it is pretty insulting to students with Bs to suggest that these grades don’t count! However, it is important to see this measure alongside points because, whilst the quality of each grade clearly matters, the overall breadth or volume of qualifications sets the context.

3) the % of grades at A*-A, and at A* alone. These are useful for looking purely at the top end, but focusing on either measure alone does not tell the full story at all. (As if, below an A, grades B-E were all equally ‘failures’ to meet the standard!)

4) trends over time and ALPS data. All educational achievement needs to be referenced against a starting point. As well as looking at how we’ve done on all measures against previous years, a key indicator for us is the value-added score from the ALPS system. Our ALPS scores tell us how well our students have done compared to the national trend for students of the same ability profile. Raw outcomes say nothing about how much difference a school makes; only value-added information can give an idea of this. The DfE value-added system for A level was opaque and convoluted, but ALPS is very straightforward and works really well.

5) Finally, the key outcome for us is the proportion of our students gaining places at their first-choice universities. Ultimately, this is what we are aiming to achieve. Even here, it is important to look at every student – not just those going into Medicine or to Oxbridge.
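To make measures 1 to 3 concrete, here is a minimal sketch, in Python, of the arithmetic they imply, computed from each student’s full set of grades. The student data is invented purely for illustration, and the tariff values assume the pre-2017 UCAS scale for A levels (A* = 140 down to E = 40); this is not our actual analysis tool, just the logic of the measures.

```python
# Sketch of measures 1-3, assuming the pre-2017 UCAS tariff for A levels.
UCAS_POINTS = {"A*": 140, "A": 120, "B": 100, "C": 80, "D": 60, "E": 40}

# Each entry is one (invented) student's full set of A-level grades.
students = [
    ["A", "A", "A"],         # AAA: 360 points
    ["A", "A", "A", "B"],    # AAAB: 460 points - broader, and more points
    ["A*", "A", "B", "B"],   # 460 points
]

def average_points(students):
    """Measure 1: average total points per student - the only figure
    that counts every grade achieved by every student."""
    totals = [sum(UCAS_POINTS[g] for g in grades) for grades in students]
    return sum(totals) / len(students)

def percent_at_or_above(students, threshold):
    """Measures 2 and 3: the % of all grades at or above a given grade."""
    order = ["A*", "A", "B", "C", "D", "E"]
    top = set(order[: order.index(threshold) + 1])
    all_grades = [g for grades in students for g in grades]
    return 100 * sum(g in top for g in all_grades) / len(all_grades)

print(f"Average points per student: {average_points(students):.0f}")  # 427
print(f"%A*-B: {percent_at_or_above(students, 'B'):.1f}")             # 100.0
print(f"%A*-A: {percent_at_or_above(students, 'A'):.1f}")             # 72.7
print(f"%A*:   {percent_at_or_above(students, 'A*'):.1f}")            # 9.1
```

On this tariff, AAAB scores 460 points against AAA’s 360 – exactly the breadth argument in measure 1. Measure 4 is deliberately left out of the sketch: value added compares actual points with a prediction for students of the same prior-attainment profile, and ALPS’s national benchmarks are its own, not something I can reproduce here.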

As it happens, this year is the best year we’ve ever had in terms of %A*-B and A*; it is the second best we’ve had in terms of total points score and %A*-A.  It is our best year for many years in terms of first choice university places.  However, I have no intention of putting our data into any of the newspaper league tables or maps – because it appears that no-one is interested in the detail.

In January the full performance tables are published, and I think it is best to wait until then rather than publishing dodgy data that can do no more than mislead. In the meantime, let’s not fall for the idea that School A has done any better than School B because of their relative positions in the table. There simply isn’t enough information to make that assessment. I know some schools that won’t allow students to continue to A2 without an A at AS; by some miracle of data, they lie pretty near the top of the tables!

We get a bit of flak for opting out but, having seen the tables this week, despite getting some of our best ever results, I am glad we are not in them, perpetuating the delusion that these tables matter. Remember – it is optional! No-one makes you do it.

(Aside 1: The self-entry process is also rather dubious. On my very first day at KEGS in August 2008, I was fielding calls from people asking how I felt coming into a school with the best A level results in the country. We were in the Independent with a photo and a quote. A week later, I realised that the figures didn’t add up; something was wrong. It turned out that the Assistant Head at the time had entered the wrong figures by mistake. Farcical!)

(Aside 2: At GCSE, the newspaper tables often start with 100% 5A*-C inc. EM. Last year, two of our students suffered major mental health issues – both in rather tragic circumstances. Neither took any exams and, consequently, we were not in the 100% category – as would normally be the case for a selective school like ours. Despite strong performance on other measures, the local headline was ‘KEGS drops out of top 200’. Our ranking had sunk like a stone. Another reason not to play this foolish game.)

Update 2013

The story is the same this year. We’ve been asked for %A*-A by the Guardian and %A*-B by the Telegraph. We haven’t provided either. It remains a folly: the use of single measures is a ludicrous way to compare schools. This is exemplified in this story about Haringey schools:

http://www.tottenhamjournal.co.uk/news/wood_green_secondary_makes_history_with_borough_s_joint_best_a_level_results_fortismere_st_thomas_more_catholic_school_1_2338273

The text itself notes that the figures at A/B are very different for two schools that are the same on A-C grades, yet an exaggerated claim is still made about the schools’ relative standing. The worst thing is that the hyperbole detracts from a genuinely impressive improvement story. We could also do with some information about equivalences, which I know play a big part in the GCSE version of this story.

3 comments

  1. Ask one of your Year 13 students aiming for an “AAA” Department at Bristol, Warwick or Durham whether they would want AAA or AABB. It is easy to criticize the narrowness of A-levels and the NC but to a certain extent one has to “play the game”.

    I think I was the only student at my Essex comprehensive to attempt 4 A-levels (Politics, History, Geography and English Lit), not to mention General Studies and an AS in Critical Thinking. I lasted until the dying weeks of Year 13 before Geography was ditched, as I realised that all the additional workload meant was a risk of me missing my AAB offer. In the end I got AAACc.

    Personally I’m quite a fan of the EPQ as a way of adding “breadth” to the curriculum, though I imagine it is quite a bit of work for teachers to administer and make effective. “Spoon-fed” A-levels simply didn’t teach me the kind of independent learning and research skills I needed to succeed at university. Getting Year 13s doing university-level essays via the EPQ would be brilliant preparation in my view, and even if it goes wrong, you still have 3 A-levels to fall back on.

