Data-drops. Get some perspective.

This tweet below is one of my most-liked and retweeted tweets. It clearly resonates with people, and it's worth exploring the reasons why. Some of those reasons are highlighted in the thread that follows and in some of the replies.

A lot of the issues are covered in a previous post about what I suggest needs to be a paradigm shift in assessment thinking.

Some aspects are covered in my recent blog post for the Guardian Teacher Network.


I’ve also covered a lot of this ground in Chapter 5 of The Learning Rainforest

Here, I’ll just reiterate why half-termly data drops are a terrible idea:

To be clear, I am talking about the process, common in lots of secondary schools, whereby teachers are required to feed progress and/or attainment information in a common format into a centralised database.

Distort curriculum flow to generate assessment info

In order to feed the data machine, teachers are meant to have some form of assessment on which to base the data they enter. If these come every six weeks, the curriculum material and flow of lessons must be tailored so that students can be assessed frequently enough to match the data-collection cycle. This is the tail wagging the dog, especially when it is imposed on all subjects regardless of their natural assessment mode (e.g. maths tests vs English assessment criteria) or how many lessons are taught (e.g. science with several lessons a week vs weekly RE lessons).

Tell us nothing about actual learning needs.

If the data machine says Grade B, Grade 5, Exceeding, Working Towards, Secure, 70% – or anything else – nobody knows what any student needs to work on. So the data itself doesn't inform teaching. That information is already there in the raw material that fed the data drop, which means the data drop itself adds nothing. There is unlikely to be the level of moderation required to know that a 5 in history means the same as a 5 in science, or that a 7/'Secure' in English means a student is doing better than in maths, where the grade entered is a 6 or 'Emerging' – or whatever. The macro data sets up false comparisons.

Overshadow formative assessment.

Formative assessment should be happening all the time. This is where the real action is. This is the important stuff, where the learning happens, where the real meaning lies. If we institutionalise data drops to the point where they matter enough to be done every half-term – with all the managerial hullabaloo that ensues: chasing people up, fussing over missed grades, missed deadlines and how it all looks to parents – then we're really losing perspective on what counts. Most of the data dropped could be erased with no impact on learning.

Inefficient way to flag concerns.

If I’m concerned about my own students, I will know already; I don’t need to flag it via the sledgehammer of a data system – I just need to act. If I am a leader worried about the performance of a teacher or class somewhere in the school, the centralised data is too processed to tell me the true picture; I need to see the raw data, so I need to keep close to that. If I want to know who is a cause for concern in Year 8 or Year 9, I could simply ask teachers for their names. In any case, if the data machinery is the means by which a previously unknown concern is flagged (every six weeks), then things are going wrong elsewhere. Termly data would easily be enough to shine a light into the darkest corners if that is needed.

Nobody looks at most of it.

In my experience, in a typical secondary school, the scale of the data being collected is so massive as to be unwieldy and unworkable: 200 students per year group × 10 subjects = 2,000 entries. Multiply by 5 year groups = 10,000. Add an ‘attitude to learning’ grade = 20,000. Add a predicted attainment grade = 30,000. And so on. Light it all up in red, amber and green and it’s a massive matrix of information, each piece of which is already stored somewhere else where it is more useful. How much time do leaders have to explore the patterns and add value to what is already known? Not enough. Most of the data is wasted.
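The arithmetic above can be sketched in a few lines – a back-of-the-envelope calculation using the illustrative figures from this post (assumed round numbers, not data from any particular school):

```python
# Back-of-the-envelope count of data points per half-termly drop,
# using the illustrative figures from the post (assumed, not real school data).
students_per_year_group = 200
subjects = 10
year_groups = 5
grades_per_entry = 3  # attainment + 'attitude to learning' + predicted grade

entries_per_drop = students_per_year_group * subjects * year_groups * grades_per_entry
print(entries_per_drop)  # 30000 data points per drop

drops_per_year = 6  # half-termly
print(entries_per_drop * drops_per_year)  # 180000 data points per year
```

Even with these modest assumptions, a half-termly cycle produces six figures’ worth of cells a year – far more than any leadership team has time to read, let alone act on.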

MASSIVE Workload

Let’s not pretend that data drops are quick and easy. Each entry requires thought, time and multiple mouse clicks, and engaging with the data takes time too. If you halve the number of data drops, you halve the workload for almost exactly the same information. It’s a no-brainer.

And lots of you seem to agree. So, come on people, get it in perspective. If you think your school’s success is driven by high-frequency data drops, I’m going to challenge you and say: no, it isn’t. That’s an illusion you have created, and your previous assertions of its value may be the reason you feel you can’t change now. But saving face isn’t a good reason. Don’t defend it – test it out and see what happens.

11 comments

  1. This is great advice Tom! I’m a Data Manager in a secondary school that currently does three data drops a year (and occasionally an additional Year 11 check). Staff are so much happier and the data actually means something (well, relatively speaking!). Most other schools I’ve worked in have done six drops a year and it is a constant treadmill, with leaders looking for constant improvements – I’ve often argued this approach means most teachers will simply add an extra sub-level (in the Levels days) or change a 5= to a 5+ to ‘show progress’. Then the analysis of such data becomes even more meaningless – no one has the time to look at or act upon it with six drops a year across five year groups! I’ve even just argued this point in an interview where the headteacher was perplexed why you wouldn’t do six a year – madness!


  2. Agree with all your points. On top of this, consider what happens if a teacher flags up a student as under-performing against targets. SLT are going to ask what action the teacher is taking. What teacher is going to welcome that extra workload landing on their head from SLT? Anyway, how does the average teacher know what those targets for the future mean relative to current teaching? Much easier to just say the student is on target. Just make it up! So the data entered by teachers is generally inaccurate anyway and a complete waste of time. The data is mainly there so that the headteacher can give a regular report to governors saying everything is fine, and here is the data to prove it.


  3. As the data bod at a fairly large school, I should find this sort of talk threatening, but in reality it is spot on and it’s sending me looking for better ways to make this stuff work. Six data drops a year give us lots of detail, but what that detail is of becomes a bit of a mystery. Time to use my number powers for good and try to come up with something much more effective.


  4. An interesting challenge on data collection. And yet in my experience as a head of department I found that teachers did benefit from an external evaluation (i.e. a target). I found teachers could miss under-performers. A high target and a low working-at grade does help flag up this particular issue, and this helps teachers prioritise who is most underachieving. Sometimes teachers focus exclusively on who is generally struggling – those students may well be making good progress given their starting point, whilst the “lazy but bright” students may be happily turning out average work, keeping clear of the teacher’s focus, when they are seriously underachieving. For me, that is a good argument in favour of data drops of some kind, and to be fair your point is more that six drops per year is too much, and I follow your line of argument. A thought-provoking post!


  5. An hour later, Tom, and I have completed my own post in response to yours – such a stimulating post you wrote! My conclusion was that data can spur on teacher and student progress when it is combined with the softer skills of relationship / personalised intervention. It is a blunt instrument but it can have meaning (I know you weren’t denying this). The numbers are symbols that can flag up issues for skilful exploring. But if the exploring is not done skilfully, then the data input can indeed be a waste of time. And if the exploring is done destructively, then the whole process can be a waste of time, too! Anyway, if you have time to read it, I post @ fieldsofthemindeducation.wordpress.com , and of course I would be grateful for your comment! Best wishes – I always enjoy your ideas. Michael (aka @fieldsofmindedu)

