In the last week, a series of events and meetings at my school signalled collectively that we’ve turned a corner with our view of some key processes.
- How we evaluate and improve the quality of teaching overall.
- The role of lesson observations.
- The way we regard our action research activities as a feature of self-evaluation and CPD within a broader evidence-based professional culture.
The major changes in our thinking include the following:
- We recognise that conclusions from one-off lesson observations are limited and, therefore, cannot be used in isolation as a meaningful way to evaluate what we do.
- We understand that learning is not always visible or measurable, and this needs to be recognised in the gathering of evidence to inform our evaluation of the impact of the strategies we use as we seek to maximise students’ learning.
- We have the confidence to apply evidence-based thinking to a critical evaluation of the prevailing orthodoxies around ‘good practice’, as determined by OfSTED criteria or by our own ingrained biases. We don’t need to accept the letter of what OfSTED might say about good practice in our context; we might just know better than anyone what works and what doesn’t.
To some extent, these changes have been developing at KEGS for some time. They are signalled in our longitudinal Departmental Review approach, described in this post, and in the way we’ve implemented the legal requirement for PRP. We’ve been working on developing a strong research-engaged culture for several years and increasingly we’ve been talking about methodology and evidence. Having joined NTEN, we’re finding that Lesson Study is offering us something much deeper than many lesson observations.
As I outline in this post, ‘How do I know how good my teachers are?’, the need to triangulate several sources of information over time has been in our thinking for a while. However, until now, we’ve still been wedded to the compliance culture of ‘OfSTED readiness’. No-one wants to get an inspection judgement below the level they feel they deserve; the stakes are too high to mess about with that. We’ve always adapted our school criteria to keep pace with the OfSTED criteria; we’ve been grading lessons and our lesson observation criteria have included things like evidence of differentiation, and (deep breath), rapid and sustained progress – within the lesson.
Well, things are changing.
Interestingly, during the week I had a tutorial session with one of my Y13 students, discussing Thomas Kuhn’s ideas about scientific revolutions and their relevance in the 21st century. This is the subject of his Pre-U Global Perspectives and Research essay. I’d suggest that Kuhn’s Paradigm Shift concept is relevant in education; it could be that we’re on the cusp of a significant system-wide shift right now. Certainly this seems to be true at KEGS.
These were the events of last week:
Monday SLT meeting:
We decided to abandon grading of lessons. It’s been on the agenda for a while but we’ve finally decided to take the step. Our hesitation has come from thinking that if OfSTED do come and we’re not battle-ready, sharply aware of their criteria, we might drift – or at least some teachers might. However, we’ve all found that the grading has been problematic, not only in having certainty about what grade to give but also in the psychological impact of getting anything less than Outstanding. The setback costs outweigh any benefits of ‘telling how it is’. Even people who do get Grade 1 for a lesson risk thinking they have nothing to improve. Some people are sad that they won’t get the affirmation of a resounding Outstanding for their lessons – but it’s a price worth paying.
We also discussed the issue that, without grades, it is necessary to convey the scale of improvement needed when giving feedback. Two teachers may need to improve their questioning – but one might have a significant need relative to the other who may just need to tweak here and there to be better still. So, we have decided to work on a language of feedback that makes this clear. We will also accelerate our SDP objective to develop subject specific lesson observation criteria.
In order to gauge the overall effectiveness of teaching in the school, we will generate data for presentation to Governors and inspectors based on a rounded triangulation of lesson observations, examination outcomes and the more nebulous reputational knowledge that is gathered continually. A teacher who doesn’t appear to turn on the style in lesson observations but has a strong reputational standing and grinds out great results will contribute to our self-evaluation as ‘Outstanding’ overall. We’ll be less impressed with the show-pony who can’t deliver in the long run.
Later on Monday, our colleague Tim Worrall attended the seminar held at Teach First HQ with Prof Rob Coe, Mary Myatt, David Didau et al. His report-back confirmed the evidence base for making this decision. Quite honestly, I don’t know how OfSTED will continue to justify their current approach in the face of this evidence.
See these posts for some of the key source material:
Professor Robert Coe: Classroom Observation: It’s harder than you think
David Didau: The Cult of Outstanding
Tuesday Teaching Staff Meeting:
One colleague described this as the best staff meeting we’ve had for years. Why? Because lots of different people spoke and we were starting to share our practice in a fresh, open, almost confessional manner. It’s no longer unacceptable to stand up and tell people where things have gone wrong; in fact, it’s become just as meaningful as sharing successes.
The meeting consisted of a short intro from me. I set out the wider context of an emerging evidence-based culture in education. I stressed that our Teaching and Learning workshops – our action research groups – will need to report the evidence and methodology as well as the conclusions when we share our work in the next carousel session in May – our fabulous Market Place CPD. I stressed that reporting an inconclusive or negative outcome would be important to share.
Our Leading Edge Director, Jane Breen, then set out a range of research methodologies, pointing to the book by Gary Thomas. (I’ve found that the Amazon preview covers quite a lot of material; worth a read.)
We then heard from colleagues reporting some of their experiences with data gathering as part of their CamSTAR research projects:
Delphine reported on her work on group work in French, stressing the need for triangulation. Her conclusions were that students’ perceptions can be wrong: they rated group work above working individually but she found this was based on their sense of enjoyment, not their learning. In the tests, they did as well or better working individually. Her message was to be open about what your data might suggest and to triangulate your professional perception with student feedback and assessment data.
Alex and Jane reported on the process of gathering student insights using post-its. Jane’s had worked well: students wrote responses to key questions about a learning process on post-its during a lesson; they could ask questions and she could direct them to probe further, leading to a useful set of comments. She felt the interaction during the data gathering process was better than a survey conducted online. Alex had over-cooked it. He’d asked 28 students to generate free-text responses to a detailed set of questions – leading to a massive pile of 28 × 25 post-its! A nightmare to avoid.
Jenny described a fascinating project in history based on the idea that students bring a complex set of misconceptions into the classroom that they then share with each other in discussions. She wanted to capture this and had recorded extended group discussions about a particular topic during a lesson. The problem was that she ended up with 7 recordings (one for each group) lasting 45 minutes each. It was simply too much information. The process had potential and could be refined; the point is to generate data that is manageable without losing value. Recording a couple of groups for a much shorter period would still yield interesting information. (She did also say that there should have been 8 recordings – but one student who insisted he could be in charge of the recording device had messed it up.)
Tim and Paul gave presentations about using Survey Monkey and our own Moodle VLE (KEGSnet) as means of generating surveys to get student feedback. Both systems generate automatic reports, are easy to use and don’t cost anything.
The final 15 minutes of the hour were given over to gathering in our research groups to consider how to use data as we continue with the work. My group is based in the physics department; we are already gathering assessment data but we felt that a sample of student interviews would help us gain their perspective on the success of our test-retest initiative.
Thursday: Departmental Review meeting with the MFL team
Finally, a lunchtime meeting with the MFL team was a useful opportunity to pull all of these ideas together. As the new line-manager of this team, I used the meeting to plan the detail of the annual departmental review. It’s an interesting team where each member is regarded as a strong teacher but where the range of teaching styles is quite broad. They don’t always agree on what constitutes ‘good practice’, and, as a non-specialist, my task is to help them all develop without imposing my own biases unduly. We discussed the question ‘how do we know what works and what could be done better?’ – rather than ‘are we meeting the OfSTED criteria?’ It’s a new era.
We talked about gathering a range of information: looking at students’ work, checking progress against the plan of action that follows each set of exam results, conducting a student focus group and also observing lessons. It was a great opportunity to get the spirit of the lesson observations across: They are only one part of the process; we’re not interested in show lessons; we’re not giving grades but still we’ll need to be clear that some lessons will be better than others; we need to be aware of the limitations of lesson observations and be conscious that our biases might lead us to draw false conclusions.
Thursday: Middle Leadership Time
Instead of a large-scale meeting, we had another session of ‘Middle Leadership Time’. Essentially this allows middle leaders to collaborate with each other in a range of different ways. It’s part of a developing culture of professionalism that we’re becoming more confident with. Ultimately each person is accountable for the outcomes for students; these sessions enable people to work in ways that they feel support their work, free from external directives.
So, there you go. Paradigm shift in action. It’s early days but I’m confident that this is the right direction.
Excellent post – thank you for leading on all of this so strongly at KEGS!
I am delighted SLT has at last reached the decision to drop grading lessons internally at KEGS. I replied to Alex Quigley’s post on this very point just today … A few of us were advocating this years ago in school. It’s a little frustrating for some of us radicals at KEGS to have been told so often in the past that the grades were necessary … I recall too many such conversations … the paradigm shift should have been allowed to happen before now! I am sure the lesson study concept can be the way forward for all – real teacher CPD sharing on lesson observations has been stymied for too long by top-down grading; in other words, “performativity” has hindered growth through dialogue.
So, why not be even bolder and replace all top-down observations with a requirement for trios to do lesson studies, or something like it, as soon as possible? Lead on this, too!
Please draw my thoughts to the attention of your SLT colleagues,
“Free at last, free at last” …. what’s next to reform!
KEGS Head of English
Thanks David. I know we’ve been slow to arrive at this from your perspective. It’s been a question of moving forward with maximum confidence and unity. I hope Lesson Study trios will become the norm… but let’s evaluate the first round of trials first. Thanks for your input.
Leading with confidence and taking bold steps can create unity quite quickly!
Excellent post, as usual. I wonder if I can pick your brains.
I’m really interested in setting up Lesson Study in my school – the British Council School of Madrid. I have read the booklet from Lesson Study UK but, after reading it, I have some very basic questions about the practical details:
– It suggests 3 teachers in the research group but also suggests 3 observers in the research lesson. 3 observers plus the teacher delivering the lesson makes 4 in the group, no?
– Do you use the same students and class in RL2 and RL3? If so, does the same teacher deliver all 3 lessons or do members of the group take turns? If so, this means that a teacher would be delivering a lesson to a class that was not his/her own?
– Should the members of the research group be from the same subject or does it work best if you mix up subjects?
I hope you can help or at least put me in contact with someone I could chat to.
Hi Mick. I’m no expert, we’ve just started, but to answer your questions:
1. It is three people; one does the lesson, the other two observe.
2. The same people take turns to teach a lesson but it doesn’t have to be the same class – it could be if that works, but not otherwise. I don’t think there are hard and fast rules about this.
3. The planning of the lesson is key – and this therefore suggests people from one department. However, that need not be the case; some studies may focus on generic issues that are not subject specific. I’ve joined two Economics teachers to form a trio even though I don’t teach it – and, in this case, I won’t be doing one of the lessons; I’ll only observe.
Hope that helps.
To be honest, I had the impression that you had already abandoned graded lesson observations…
We haven’t used them since last year… but, no, we hadn’t. Our Departmental Review has included graded lessons until now. We’ve questioned them and debated their validity… a natural precursor to taking the step.
Tom – thanks for writing this up in the way you have. I’ve been doing some work on the meaning of paradigm shift and what it carries with it in terms of our thinking about knowledge. Your paradigm shift encourages this kind of thinking, doesn’t it? What counts as knowledge? What’s the relationship between information and knowledge? The example you give of the enjoyment of group work is interesting. In the shifted paradigm the students’ perceptions could be seen as ‘right’ from the ethical/aesthetic (axiological) point of view (the researcher concludes they were ‘wrong’), and this gives an insight into the different uses of the term knowledge. Were the students expressing knowledge rather than information? What did they mean by ‘better’? I’ve been thinking about knowledge meaning know-how (transitory/context dependent/hard to communicate etc.), which would be the definition of knowledge in your new paradigm, and know-what to mean information/data. Having shifted, you are collecting evidence in the form of know-how/knowledge – more difficult than collecting apparently concrete evidence about know-what, as happens when the know-how of a lesson in action is reduced to a set of (matched?) numerical data. Jenny is right there in the messy world of qualitative research! I’m very interested in what you’re all doing and thinking. Maybe you’d have a look at my writing… nothing for sale, it’s all about ideas. In case you’re wondering… I came up against this edge in researching for my Ph.D. as a trained biological scientist looking at what was going on in my Pupil Referral Unit, and how to understand it in a way that things might be improved. It’s a question of doing the right kind of looking! And to me paradigm shift doesn’t mean paradigm abandonment – numbers have their uses, change is happening all the time. Thanks again.
Geoffrey, thanks for such a thoughtful comment. It’s all so interesting – the interplay of values, evidence and definitions of progress, learning, doing better etc. I’ve had a quick browse of your blog. It’s great – I will follow with interest. This is a good starting point for anyone following the comments: http://www.solution-support.co.uk/2013/12/31/whats-your-paradigm/
Tom – I’m very happy to hear you’ve had a look at my writing. Teachers have been positioned outside this discussion, I think – I’m not too interested in how it happened – and in my experience don’t necessarily feel fully entitled to join in the epistemology/pedagogy chat (in the UK at least; in other places it’s central to teacher training/practice/identity, isn’t it?). I like the feeling of being included!
Geoff – P.S. There are a couple of new posts on my site that might interest you.
An interesting week! This is really useful to read as we are trying to refine our structures for observation and teaching improvement – it’s enormously helpful to see what other schools are doing laid out so clearly. Thank you for sharing this and good luck with the implementation…
Thanks Harry. We have further discussions to come and I will update this post as things develop. Good luck with your process too. Tom