This article was written for School Inspection + Improvement Magazine and first published in September 2016. You can read the full version on the SI+I website here.
The purpose of a post-mortem
Exam analysis meetings go by many names, most of them aptly funereal in tone, such as ‘post-mortems’ or ‘rapid improvement panels’ (RIPs).
One by one, middle and senior leaders step forward, heads bowed reverently, to get a grilling from a grim reaper in the guise of academy sponsors, school governors and headteachers who form part of the post-exam review panel.
The primary purpose of these meetings is to interrogate a school’s summative performance data, celebrating success where it occurs (recognising departmental improvements as well as individual accomplishments) and questioning underperformance or significant deviations from predicted outcomes in the hope that the same mistakes can be avoided next year.
The panel meeting
Leaders should prepare their data analysis reports in advance of the meeting and submit them to the panel for consideration. Panellists should interrogate the report, highlighting key strengths and weaknesses, and annotating pertinent questions and concerns.
The meeting itself should focus on panellists’ questions rather than leaders’ presentations in order to try to ascertain more fully the reasons for certain outcomes and trends. There is no doubt that leaders can present their findings in a positive light, but what is needed is an honest account of the facts and an appropriate level of challenge, leading to an agreed set of SMART actions rather than vague promises.
The exam analysis report
Exam analysis reports should not be too long or descriptive. Rather, they should be succinct and evaluative in nature. Panellists want to know the following:
- What are the headline results per subject per year group and per cohort/class?
- What was attainment like versus what was predicted?
- What was attainment like versus what was targeted/expected?
- What was progress like versus what was predicted?
- What was progress like versus what was targeted/expected?
- What value did each teacher add (often presented in terms of residual scores where any positive figure shows value was added)?
- How did different groups of students attain and progress in relation to all students, including boys and girls, students in receipt of pupil premium funding, students for whom English is an additional language, students with SEND, and so on?
- What interventions (wave 1, 2 and 3) were put in place, when and for which students?
- What effect did each intervention have, and what has been learnt about the value of each intervention?
- What was the accuracy and quality of teacher assessment like?
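The residual scores mentioned above can be illustrated with a short sketch. This is a minimal, hypothetical example, assuming grades sit on a numeric point scale and that a prediction exists for each student; the function names and figures are illustrative, not taken from any particular tracking system.

```python
# Residual (value-added) scores: actual minus predicted grade.
# A positive figure shows value was added; a negative one, the reverse.

def residual(actual: float, predicted: float) -> float:
    """Residual for one student: positive means value was added."""
    return actual - predicted

def class_residual(results: list[tuple[float, float]]) -> float:
    """Mean residual for a class, given (actual, predicted) grade pairs."""
    return sum(residual(a, p) for a, p in results) / len(results)

# Illustrative class of three students graded on the 1-9 scale.
results = [(6, 5), (4, 5), (7, 6)]  # (actual, predicted)
print(round(class_residual(results), 2))  # 0.33 -> modest value added
```

In practice a tracking system such as ALPs or ALIS does this calculation (with statistical weighting) for you; the point of the sketch is simply that the sign and size of the mean residual is what panellists are reading.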
Once all these meetings have been concluded, the headteacher must collate and summarise the school’s performance in order to present it to the governing body, academy sponsors or executive head. At this stage it is worth a headteacher remembering that exam results are exactly that: results. They exist in the past tense and cannot be improved (with the exception of papers entered for a re-mark, of course; though under recent reforms this practice will become less common).
The only point of an exam post-mortem is to ascertain the ‘cause of death’, so to speak, so that appropriate action can be taken in the future in order to benefit the living.
As such, what a school’s stakeholders really want to know while they’re reviewing exam results is what led to those results: what worked and what didn’t; what lessons have been or can now be learnt.
Accordingly, here is some advice for headteachers and senior leaders who face inquisitions this autumn term from their executive heads, sponsors and governors.
Top five tips
- Present your data clearly, succinctly and honestly. Don’t try to mask your data by combining various qualifications. Although it might feel like it, it is not a witch-hunt and you will gain nothing by being in denial or being defensive. Moreover, you will fool no one by massaging your data.
- Keep track of which interventions are given to which students and analyse their relative effectiveness in light of the outcomes. You need to demonstrate value for money, so must evaluate the relative success of all your intervention strategies. This is not always easy but, in the case of the pupil premium in particular, it is important that you try because Ofsted and the DfE expect to see evidence of successful use of the pupil premium. Where you know a student has only been in receipt of one form of intervention, use him or her as a test case to compare the effectiveness of that strategy versus another.
- Identify which teachers achieved the highest value added scores (using L3VA, ALPs, ALIS, etc.). Decide how to employ your best teachers this year. This isn’t necessarily always with the top set or the C/D borderline class, especially now we have a 1–9 grading system and a focus on the progress of the majority not the attainment of the minority. Think creatively about each teacher’s particular skill-set, and try to ‘think outside the box’ a little.
- Analyse how accurate your internal moderation proved to be. What more could be done to ensure that your teachers mark coursework and/or controlled assessments accurately? Also, analyse how accurately your teachers predicted their students’ outcomes and carry out a question-by-question analysis of the exam. Which questions proved the most difficult for students? What more can you do this year to better prepare students for that question? What support do your teachers need to help them teach those aspects of the syllabus better? Ensure that all this self-flagellation leads to clear and SMART actions against which you and your staff can be held to account.
- What professional development do your teachers need to help them improve? What other actions need to be taken to improve the performance of your team? Do any formal procedures now need to be invoked in order to tackle endemic underperformance or malpractice? Did you, as the headteacher/senior leader, challenge your senior and middle leaders and their teachers? Did you do everything you could to keep track of the progress of every student and take appropriate actions to intervene when it was needed?
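The intervention-tracking tip above can be sketched in code. This is a hypothetical illustration, assuming each student received a single intervention and has a residual score recorded against it; the data, field names and strategy labels are invented for the example.

```python
# Grouping students by the (single) intervention they received and
# averaging their residual scores gives a rough comparison of the
# relative effect of each strategy.

from collections import defaultdict

students = [
    {"intervention": "small-group tuition", "residual": 0.5},
    {"intervention": "small-group tuition", "residual": 0.2},
    {"intervention": "mentoring", "residual": -0.1},
    {"intervention": "mentoring", "residual": 0.3},
]

by_strategy: dict[str, list[float]] = defaultdict(list)
for s in students:
    by_strategy[s["intervention"]].append(s["residual"])

for strategy, residuals in by_strategy.items():
    mean = sum(residuals) / len(residuals)
    print(f"{strategy}: mean residual {mean:+.2f} over {len(residuals)} students")
```

A comparison this crude proves nothing on its own (group sizes are tiny and students differ in many ways), but it is the kind of evidence of pupil premium spending that panels, Ofsted and the DfE expect to see interrogated.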
Follow me on Twitter: @mj_bromley