This has actually been a good month in the fight against data nonsense. First, we are hearing news from STA briefings that they are aware of the progress loophole of despair and intend to do something about it. What exactly they’ll do is anyone’s guess, but I’m assuming a nominal score for HNM in cases where pupils fail to score on tests. Whether they’ll actually address the main issue of the unfairness of nominal scores for pre-key stage pupils (don’t you just love those whopping negative scores for SEND pupils?) remains to be seen. But at least there is some movement there. Next we have the Ofsted March update, which informs us that inspectors will no longer be asking for predicted results (“it’s a mug’s game” – Sean Harford). It also hammers home the point that there is no such thing as expected progress. And finally, relating to the above point about nominal scores, it urges caution when interpreting progress data: inspectors must consider the effect of outliers, and it specifically mentions the issue of negative scores for pre-key stage pupils. This is all good stuff.
Stuck in the middle with you (the problem with prior attainment bands)
So, with good progress being made on these issues (pun intended) I thought I’d turn my attention to something else that winds me up: prior attainment bands. Not so much their existence but the varied and inconsistent ways in which they are defined. With RAISE on the way out, this is an opportunity to get things right next year. Well, we have to try.
Prior attainment bands – I’m talking about low, middle, high bands here; not the numerous prior attainment groups used to calculate VA – fall into two broad types: those based on average point scores at the previous key stage, and those based on prior attainment in the specific subject. VA uses APS whereas the old RAISE progress matrices were based on the prior level in the specific subject. Right now we have 3 main sources of school performance data (RAISE, Ofsted dashboard, and FFT) and we have 3 different definitions of low, middle, high prior attainment.
Ofsted Inspection dashboard
Things get confusing right away. Here we have two different definitions on the same page. For progress (the top half of the page), low, middle and high bands are based on KS1 APS, whilst for attainment they are based on pupils’ KS1 level in the specific subject. This means we have different numbers of pupils in, for example, the low group for progress than we do in the low group for attainment.
To clarify, the progress PA bands, based on KS1 APS, are calculated as follows (and remember that maths is double weighted at KS1, so the formula is (R+W+M+M)/4):
Low: KS1 APS <12
Middle: KS1 APS 12-17.99
High: KS1 APS 18+
Note that pupils who were 2c in reading, writing and maths at KS1 will have an APS of 13 and will therefore be in the middle band alongside pupils that were 2A across the board (APS 17). Furthermore, a solid 2b pupil (APS 15) will obviously fit in the middle band as will a pupil that was L1, L1, L3 in reading, writing and maths at KS1 (also APS 15).
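To make that concrete, here’s a quick sketch of the APS calculation and banding, using the KS1 point values I’m assuming are the standard DfE ones (L1 = 9, 2c = 13, 2b = 15, 2a = 17, L3 = 21) — the function names are mine, not anything official:

```python
# Assumed standard DfE KS1 point scores per sub-level
POINTS = {"L1": 9, "2c": 13, "2b": 15, "2a": 17, "L3": 21}

def ks1_aps(reading, writing, maths):
    # Maths is double weighted at KS1: (R + W + M + M) / 4
    return (POINTS[reading] + POINTS[writing] + 2 * POINTS[maths]) / 4

def aps_band(aps):
    # The dashboard's progress bands: <12 low, 12-17.99 middle, 18+ high
    if aps < 12:
        return "low"
    elif aps < 18:
        return "middle"
    return "high"

# The worked examples from the text all land in the middle band:
print(ks1_aps("2c", "2c", "2c"), aps_band(ks1_aps("2c", "2c", "2c")))  # 13.0 middle
print(ks1_aps("2a", "2a", "2a"), aps_band(ks1_aps("2a", "2a", "2a")))  # 17.0 middle
print(ks1_aps("L1", "L1", "L3"), aps_band(ks1_aps("L1", "L1", "L3")))  # 15.0 middle
```

That’s quite a spread of pupils, from 2c across the board to L1/L1/L3, all treated as one homogeneous “middle” group.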
Meanwhile, below in the attainment section we have the other low, middle, high definition based on the pupil’s level in the specific subject at KS1. Here, a pupil that was L3 in reading and maths, and L2a in writing will appear in the middle band for writing attainment measures due to their L2 in writing, but will appear in the high progress band due to their high KS1 APS of 20. Bizarrely, a pupil that was L1 in reading and maths, and 2c in writing will also appear in the middle band for writing attainment due to their L2 in writing, whereas for progress they will fit into the low band due to their low KS1 APS of 10. This is why it’s so important for schools to know who is in those bands. If you have bright red boxes around your attainment measures (gaps equating to 2 or more pupils) this may be difficult to explain if all your pupils were 2A, but if they were 2c and L1 in other subjects, then it’s somewhat more justifiable.
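As a sanity check, here’s a sketch of how the same pupil lands in different bands under the two definitions on that one page. The helper names are hypothetical, and the subject-level mapping (L1 = low, any L2 sub-level = middle, L3 = high) is my reading of how the attainment section behaves; the APS values for the two pupils come straight from the examples above:

```python
# Assumed attainment-section mapping: L1 = low, any L2 = middle, L3 = high
def subject_band(ks1_level):
    return {"L1": "low", "2c": "middle", "2b": "middle",
            "2a": "middle", "L3": "high"}[ks1_level]

# The progress bands from the top half of the page
def aps_band(ks1_aps):
    return "low" if ks1_aps < 12 else ("middle" if ks1_aps < 18 else "high")

# Pupil A (L3 reading, 2a writing, L3 maths): KS1 APS of 20 per the text
print(subject_band("2a"), aps_band(20))  # middle high
# Pupil B (L1 reading, 2c writing, L1 maths): KS1 APS of 10 per the text
print(subject_band("2c"), aps_band(10))  # middle low
```

Two pupils in the same “middle” band for writing attainment, yet sitting at opposite ends of the progress banding.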
Oh, and for KS1, of course, prior attainment is based on the pupil’s development in the specific early learning goal. One early learning goal out of 17 used to band pupils by. That can’t be right, surely? Nice to see this get a mention in Ofsted’s March update, too.
And whilst we’re on the subject of banding pupils for attainment measures, once we introduce an element of prior attainment, doesn’t it cease to be about attainment and become a sort of pseudo progress measure anyway? Surely that’s just like the old progress matrices, isn’t it?
RAISE
Now things get even more odd. In RAISE, they take the same approach as the dashboard when it comes to progress with low, middle, high bands based on KS1 APS (see above). This means your progress data in RAISE looks the same as the progress data in the dashboard (just presented in a more incomprehensible format). However, when it comes to attainment, instead of adopting the subject specific method used in the dashboard, they stick with the progress approach based on KS1 APS. RAISE therefore presents attainment and progress in a consistent way with the same numbers of low, middle, high pupils in both parts, but this has caused a lot of confusion because the data differs between the two reports.
Elsewhere in the report we do have subject-specific banding (based on the pupil’s level in that subject at KS1) and, to really ramp up the confusion, we have results in, say, maths presented for pupils that were low in reading or high in writing at KS1. I’ve yet to meet a headteacher or senior leader who gets the point of this. I’m not entirely sure I do either.
FFT
And finally we come to FFT. They also split pupils into low, middle, and high bands based on prior attainment but have come up with a third way. Like the Ofsted dashboard approach (well, the progress one anyway) this starts with KS1 APS, calculated in the same way as the DfE does, but then they do something different: they rank all pupils nationally by KS1 APS (600,000 of them) and split the pile into thirds. Those in the lower third are the lower attainers, those in the middle third are the middle attainers, and (yes, you’ve guessed it) those in the upper third are the higher attainers. It’s actually not quite thirds because if the 33rd percentile is smack bang in a stack of hundreds of pupils with the same APS, then they have to adjust up or down a bit, I assume. This is why we don’t get 33% in each band nationally.
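The ranking-and-cutting idea can be sketched as follows. This is my reading of FFT’s method, not their actual code, and the tie-handling at the cut points is my assumption; the key property is that banding depends only on APS, so pupils with the same APS always share a band, which is why the national thirds are only approximate:

```python
def fft_style_bands(national_aps):
    """Band pupils low/middle/high by ranked KS1 APS (sketch of FFT's approach)."""
    ranked = sorted(national_aps)
    n = len(ranked)
    low_cut = ranked[n // 3]         # APS value at roughly the 33rd percentile
    high_cut = ranked[(2 * n) // 3]  # APS value at roughly the 67th percentile

    def band(aps):
        # Cuts compare APS values, not ranks, so equal-APS pupils stay together
        if aps < low_cut:
            return "low"
        if aps < high_cut:
            return "middle"
        return "high"

    return band

# Tiny illustrative cohort (FFT ranks ~600,000 pupils nationally)
band = fft_style_bands([10, 10, 12, 12, 14, 14, 16, 16, 18])
print(band(13), band(15), band(17))  # middle middle high
```

With cuts driven by the national distribution rather than fixed APS thresholds, the boundaries fall lower than the DfE’s 12 and 18, which is why 2c pupils drift down and 2A pupils drift up.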
I rather like this approach because it means the 2c pupils end up in the low group and the 2A pupils move into the high group. In fact, you even find pupils with the odd 2b lurking in the lower group. You will certainly have more lower attainers in an FFT report than you do in the Ofsted dashboard and RAISE, and you tend to see fewer middle attainers and a few more higher attainers too. Pupils just get distributed across the bands a bit more, and this tends to make sense to teachers (once they have got over their exasperation at having to get their heads round yet another methodology).
One of the things that springs to mind is that term ‘most able’. Most able based on whose definition? The school’s? The DfE’s APS approach? Or perhaps their subject-specific approach? And what about FFT’s top third nationally? Anyone have the answer?
This fragmented, confused and confusing approach can’t continue, and with the end of RAISE we have an opportunity to come up with a straightforward and logical approach to establishing these prior attainment bands. I prefer FFT’s approach but whatever we end up with, could we have some consistency please? At the very least, let’s not have contrasting methods on the first page of a key report.
And we haven’t even touched on current Y3. Anyone know the average of EXS+WTS+GDS+GDS?
Over to you, people in charge of such things.