The Zero Game

Until this year, VA has played second fiddle to the levels of progress measure. This was mainly because there were no floor standards linked to VA. But it was also because – let's be honest here – not many people really understood it. Everyone understood pupils needing to make two levels of progress (and hopefully three), but we struggled with the world of prior attainment groups, estimates, and confidence intervals. Now that levels are gone and all we're left with is a value added progress measure, we have no choice but to get our heads round it.

So, we have read the primary accountability document and seen the lookup table on pages 16-17; we understand that there are 21 prior attainment groups (a reduction in start points compared with previous years, due to a change in methodology); that each of these prior attainment groups has an estimate in reading, writing and maths, which represents the national average score for pupils in that group; that these estimates form the benchmark for each pupil; and that exceeding these scores results in a positive progress score for each child, which will aggregate to a positive progress score overall. We get this now.
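To make that arithmetic concrete, here is a rough sketch in Python of what the calculation looks like. The prior attainment group estimates and pupil scores below are made up for illustration – they are not the published DfE figures from the lookup table – but the mechanics (pupil score minus group estimate, averaged across the cohort) follow the method described above.

```python
# Illustrative estimates (national average scaled scores) for a few prior
# attainment groups, per subject. These numbers are hypothetical stand-ins
# for the real lookup table on pages 16-17 of the accountability guidance.
ESTIMATES = {
    10: {"reading": 99.5, "writing": 100.1, "maths": 99.8},
    11: {"reading": 101.2, "writing": 101.6, "maths": 101.4},
    12: {"reading": 103.0, "writing": 103.2, "maths": 103.1},
}

def pupil_progress(pa_group: int, subject: str, actual_score: float) -> float:
    """A pupil's progress score: their actual result minus the estimate
    (national average) for their prior attainment group."""
    return actual_score - ESTIMATES[pa_group][subject]

def school_progress(pupils: list, subject: str) -> float:
    """The school's progress score: the mean of the pupils' progress scores."""
    scores = [pupil_progress(p["pa_group"], subject, p[subject]) for p in pupils]
    return sum(scores) / len(scores)

# A made-up cohort: two pupils exceed their reading estimate, one falls short.
cohort = [
    {"pa_group": 10, "reading": 101.0},   # +1.5
    {"pa_group": 11, "reading": 100.0},   # -1.2
    {"pa_group": 12, "reading": 105.0},   # +2.0
]

print(round(school_progress(cohort, "reading"), 2))  # 0.77
```

Exceed the estimates on average and the school score is positive; fall short on average and it is negative. That is the whole game – which is precisely what makes it so temptingly easy to model in advance.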

And that’s where the trouble started. 
Up until recently, schools were flying blind. With a new curriculum and new tests, unsure of what constituted the expected standard, and with no clear idea of 'expected' progress, schools just concentrated on teaching and tracking the gaps in pupils' learning. We even started to question the methods of our tracking systems, with their pseudo-levels and points-based progress measures. Things were looking positive. The future was bright.
 
But then we saw the checking data, and that lookup table appeared, and I produced my VA calculator, and FFT published their 2016 benchmarked estimates. Now it seems that many schools are playing a VA game: working out where each pupil needs to get to in order to secure a positive progress score, and comparing benchmarked estimates (which are no doubt too low for next year) against predicted results to model VA in advance – figures to settle nerves and satisfy those scrutinising schools' performance.
I understand that schools want a positive VA score when the stakes are so high, but we have to consider the potential risks to pupils' learning of focussing on minimum 'expected' outcomes. I am particularly concerned to hear that schools are building systems that track towards these estimated outcomes, using teacher assessment or optional tests as a proxy for expected standards – a predictor of outcomes that can then be compared against the end-of-key-stage progress estimate. I think of the ideals of 'learning without limits' and the sound principles behind the removal of levels, and wonder if anything has really changed. I also wonder if it was wise to publish my VA calculator. All those schools inevitably using it to generate estimates for current cohorts; estimates that are being entered into systems and somehow tracked towards. Am I part of the problem?
Has a knowledge of progress measures become a risk to children’s learning? 
How about we just put the blinkers on and concentrate on teaching? Look after the pennies and the pounds will take care of themselves. 
Just a thought. 