Academy Sponsor Performance Tables
This week, alongside the raw SATs results for primary schools, the government and several other organisations released performance data on academy sponsors.
The Education Policy Institute, a think tank set up by the Rt. Hon. David Laws, released its School Performance in Multi-Academy Trusts and Local Authorities paper.
The Sutton Trust, a think tank set up by Sir Peter Lampl, released the latest iteration of its Chain Effects report.
And the DfE itself released its own MAT Performance Tables.
If you click on the links you can read each report, but the first and the last rate the Elliot Foundation as a ‘not significantly above average’ performer. The Sutton Trust does not include us in its analysis because we don’t have any secondary schools.
The schools in the Elliot Foundation are currently three times more deprived than the national average, yet also three times more likely to be judged ‘outstanding’ by Ofsted. Given that, you might wonder how we can be rated ‘not significantly above average’ by these league tables.
The short answer is it’s complicated.
The longer answer is that the measure of success is heavily influenced by several factors, which we set out below.
But first, a statement of support. The statistics team at the DfE works hard to present truthful and meaningful information in a complex and politicised arena. It is immensely important in a school system to separate those who have genuinely improved teaching and learning from those who took credit for something that would have happened anyway, and from those who were simply lucky. We do not want to be gambling on our children’s future. The Elliot Foundation has been lobbying for greater transparency in the academy arena for the last four years and we welcome the publication of this work. Over time, as the tables are updated each year and the sample sizes grow, it will become much easier to identify the organisations that are best serving their children and communities.
The only things we would add are some notes on the sensitivities of the data:
The scores should be read in light of the context data for each school; to some extent this is built into the score, as schools in more deprived communities tend to have a lower Key Stage 1 score ‘on entry’.
The scores are derived from each academy’s ‘value added’ score, which is rightly deemed a fairer measure of school performance than simple brute attainment at the end of year 6; this stops a school serving an engaged middle-class community from automatically being considered better than a school serving a deprived community.
But parents should note that it only measures progress from the end of Key Stage 1 to the end of year 6; it completely ignores the really hard yards of early years teaching, where, for example, the most challenged schools often do their best work.
‘Value added’ as a measure is very vulnerable to school size; a one-form-entry primary school may have only 25 children in a year group, so one or two children having a bad day on the test makes a big difference to the school’s overall score.
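A toy calculation shows the effect. The figures and the simplified averaging here are illustrative only; the DfE’s actual value-added model is considerably more sophisticated.

```python
# Toy illustration of how cohort size affects a 'value added' average.
# All numbers are invented; the real DfE model is more complex.

def value_added(scores, expected):
    """Mean difference between actual and expected scores."""
    return sum(a - e for a, e in zip(scores, expected)) / len(scores)

# A one-form-entry cohort of 25 children, each expected to score 100.
expected = [100] * 25
scores = [100] * 25
print(value_added(scores, expected))  # 0.0 -- exactly in line with expectations

# Two children have a bad day and drop 15 points each.
scores_bad_day = [85, 85] + [100] * 23
print(round(value_added(scores_bad_day, expected), 2))  # -1.2

# The same two bad days in a 100-child cohort barely move the average.
expected_big = [100] * 100
scores_big = [85, 85] + [100] * 98
print(round(value_added(scores_big, expected_big), 2))  # -0.3
```

The same two bad mornings move the small school’s score four times as far as the large school’s, which is why single-cohort swings matter so much for one-form-entry primaries.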
The overall trust score is exceedingly sensitive to the date at which a school becomes an academy. It is not measured from the actual date of conversion but from the first September 1st after the school becomes an academy. So if a school joined Trust X on October 1st 2014, the first set of results that would count towards the MAT’s score would be those from May 2016.
This impact on the trust score lessens as trusts grow: they will benefit on some occasions and lose out on others. But when a trust is comparatively small and has a number of one-form-entry schools, it can significantly influence the scores.
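The cut-off rule above can be sketched as a short date calculation. The function name is ours, and the strict reading of “first September after joining” is an assumption for illustration; it simply reproduces the worked example in the note above.

```python
from datetime import date

def first_counting_results_year(join_date):
    """Return the calendar year of the first May SATs results that count
    towards a trust's MAT score, assuming results only count from the
    first September 1st after a school joins (an illustrative reading
    of the rule described above)."""
    # First September 1st strictly after the join date.
    sept = date(join_date.year, 9, 1)
    if join_date >= sept:
        sept = date(join_date.year + 1, 9, 1)
    # Results are taken in May at the end of that academic year.
    return sept.year + 1

# A school joining on October 1st 2014 first counts in May 2016.
print(first_counting_results_year(date(2014, 10, 1)))  # 2016
```

So a school that joins in early autumn contributes nothing to its trust’s measured score for almost two years, even though the trust is working with it from day one.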
Hugh Greenway, CEO, The Elliot Foundation, July 8th 2016