A summary of my recent analysis of the data around the SNP's performance on education appeared in today's Daily Record, along with a broader perspective and some caveats (which generally I agree with) by James McEnaney.
Unfortunately one graph appeared mislabelled in print (graph 5 is "writing" not "numeracy") and the online version gets a little confused with graphs and sources - so (with some very minor tweaks) the below is the text of the article with full source links, exhibits we didn't have room for in the paper and references to James' caveats.
*********
Judge SNP on Education? They Flunked it
"Let me be clear – I want to be judged on this. If you are not, as First Minister, prepared to put your neck on the line on the education of our young people then what are you prepared to. It really matters."
Nicola Sturgeon, August 2015
Nicola Sturgeon has asked to be judged on her record on education and the SNP has been in charge of our fully devolved Scottish education system for roughly 10 years now. Responsibility for the educational outcomes we're now seeing must lie with the SNP - and the trend in those outcomes is simply terrible, as Cabinet Secretary for Education and Skills John Swinney has had to admit.
"the statistics also show a drop in writing performance for S2 pupils which is of particular concern [..] the SSLN [literacy and numeracy] statistics are disappointing"
John Swinney, May 2016
"The figures for Scotland do not make comfortable reading [..] compared to 2012, our performance in science and reading has fallen. In science and maths we are now below the levels at which we performed in 2006, and more countries have outperformed Scotland in all three areas than at any time since PISA began." John Swinney December 2016
The Evidence of Falling Standards
Scotland participates in one international benchmark survey on educational attainment, the three-yearly PISA study (note 1). PISA surveys 15-year-old pupils, so the most recent data (for 2015) was the first chance to see how pupils who had spent significant time in an SNP-run education system were faring: by then those pupils had been in the Scottish education system for 7 years under an SNP administration.
Since the SNP came to power, in each of the three main disciplines as measured by PISA (mathematics, reading & science), Scotland's performance has declined significantly. This is true in absolute terms, relative to the whole of the UK and relative to the average for all OECD countries.
In maths and reading, where Scottish pupils used to perform significantly better than UK students overall, latest available data shows we've dropped back to average at best. In science, where Scottish pupils used to perform in line with the UK average, their performance in the most recent survey is very significantly worse. In all three disciplines, we've gone from out-performing the OECD average to being - well - distinctly average.
If you were to look at our performance in international league-table format: in all three subject areas we've dropped below the UK and OECD averages and into the bottom half of the table (note 2).
As the Scottish Government themselves make clear, "PISA is the major international study of pupil performance in which Scotland participates" - it is their chosen method of benchmarking our educational attainment.
In fact, having withdrawn from the other globally recognised assessments (TIMSS & PIRLS) in 2011, PISA is now the only data we have to compare our performance on an international basis. This is good quality data and the results are a damning indictment of the SNP's performance as custodians of our education system.
Caveat: James McEnaney argues that PISA data "is certainly not without its problems", concluding "although PISA has done good things for global education [..] it is a very long way from being perfect data". I don't dispute that the data is far from perfect, but it is the only international benchmark data we have, the trends observed are consistent across all three subject areas, and they are broadly consistent with the directional message from the SSLN data below. I don't think the data can be ignored (and I don't think James is suggesting we ignore it).
The other way we have been able to objectively assess the performance of the Scottish education system over time is the Scottish Government-run Scottish Survey of Literacy and Numeracy (SSLN, note 3).
This data provides worrying evidence of declining performance and wide (and in some cases widening) Attainment Gaps between the performance of pupils from the most deprived and least deprived backgrounds.
To illustrate, let's look at numeracy performance and the Attainment Gap for P4 (7-8 year-old) pupils. The graph clearly shows standards are dropping and the Attainment Gap is growing.
Another striking example is writing performance for S2 (13-14 year-old) pupils. Here the graph shows overall performance is declining dramatically and the Attainment Gap remains disturbingly large.
There’s a lot more in the SSLN data (note 4). For example, it shows not just a worryingly large Attainment Gap but also that only 40% of S2 pupils are judged to be performing “well” or “very well” in numeracy. Is our education system producing a generation, soon to vote, who aren't great at adding up?
Caveat: James McEnaney rightly points out that the S2 numeracy data judges pupils against S3 students' benchmarks, so I guess my cheap line on "not great at adding up" is rather harsh. He also flags concerns with the writing assessment methodology. He knows the data (and the subject) better than I do, so I'm in no position to question his judgement on that.
He does however go on to say "problems with some of the information don't change the downward trends. The SSLN data certainly suggests that the gap between the most-deprived and least-deprived children is, at best, not being closed. It also raises real concerns, shared by teachers, about literacy and numeracy in schools". I think that's a fair summary of the point I'm trying to highlight with this analysis. The full set of graphs is listed under note 4 below - so you can judge for yourself.
But there’s no danger of the SNP being embarrassed by the next set of SSLN data, because – incredibly - they’ve simply decided to stop the survey altogether. They justify this by saying that, because it’s survey based, the data doesn’t provide useful information at a school or local authority level. That’s a frankly ridiculous argument, because that was never what the survey was intended to do.
SSLN gives an overall assessment of the quality of literacy and numeracy education in Scotland, and it does that well. The Scottish Government commissioned the OECD to write a report, "Improving Schools in Scotland: An OECD Perspective". That report cites the SSLN data fully 27 times and at no point suggests the data is in any way not fit-for-purpose.
It's hard not to conclude that the SNP's decision to stop the SSLN survey is motivated by the fact that it exposes failings on their part - and that they're fearful of what the next ones might show. Do you think they'd scrap it if they thought the upcoming surveys were going to show how successful their reforms have been? To coin a rather poignant phrase in this context: you do the math.
It gets worse. The Scottish Government states "New statistics on literacy and numeracy performance will be available annually from the teacher professional judgement data collection". So not only are we losing the ability to track performance versus prior years on a like-for-like basis, we're moving from largely objective test-based measurements to subjective measures "based on teachers' professional judgements". It's almost as if the SNP are going out of their way to avoid being judged objectively on their performance.
Combine the SSLN findings with the PISA results and a clear picture emerges: on the SNP's watch, overall standards in Scottish education have dropped and pupils from the most deprived areas are the ones suffering the most.
The Case for the Defence
When grilled on this topic by Andrew Neil during a recent BBC interview, Sturgeon offered a master-class in political spin. She brushed off references to the latest available international benchmark data (reflecting 7 years of SNP control) as "from two years ago" and airily dismissed her own Government's regular survey of over 10,000 pupils as somehow irrelevant.
She even went on to argue that "there's real progress been made". To justify this she fired out a barrage of stats on exam passes and “tariff scores”, both crude measures which suffer from obvious weaknesses when used to judge the performance of our education system. There is no robust way to adjust for the issue of “grade-inflation” (lowering pass marks) and the mix of exams taken, but more importantly these are measures of the end of the educational pipeline. If, for example, there are issues with numeracy at P4 (as the SSLN survey suggests), we need to know about it now, not wait until those pupils leave school in 8 or 9 years’ time.
The other measure seized on by Sturgeon is the number of pupils leaving school for "positive destinations". This basically means not ending up unemployed or in jail - so if your kids leave school for a minimum wage job on a zero-hours contract, our First Minister will celebrate that as having achieved a “positive destination”.
Who's Really to Blame?
The SNP would of course like to blame “Westminster austerity” for these failings, but that doesn’t wash. The PISA scores show this is an issue specific to Scotland.
The reality, as revealed in the Scottish Government’s own GERS figures, is that the SNP have cut spending on education & training in Scotland over the last 9 years by on average 8% more than it's been cut in the rest of the UK. That was a political choice.
Given the education & training budget has to fund expensive flagship policies like free university tuition, it's not surprising that schools have suffered badly over this period. Primary schools have been particularly badly hit (note 5).
Action has only recently been taken to redress the balance, but the SNP’s response is too little, too late for the generation that has been disadvantaged as a result of education spending being low on the SNP's priority list.
Over the period the SNP have been in power, teacher numbers in Scotland have declined by 8% and class sizes have increased by 6% - but the trend in England is reversed, with both teacher numbers and class sizes improving (note 6).
The data is clear: the SNP - not “Westminster austerity” - are to blame for failures to invest appropriately in our Scottish education system.
Caveat: James McEnaney rightly argues that - while the data is valid - "we mustn't lose sight of the fact that it is just one part of a very large and almost uniquely complex picture". I don't disagree with that: wider socio-economic issues are certainly a factor, in particular the broader issue of poverty. I agree with James when he says "the big picture of Scottish education is about much more than just the failures of Nicola Sturgeon and Alex Salmond". It's about more than the SNP's bad choices around specific issues relating to education - but those bad choices are a significant factor.
Reflection
There can be few greater responsibilities for a party of government than to ensure its country's children receive the best possible education, to provide the next generation with the best chances to succeed in life. We know that for the SNP, educating our children will never be their top priority because - as Sturgeon herself and the party's constitution make clear - independence transcends all.
Tony Blair swept to power in 1997 promising to make "education, education, education" his top priorities in office; the Scottish education system appears to be suffering the effects of having an SNP government in power with "independence, independence, independence" as theirs.
Unfortunately it's young Scots emerging from education with reduced life-chances who are the ones paying the price for the SNP's failings today. The question is, will the SNP start paying the price themselves at the ballot box tomorrow?
*******
Notes
1. Programme for International Student Assessment (PISA)
The global benchmark standard for education is provided by the OECD's PISA:
The Programme for International Student Assessment (PISA) is a triennial international survey which aims to evaluate education systems worldwide by testing the skills and knowledge of 15-year-old students. In 2015 over half a million students, representing 28 million 15-year-olds in 72 countries and economies, took the internationally agreed two-hour test.
As the Scottish Government website explains:
PISA seeks to measure skills which are necessary for participation in society. Accordingly, it assesses how students apply the skills they have gained to the types of problem they may encounter in work or elsewhere. Pupils are assessed at the age of 15 as this is regarded as a reasonable point at which to test the impact of compulsory education throughout the developed world (most PISA 2012 participants in Scotland were attending S4).
I have used the PISA "data explorer" tool to download and play with the data that's available as a consistent time-series. Fortuitously, when it comes to judging the SNP's record, this means our data set starts in 2006, the year before the SNP came to power.
The data is survey based and provided as score averages. As with all good statistical analysis, standard error figures are given. This means our graphs can include vertical bars to show the range within which we can be 95% confident the true figure lies.
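To make that concrete, below is a minimal Python sketch (using illustrative placeholder numbers, not actual PISA results) of how a 95% confidence interval is built from a score average and its standard error, and how it can be drawn as a vertical error bar on a chart:

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative placeholder values only - NOT actual PISA results.
years = np.array([2006, 2009, 2012, 2015])
mean_scores = np.array([506, 499, 498, 491])  # hypothetical score averages
std_errors = np.array([3.6, 3.3, 3.0, 2.7])   # hypothetical standard errors

# A 95% confidence interval is roughly the mean +/- 1.96 standard errors.
ci_half_width = 1.96 * std_errors
for year, score, hw in zip(years, mean_scores, ci_half_width):
    print(f"{year}: {score} (95% CI {score - hw:.1f} to {score + hw:.1f})")

# Vertical error bars show the range within which we can be 95% confident
# the true figure lies.
plt.errorbar(years, mean_scores, yerr=ci_half_width, fmt="o-", capsize=4)
plt.xlabel("PISA cycle")
plt.ylabel("Mean score")
plt.title("Illustrative score averages with 95% confidence intervals")
plt.show()
```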
A minor irritation is that I've not been able to clarify whether the UK data includes Scotland - by implication it does, but either way it doesn't change the overall message (in fact, at worst, if the UK figures do include Scotland then they dampen the severity of the observed relative trend).
More detail on Scotland's PISA results can be found on the Scottish Government website.
2. PISA Rankings
3. Scottish Survey of Literacy & Numeracy (SSLN)
As the Scottish Government website explains:
The SSLN was a sample-based survey which monitored national performance over time in literacy and numeracy at P4, P7 and S2. It provided a snapshot of Scotland's achievement in literacy and numeracy at a specific point in time and allowed for comparisons over time to be made. The information from the survey informed the development of dedicated resources to facilitate improvements in learning and teaching.
Whilst the data is survey based, it qualifies for National Statistics certification and rigorous statistical techniques are used to ensure the data is presented with meaningful confidence intervals. As the latest report explains in helpful detail:
As in all sample surveys, as the SSLN is based on a sample of pupils rather than on the whole population, the results shown are estimates. Therefore there is an element of uncertainty within the results because the pupils sampled may not reflect the population exactly.
Uncertainty around the results is estimated using standard errors. Standard errors are a measure of the variation in the data i.e. how each observation differs from the mean. As the SSLN sample design is not a simple random sample - pupils at small schools have a higher probability of being selected than pupils at large schools - this means that standard formulae used to calculate the standard error from a simple random sample would not be appropriate. Standard errors are therefore calculated empirically using the jackknife procedure.
Standard errors are in turn used to produce confidence intervals around the estimates. Confidence intervals show the range of values within which one can be reasonably confident that the actual value would lie if all pupils were assessed. Ninety-five per cent confidence intervals for the main national estimates were calculated and were around ± two percentage points. This means that the true value of each estimate is likely to lie within two percentage points either side of the given estimate.
Where appropriate, confidence intervals are represented on charts by error bars to help demonstrate this level of uncertainty. Where the estimates are different but the error bars overlap we cannot be sure that the true values of each estimate are statistically significantly different from each other. Significance tests (t-tests) are used to assess the statistical significance of comparisons made.
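For anyone curious what the jackknife procedure mentioned above actually involves, the sketch below (Python, with made-up scores rather than real SSLN data) shows the simplest leave-one-out version of the idea: re-estimate the mean with each pupil dropped in turn, and use the spread of those re-estimates as the standard error. The real SSLN calculation applies the same principle to its full school-based sample design, so treat this purely as an illustration of the mechanics:

```python
import numpy as np

# Made-up pupil scores, purely to illustrate the mechanics of the jackknife.
rng = np.random.default_rng(0)
scores = rng.normal(loc=65, scale=12, size=200)

n = len(scores)
estimate = scores.mean()

# Delete-1 jackknife: recompute the mean with each observation left out.
loo_means = np.array([np.delete(scores, i).mean() for i in range(n)])

# Jackknife standard error of the estimate.
jack_se = np.sqrt((n - 1) / n * np.sum((loo_means - loo_means.mean()) ** 2))

# 95% confidence interval, as in the SSLN report: estimate +/- ~1.96 standard errors.
lower, upper = estimate - 1.96 * jack_se, estimate + 1.96 * jack_se

print(f"estimate: {estimate:.2f}")
print(f"jackknife standard error: {jack_se:.3f}")
print(f"95% confidence interval: {lower:.2f} to {upper:.2f}")
```

For a simple random sample like this one, the jackknife reproduces the textbook standard error almost exactly; its value is that the same recipe still works for more complicated designs like the SSLN's.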
4. Full set of SSLN graphs
Reading
Writing
Numeracy
5. Data on Spend/Pupil
6. Data on Teacher Numbers
There's an excellent fact-check by Ferret Scotland on this topic which shows teacher numbers in Scotland have declined by 4,000 since the SNP came to power - that's a 7.5% decline.
The most recent stats on school teacher numbers for England only go up to 2015: over the 2007 to 2015 period they show FTE "regular teacher" numbers increasing by 4%, or nearly 16 thousand teachers. The same period in Scotland saw teacher numbers decline by 8%.
Looking at pupil:teacher ratios back to 2007 takes a bit of work. Summary Statistics for Schools in Scotland covers the data from 2010 onwards and shows pupil:teacher ratios increasing (i.e. getting worse, as we might expect given the scale of the teacher number decline). It's a bit of a faff to go back to 2007, but I've done the work, so here it is:
I've struggled to get fully comparable stats for England, but this data suggests (as we'd expect from the teacher number stats) that pupil:teacher ratios (PTR) are decreasing (improving) in England. This table may be hard to read, but it shows the following (the implied percentage changes are worked through just below):
- LA Maintained Primary PTR decreased from 21.9 in 2007 to 21.0 in 2015
- LA Maintained Secondary PTR decreased from 16.5 in 2007 to 15.8 in 2015
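As a quick check on the "decreasing (improving)" claim, the relative changes implied by those two rows are easy to compute (the figures below are simply the ones quoted above):

```python
# Relative change in pupil:teacher ratio (PTR), using the figures quoted above.
def pct_change(start: float, end: float) -> float:
    return (end - start) / start * 100

print(f"LA Maintained Primary PTR:   {pct_change(21.9, 21.0):+.1f}%")  # about -4.1% (improving)
print(f"LA Maintained Secondary PTR: {pct_change(16.5, 15.8):+.1f}%")  # about -4.2% (improving)
```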