SATs and Zombie Statistics

Here's today's Independent story by Alison Kershaw (story link):


Hundreds of children had their national curriculum test results changed or cancelled last year following cheating or mistakes in administering the papers, a Government report shows.
Increasing numbers of cases of "maladministration" were reported to the Standards and Testing Agency (STA) following last summer's tests, according to research.
The report shows that, in 2012, 370 cases were reported, up from 292 in 2011 - a 26.7% rise.
There were 168 cases reported in 2010, although it is thought the figures may have been lower that year due to a boycott of national curriculum tests by around a quarter of primary schools, while in 2009 the figure was 346.
The figures cover Key Stage 2 national curriculum tests - known as Sats - in English and maths which are taken by 11-year-olds in their final year of primary school, English writing and science sample tests, and higher Level 6 papers which are also all taken by 11-year-olds, and the Government's new reading, or phonics, test for six-year-olds.

Four ways this story has zombie statistics (ones that are just dodgy, and destined to rise again even when debunked):

1) 370 cases. Sounds like a lot. But how many children take these tests? There are around 600,000 pupils in a year group, so 370 cases is roughly 0.06%. Put in context, that's far less than one child in a thousand (there's a quick arithmetic check after this list).

2) But this was 370 cases, not 370 children. Each child takes four papers - three English and one maths - plus some take extra English and maths papers, and some take the science and English sampling papers. Even on a low estimate, that's around two and a half million papers, so 370 cases works out at far less than one paper in five thousand.

3) But did you notice something else? "The Government's new reading test for six-year-olds". That's right: this year the number of children taking the tests roughly doubled. So if the number of cases increased, there might be a reason for that, especially as it was the first time schools had administered the controversial new test. And remember that the Independent's mealy-mouthed quotation marks around "maladministration" disguise the fact that many of these cases could have been innocent mistakes - not teachers helping children, but procedures being messed up.

4) And last but not least, our old favourite: the percentage increase that means nothing. A 27% increase on a tiny amount is still a tiny amount. If wearing red flippers doubles your risk of a shark attack, it might be unwise to wear them, but it doesn't mean thousands more people are at risk of shark munching. [It doesn't, by the way.] So a 27% increase from 292, when only a few years ago the figure was almost as high as today's (346 in 2009), is not evidence of a shocking rise, but simply year-on-year fluctuation as schools, teachers, children and, most of all, exams keep changing.
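For anyone who wants to check the arithmetic behind points 1, 2 and 4, here is a quick back-of-envelope sketch in Python. The 600,000-pupil cohort and four-papers-per-child figures come from the points above; treating them as exact, and stopping at four papers, are my own simplifying assumptions.

```python
# Back-of-envelope check of the figures quoted above.
cohort = 600_000          # roughly one year group of pupils (figure used in point 1)
cases_2012 = 370
cases_2011 = 292

# Point 1: cases as a share of children
print(f"Share of children: {cases_2012 / cohort:.3%}")    # ~0.062%, well under 1 in 1,000

# Point 2: cases as a share of papers (about 4 papers per child, a low estimate)
papers = cohort * 4
print(f"Share of papers: {cases_2012 / papers:.4%}")       # ~0.0154%, under 1 in 5,000

# Point 4: the headline percentage rise is large; the absolute rise is tiny
rise = (cases_2012 - cases_2011) / cases_2011
print(f"Rise: {rise:.1%}, i.e. just {cases_2012 - cases_2011} extra cases")
```

Run it and the headline 26.7% rise falls out of a difference of just 78 cases, spread across a couple of million papers.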

Full disclosure:

In the full version of the story (in the Independent newspaper rather than the 'i' version that I read at breakfast) you also get these paragraphs, tucked away in the "so boring you won't get this far" bottom of the piece.

In total, 584 pupils had their results changed or annulled last year, the report says, adding that this represents less than 0.1% of pupils who took part in the Sats tests.
A Standards and Testing Agency spokesman said: "Results were amended at only 58 primary schools - that amounts to fewer than 0.5% of the thousands of schools where pupils took the phonics check and the Key Stage 2 tests.
This is more like it, but it is relegated to the bottom and will not be seen by the majority of shocked and appalled school-bashers.
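The buried figures can be sanity-checked the same way. The 584 pupils and 58 schools come from the quoted paragraphs; the cohort size is the rough 600,000 used earlier, and the total number of primary schools is my own ballpark figure, not the article's (it only says "thousands").

```python
# Rough check of the figures the Independent buried at the bottom of the piece.
pupils_affected = 584
cohort = 600_000                 # rough year-group size used in point 1
print(f"{pupils_affected / cohort:.3%} of pupils")            # ~0.097%, consistent with "less than 0.1%"

schools_amended = 58
primary_schools = 16_500         # assumed ballpark; the article only says "thousands" of schools
print(f"{schools_amended / primary_schools:.2%} of schools")  # ~0.35%, consistent with "fewer than 0.5%"
```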

For lots more zombie statistics, in podcasts from a downloadable database of statistical chicanery, visit the BBC More Or Less website, or follow Tim Harford or David Spiegelhalter on Twitter.
