Tuesday, May 4th
The more time I spend in education journalism, the more often I am surprised by the amount of misinformation that flies around in support of particular views.
The latest example can be found in yesterday’s Guardian, in an article by Peter Preston, its former editor, in which he criticised members of the National Association of Head Teachers over the union’s forthcoming boycott of this year’s key stage 2 tests. I have already written a reply to a previous piece by Mr Preston, published in the Guardian’s Response column last October, in which he defended targets across the public sector and pledged support for the national tests.
But I wanted to respond to this latest piece as well, as it again features some misconceptions, along with some basic factual errors.
First, he says that, because of the refusal of probably thousands of heads to take part in the tests in English and maths this year, there will be no “useful league tables available” to parents wanting to choose a school.
Well, it is a moot point whether the current league tables are “useful”, at least in performing the function they are supposed to serve: helping families gauge the quality of a school. As anyone connected with education must surely know, the biggest influences on a school’s league table position are the characteristics of its pupils, rather than anything the school does. Unadjusted data say little about teaching quality, and using them to suggest that simple comparisons can be made is highly misleading. Also, of course, it will still be possible to construct league tables this year, as teachers’ own judgements of pupils’ performance are being published.
Second, he says that “no reliable guide on national performance” will be available, with Sats not taken in a proportion of schools. That statement should not be true, this year or in the future. This year, even if only half of pupils were able to take the tests, it should not be beyond the wit of government statisticians to use this large sample to generate an overall picture of national standards. (They will be able to work out whether those not taking the tests are from, on average, high- or low-performing schools, and adjust the total results from those who do submit scores accordingly.) In future, of course, it would not be necessary to have every pupil take a test in order to generate national data; in fact, setting a test to only a sample of pupils, with no school-by-school results generated, is precisely the approach being taken to generate national data on science performance this year. It has also been a feature of other countries’ measurements of national standards for decades.
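The adjustment described above — reweighting the schools that do submit scores so that the sample matches the profile of all schools — can be sketched roughly as follows. This is a minimal illustration of the general idea (a simple post-stratification by some band of prior attainment), not a description of any official methodology; the function name and the grouping variable are my own assumptions.

```python
# Hypothetical sketch: estimate a national average from a partial sample
# of schools by reweighting each prior-attainment band so it counts in
# proportion to its share of ALL schools, boycotting or not.

def adjusted_national_average(all_schools, submitted):
    """all_schools: list of (school_id, band) covering every school.
    submitted: dict mapping school_id -> mean score, for schools
    that actually ran the tests.
    Returns a national average reweighted by band."""
    # How many schools fall in each band, across the whole population
    band_counts = {}
    for _, band in all_schools:
        band_counts[band] = band_counts.get(band, 0) + 1

    # Collect submitted scores within each band
    band_scores = {}
    for school_id, band in all_schools:
        if school_id in submitted:
            band_scores.setdefault(band, []).append(submitted[school_id])

    # Weight each band's sample mean by its population share,
    # renormalising over the bands that appear in the sample
    represented = {b: c for b, c in band_counts.items() if b in band_scores}
    weight_total = sum(represented.values())
    estimate = 0.0
    for band, count in represented.items():
        scores = band_scores[band]
        estimate += (count / weight_total) * (sum(scores) / len(scores))
    return estimate
```

So if, say, low-performing schools were over-represented among the boycotters, their bands would still count at their true population weight rather than at their (smaller) share of the sample — which is the whole point of the adjustment.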
Third, he writes, “the only victims of this melee are 11-year-olds who’ve worked hard at reading, writing and maths for years – and now won’t know where they stand”. This, I would submit, is utter nonsense unless one holds that, without the current system of national testing and monitoring, teachers would be unable to decide for themselves what level each of their pupils has reached. While schools whose heads are taking part in the NAHT action will not generate test results to send to the government, it is far from true that pupils will not know their levels. Schools that want to put their pupils through a test can set them a past paper. This result could be used to generate a level for a pupil. Or it could inform teacher assessment. Or the school could choose not to set a child a test, but still give them a level based on their teacher’s own judgement of their progress over four years. I would be surprised if any schools respond to the boycott by not telling parents how their children are doing. Mr Preston may feel this information is “tainted”, because it does not go through an external marking process overseen by the government. If that is so, he should just say that he doesn’t trust teachers to reach their own judgements on how pupils are doing, or to form a professional view on what kind of assessment experience is best for the child.
The fourth point is more basic. Mr Preston says that “standards in Scotland and Wales have slipped since testing there stopped”. Erm, Peter, there has never been an English-style testing system in Scotland, so I’m not sure where you’re getting this information. In Wales, it is also unclear where your information comes from. The Principality did indeed fall slightly in the last major testing survey in which it took part (the Trends in International Mathematics and Science Study, or TIMSS). But that performance can say nothing about the effects of Wales’s decision to scrap English-style national tests in 2005. TIMSS tested two cohorts of children: 10-year-olds and 15-year-olds. The 10-year-olds had yet to go through year six, which is the year most affected by pre-test preparation under the old, English-style system. The 15-year-olds went through the old, English-style tests, so their performance can say nothing about Wales’s decision to adopt a different model.
Really, this debate needs to be better informed.