National Audit Office report finds test results “not good enough”

Hmm, I thought to myself on reading the headlines about this report, produced last month by the National Audit Office. So where’s the news?

On closer inspection of the report, it turns out that this review of primary maths teaching had come up with the hardly staggering conclusion that test results have hit a plateau, despite billions of pounds of investment.

Well, you don’t say? I am tempted to put in a Freedom of Information request to the NAO to try to find out how much money was spent producing this finding, which could have been reached by anyone who knows where to look on the Government’s statistics website (or reads the newspapers when the test result figures are published every year) and can lay their hands on spending figures from the Treasury. Yet the report says eight people were involved in producing it. As well as a literature review and a review of “national performance data”, the team visited and surveyed 28 primary schools.

Actually, there is a serious point here. There is supposed to be an accountability system governing public services. Documents such as this, by a body which reports back to the Public Accounts Committee, are a part of that.

But the form this takes, and the conclusions it reaches, can be remarkably shallow. For example, buried away in the report – it does have more interesting findings, which I’ll come to – is an observation (again hardly surprising, of course) that schools are focusing on test preparation in year 6. Schools are spending £1,000 to £3,000 on additional support staff for this year group, and 40 per cent of those surveyed “estimated that they spent more than 60 per cent of their teaching time preparing for the key stage 2 tests”. Ofsted and others, said the report, had suggested that there was too much emphasis on “intensive provision” in year 6, rather than developing more “lasting” styles of mathematics teaching.

The report even quotes approvingly a move by the National Strategies to address this by providing teaching materials focused on years 1 to 5, rather than year 6.

Yet nowhere in the document is there any acknowledgement that there might be a conflict between concern about intensive test preparation and the finding, voiced much more prominently in the executive summary, that test results are not good enough and must improve.

Not only that, but at the end of the report there is the acknowledgement that some secondary schools do not trust the primary test results. This scepticism does not extend to the authors of the report, judging by its main findings.

In fact, there is no attempt to cross-check whether the apparent position on maths standards documented by the test statistics is accurate. One glaring problem suggests itself immediately. The report states unquestioningly that the national numeracy strategy “helped to improve test results at key stage 2”. Yet it fails to address the fact that the biggest improvement in the test statistics, a 10 percentage point increase in 1999, occurred the year before the strategy was introduced. Since then, results have improved only slowly. Meanwhile, of the three tested subjects, the largest improvements in primary test results since 1997 have occurred in science, where there was no comparable strategy.

Perhaps more fundamentally, the report does not attempt to find any alternative measures. Do the test results tally with other evidence? Plenty of such evidence is available, not least from international testing surveys. TIMSS, an international study of attainment among 10- and 14-year-olds in 59 nations which sets its own assessments, published the latest of its four-yearly studies this week, with findings showing that English maths results had improved faster between 1995 and 2007 than those of any other country which had taken part in both years. Of course, this finding was not available to the NAO team. But they could have looked at the previous report, from 2004, which also showed English primary pupils’ maths scores improving dramatically in the years since 1995. These results also should be scrutinised more closely, of course. But they do seem worthy of consideration.

The impression given by the report, though, is that no checks were needed, since the national tests provide the only measure that matters. Not only that, but performance in the tests is simply taken in the report’s main findings as synonymous with mastery of mathematics, as witnessed by the use of the phrase “performance in primary mathematics” without any need to specify that this relates to test scores.

I could go on – I am tempted also to criticise the report for its uncritical use of Government jargon, including phrases such as “step change” and “regional field force” (eh?) – but I had better not or we’ll be here forever. There are interesting nuggets later on in the report, including a suggestion that less able pupils can become confused by learning maths in different ways at school and at home, but these tend to get overwhelmed by the emphasis on the national statistics as, effectively, the be-all-and-end-all measures of teaching quality.

The contrast with, for example, the multi-layered approach to understanding what is going on in primary schools undertaken by Robin Alexander’s Primary Review, or the depth of recent reports from the Children, Schools and Families Select Committee, is stark. If I were giving this NAO report a numerical rating, it wouldn’t be high.
