Friday, March 25th, 2011
I have been interested in two debates in English education for several years now.
One starts along the lines: “Standards are not high enough. We need to hold our schools to account properly so that they improve exam results for all young people, who so desperately need better grades. We also need to use results data to target our efforts to help pupils do better.”
The other says: “Schooling driven by performance indicators is creating a whole host of negative consequences, which go to the heart of pupils’ educational experiences.”
I often feel like these debates take place almost in parallel, with very little acknowledgement of how they are inextricably linked, and little attempt at communication between the two. Frequently, it seems as if people accept the force of either argument, without realising that they are so closely related as to be almost two sides of the same coin.
This has been going through my head again in recent weeks, as I come across yet more evidence supporting the second of those statements. As always, it’s not necessary to look very hard for this stuff; it just keeps coming.
Exhibit one was a report on history teaching in English schools by Ofsted.
Now, it’s important to get the context right here: the Ofsted report went out under a press release headline: “History a successful subject”, and offered plenty of support for the way the subject is taught.
For example, it said “history teaching was good or better in most primary schools” among those inspected in this programme from 2007 to 2010, and that “history was successful in most of the secondary schools visited because it was well taught, notably in examination classes at GCSE and A level”.
In secondary schools, it said: “the large majority of…history teachers were very well-qualified. In the large majority of the schools visited, the quality of the provision also reflected the strong leadership of the history departments”.
It added: “The subject knowledge of the specialist history teachers in the secondary schools visited was almost always good, often it was outstanding and, occasionally, it was encyclopaedic. Inspectors found so much good and outstanding teaching because the teachers knew their subject well.”
These central findings might have been lost on readers of some media coverage when the report came out.
So we are talking about a generally successful subject, taught by enthusiastic professionals. However, within that generally positive context, there were several instances offering more evidence of the dangers of “Education by Numbers” – ie exam-results-oriented practice being adopted which, the report would suggest, is bad for underlying learning.
First, probably the starkest negative finding in the report related to exam-endorsed textbooks. Ofsted said:
“In recent years, more textbooks have been written specifically for the examination course specification, both at GCSE and A level. The textbooks, often written by the chief examiners for the courses, are generally endorsed by examination boards and gain the status of a ‘set text’. The history teachers in the schools visited were well practised in supplementing these with additional materials as necessary. However, it was clear that, at A level, the mushrooming of course-endorsed and linked textbooks was having a negative impact. They stultified teachers’ thinking and restricted students’ progress. The weaker students relied on the textbook as being sufficient preparation for the external examinations and were less willing to read beyond the ‘set textbook’. Their written and oral work revealed how their understanding of the topics they studied was narrowed. It also meant that students were not as well prepared to meet the challenges of higher education where independent learning and extensive reading were required.”
Damning stuff, I thought. To put this in almost-punchy soundbite terms: “Exam-endorsed textbooks – in the way they are sometimes used – are stultifying teachers’ thinking; restricting students’ progress; narrowing their understanding”.
Second, the report offered more evidence of pupils being steered away from history because of “league table” concerns (I put the term in quotes because it is often shorthand, I think, for wider hyper-accountability/results pressures). It said: “In some of the schools visited the students were restricted in their subject options at GCSE and some had been steered towards subjects which were seen to be less demanding than history.”
It continued: “Entry level…is intended for students who find GCSE too demanding. However, the declining number of students taking this examination reflects not only a lack of confidence that entry level meets the needs of those for whom it was intended, but also decisions by curriculum leaders to avoid a course that does not contribute significantly towards their school’s attainment profile”. [my italics].
Ah, OK, so this could be summarised: “Pupils steered towards certain subjects because of a school’s need to improve its figures”.
Third, the report offers criticisms of the move towards completing key stage 3 in two years, rather than three. I think this is related to the themes of “Education by Numbers” because I believe some of the calculation of schools in making this move is to increase the time they have to focus on raising GCSE performance. That, of course, can be seen as a rational move to make, from the individual pupil’s point of view, if it helps secure better grades. But hyper-accountability/ “league table” calculations are also likely to be a factor. This means, of course, that a child not going on to study history GCSE will stop at 13, rather than 14. And Ofsted’s criticism of this is on educational grounds, although admittedly from a subject-specific point of view, that of a specialist history inspector.
Among the quotes in the report on this are: “The national curriculum orders and programmes of study in Key Stage 3 have led to much high-quality teaching and learning in history. However, in one in five of the secondary schools visited, curriculum changes, such as the introduction of a two-year Key Stage 3 that allowed some students to give up history before the age of 14, and thematic approaches to the curriculum, were associated with teaching and learning that was no more than satisfactory.” [I know the use of “no more than” satisfactory will jar, at least to a teacher audience, since my dictionary defines satisfactory as “fulfilling expectations or needs”, but you get the point].
“In 14 of the 58 secondary schools visited…whole-school curriculum changes [including a two-year KS3] were having a negative impact on teaching and learning in history at Key Stage 3.”
It goes on: “In England, history is not currently compulsory for students beyond the age of 14 and those in schools offering a two-year Key Stage 3 course can stop studying history at the age of 13. England is unique in Europe in this respect. In almost all the countries of the European Union, it is compulsory to study history in some form in school until at least the ages of 15 or 16.”
The report also points out that children do not get access to specialist history teachers in primary school, meaning that some will only have specialist teaching in the subject for two years of their school careers.
So that would be: “some pupils are only getting two years’ history teaching in secondary school, and this is not a good thing”.
Fourth, there were problems with teachers’ professional development – because of an over-reliance on exam-specific training – in some schools. “Access to training for history was an increasing concern for all teachers…In 28 of the 64 secondary schools visited in which this aspect was specifically inspected, access to subject training was only satisfactory and in 10 of the schools it was inadequate. In one in every five schools visited, training by the examination board was, and had been for several years, the only type of out-of-school subject-specific professional development for history teachers.” [my italics]
It goes on to say that this training, in schools doing it well, was only one of a number of approaches.
So that would be: “In one in five schools, teachers’ only professional development is geared to teaching towards particular exams”.
Fifth, the inspectors found that, in “a minority” of primary schools, foundation subjects such as history had been “squeezed”. “In year 6 in particular, teachers said to inspectors that the foundation subjects were ‘not a priority’”. Year 6, of course, embraces the run-up to key stage 2 tests.
So I think it’s fair to infer the following statement from this: “Some primary schools neglecting subjects such as history in drive to raise test scores in English, maths and science.” Ofsted points out, of course, that some schools don’t do that and still get good results, but I don’t think the statement above is unfair.
Ok, that’s probably enough for now, from this document.
Exhibit two comes from discussions at last week’s meeting of the Advisory Committee on Mathematics Education. Andrew Hall, director general of the AQA exam board (England’s largest GCSE and A-level board), worried about a sixth issue which I think is related to “Education by Numbers”: pupils being entered early for GCSE subjects.
This is an issue on which Mr Hall has spoken of having concerns before. At the conference, he documented a rise in the number of maths GCSEs taken by “15-year-olds and younger” [I take it this means those in years 10 and below] from 32,908 in 2008 to 83,179 in 2010. The latter figure represented some 11 per cent of the total entry for maths last year, he said.
He said: “That’s an almost three-fold increase in three years. I absolutely expect, from what I’m seeing in entry patterns this year, to see a significant increase again in 2011. Not just in maths; English has seen the same pattern.”
“I think there are some serious causes for concern here…For some students, early entry may be a really good thing. Those students who are particularly strong performers, and who will continue to get good maths education, may benefit. But my question is: what about the others?
“Are the pressures of league tables pressurising teachers to enter students early to bank a grade C, in order then to focus on those who did not get there? Is this going to impact on students post-16? I venture to suggest it will, but we need to get the evidence.”
He added: “This is not just about maths. Across a whole range of specifications, we are seeing students entered early. It may be the result of modularisation.” [More GCSEs have become modular in the last few years].
One audience member, who said he worked with “gifted mathematicians”, said some were put off persisting with the subject if they took it early and then had a gap without maths before entering the sixth form.
To be clear, I don’t think I’m qualified to judge whether early entry is always a good, or a bad, thing for a pupil. I took maths exams a year early myself throughout the latter years of secondary school, but that was a judgement made by my school on its merits, for me, since league tables and other institutional results pressures did not exist in the 1980s.
My concern, which clearly is shared by Mr Hall, is that results pressures for the institution, rather than the long-term learning needs of the pupil, may be playing a large part in decisions. This calls to mind an article I wrote last year in which there was evidence of a school entering pupils for GCSE maths and English early in order to attempt to “bank” a C grade, then removing them from that class if they achieved it in order to concentrate on achieving a C in other subjects. This, a former member of staff at the school told me, simply neglected to consider any need to give the pupil a chance to score higher than the C in the summer exam, since the C grade was all that mattered to the school’s headline statistics.
I also remember a head telling me, a few years back, that modular exams were far better because they afforded the school greater control over the eventual result. She could not, she said, take the risk of any surprises in the form of children underperforming on the big day and thus dragging the school’s published numbers down. And this was from someone highly sceptical about results-driven hyper-accountability. Again, this is not an argument for or against modules, but a suggestion that results considerations for the school are influencing decision-making.
So, Mr Hall’s concern could be summarised: “Results pressures on schools may be helping to push pupils towards being entered for exams early. I’m worried it’s not in all of their long-term interests.”
Mr Hall was also pressed at the conference, by an academic from Southampton University, on two points also mentioned in Ofsted’s history teaching report. Specifically, was he concerned about exam board-endorsed textbooks, and what about teachers only getting professional development through courses run by exam boards targeted at improving pupil performance in particular exams?
On textbooks, Mr Hall seemed, I think, to be acknowledging the issue and suggesting there might be ways of addressing it by seeking to employ senior examiners in-house at the awarding body. (Currently, all examiners, I think, are employed on a freelance basis. Mr Hall seemed to be suggesting that part of the issue was that they needed to supplement this income by publishing textbooks).
He said: “There is this thing called restraint of trade. Can we prevent someone who is an examiner, off their own bat, writing a textbook?”
He added: “I think there are some issues around the ways in which awarding bodies choose to engage the people who work as examiners. We are looking at what’s the right mix for us. Do we want to move some of those people into the organisation so that we can reward them more appropriately so that they do not need to do those things?”
On the training courses, he said: “One of the things I believe very strongly, as an organisation, is that we should not just offer help to teachers for our own specifications. We are using our charitable status to try to offer developmental opportunities across a broader spectrum.”
One final exhibit would be last month’s Wolf review of 14-19 vocational education, which suggested that schools were being pushed towards non-GCSE exams because of the worth of some of these qualifications to the institution for “league table” purposes (shorthand, again, for league tables and the rest of the results apparatus). I’ve blogged about that here.
A summary of one of Wolf’s concerns might be: “Pupils pushed towards GCSE-equivalent qualifications which might not help them in the long term because of the weight these qualifications are given in league table indicators.”
Right, to sum up, then, I would come back to the two debates mentioned at the top of the piece. Because these two competing priorities aren’t often enough expressed directly against each other, I think it would be interesting to imagine a hypothetical conversation between two people – each supporting one of the viewpoints mentioned at the top – and using evidence uncovered here.
The conversation, I think, might go something like this:
First person: “You know, we really need results pressures on schools of the current type because, without them, teachers would just let down pupils with poor teaching.”
Second person: “But look, you can see evidence here that teaching to exam-endorsed textbooks is damaging at least some pupils’ history lessons and leaving them underprepared for higher education.”
1st person: “I know, but we need these results pressures in schools.”
2nd person: “But we have evidence that some pupils are being steered away from taking history, even though they want to take the subject, because of results pressures in schools.”
1: “I know, but we need these results pressures in schools.”
2: “But schools are moving towards a two-year KS3, partly because of the pressure on them to improve their GCSE results, and inspectors of history, [and the Historical Association, as it happens], think this is limiting many pupils’ experience of the subject in a way that does not happen in other countries.”
1: “I know, but we need these results pressures in schools.”
2: “But we know that in one in five schools visited, teachers’ only professional development is offered by exam boards in relation to particular exams.”
1: “I know, but we need these results pressures in schools.”
2: “But we know that in some primary schools, subjects such as history and geography are marginalised in the year leading up to the national tests.”
1: “I know, but we need these results pressures in schools.”
2: “But we know that hundreds of thousands of pupils are being entered early for GCSEs, and there are worries that this might be educationally less than ideal, but informed by results calculations on the part of schools.”
1: “I know, but we need these results pressures in schools.”
Results pressures seem to be the rock on which our education system is now built, with any other consideration seemingly having to negotiate its way around them.
OK, I caricature this debate a bit, though not that much. Ministers of both this and the previous government will say that actions proposed through, for example, the Wolf review or Labour’s 2009 assessment inquiry, which scrapped KS2 science tests, have offered a case-by-case approach to mitigating the problems, and Mr Hall clearly appears to be taking them seriously.
Some will say that the three “exhibits” reported on above are not the only perspectives worth considering, either. (I want to look a bit more closely at what I think are contradictions in Ofsted’s own approaches on some of these issues, but that will have to wait a bit).
As ever, I’m not interested in blaming schools for what might be perceived as the more negative aspects highlighted by Ofsted, Mr Hall and the Wolf review, but just questioning whether this system as a whole – including the signals it sends to teachers about the overwhelming emphasis to be placed upon exam success on the measured indicators – is supporting good learning, or not.
Given the number of issues documented in just a few weeks, I wonder why we still aren’t linking these two debates properly, and looking more fundamentally at the real impact, for good or ill, of statistics-driven schooling.