Archive for March, 2011

…he’d say something about academies’ English Baccalaureate results

Monday, March 28th, 2011

Last autumn, Michael Gove appeared on the BBC’s Question Time and launched a passionate attack on what he claimed was a glaring injustice within English education.

Children from disadvantaged backgrounds, he suggested, were being let down by a system which assumed they could not succeed in traditional academic subjects.

He said: “If you look at what happens in France, or in Holland, or in Canada, or in Singapore, or in Hong Kong, or in any of the countries which have got education systems many of which are much better than our own, they expect children at the age of 16 to have a rounded education.

“[This] means they are fluent in their own language…[and expected to] master the sciences, to study a humanities subject like history or geography, which build human sympathy. That’s the rounded education they expect.

“And the problem we have had in this country, as an historical problem, is we have automatically assumed an academic education is only for a minority: only 25-30 per cent of people can succeed. 

“Well, that is rubbish.”

“All of us are facing an educational challenge in this country,” he continued. “How can we ensure that we end the patronising twaddle of the last 30 years that assumes that just because kids come from working class backgrounds, they cannot succeed in academic subjects?

“With my background, I am determined to ensure that people have that chance. And when people say ‘oh, you are demoralising children because they cannot succeed’, what I hear is the next generation being written off because we do not have high aspirations for them.

“One of the reasons I am in politics is to make sure that we transform our education system so that kids who have been written off in the past at last have the chance to succeed.”

Well, all this was greeted enthusiastically by some commentators.

But is this passion real?

If so, you have to wonder why Mr Gove has not taken a much closer interest in what has been going on in his favourite type of school: academies. His lack of interest might suggest his emotion is synthetic. Or to be more charitable, when a seemingly heartfelt desire to do what he thinks is the best thing by working class pupils runs up against the demands of political ideology, ideology wins.

Before going into the detail on academy results, I should state something now.

It is this: I am a “what works” type of person.  I don’t like ideology, or the idea that something should be implemented because it fits a theoretical schema or model of how things ought to run best. This means I’m not one to dismiss any type of organisation of schooling out of hand. I am, as might be guessed from the length of some of the blogs on this site, a details person.

Any consideration of academies, then, should be carried out on the basis of as full as possible an understanding of the effects of these new schools across a local area. Academies are sometimes sold on the basis that their governance supports innovation, and that they have brought dynamism to England’s system. Evidence on this should be weighed against that relating to other arguments, including the financial implications of these new school arrangements, the impact of academy freedoms on equity, their effect on teacher recruitment and retention, their effect on local admissions and the interaction with accountability to local people. Above all, we should try to get an understanding of the detail of what has happened in academies and other schools in their localities.

The trouble is that, in my experience, we never get this fair reckoning, because the politicians who shape the debate, in both this government and the last, are so committed to the policy, preferring it as a structure for running schools over the traditional model of state education, that they do not present the evidence even-handedly. GCSE results press releases have consistently highlighted academies’ results as better than those of non-academy schools, on the basis of faster average improvements in academies on the main GCSE performance measures, when there are other ways of looking at what has been going on, and even though basic questions, such as whether the pupil make-up of each academy has changed compared to its predecessor school(s), are not addressed in the statistics. Remarkably, this cheerleading presentation of academy results has continued even after the publication of results under Michael Gove’s new “English Baccalaureate” measure.

In January, league table results which saw Mr Gove introducing that new performance indicator – the English Baccalaureate – were revealing, although not surprising, in what they documented about the statistics of academies.

These schools, usually set up through a contract agreed between a sponsor and central government, had long been said by Mr Gove and his Labour predecessors to be improving their headline GCSE results at well above national average rates.

This was based on the main figures published under Labour: the proportion of their pupils achieving five A*-Cs including English and maths, in GCSEs or vocational equivalents.

But the “Baccalaureate” figures, which ranked schools on the proportion of their pupils achieving good GCSEs in not just English and maths, but also two sciences; history or geography; and a language, painted a very different picture. Many academies were right at the bottom of the English Baccalaureate league tables, with nearly a third of those with results to report recording zero per cent of their pupils achieving this new benchmark.

Some three quarters of the academies with results to publish had five per cent or fewer – that is one pupil in 20 – achieving the EBacc, compared to a national average figure that had 16 per cent of the cohort achieving the new benchmark. At all but 24 of the 187 academies with results, performance on the EBacc was below 10 per cent.

The press release put out by Mr Gove’s department – which would be very worried about the situation in academies, you would expect, if it shared his concern about pupils missing out on a broadly academic education as he defines it – said nothing about these statistics.

Instead, it only mentioned academies in reference to the old measure, proclaiming that: “Academies continue to show improvements in getting five good GCSEs (or iGCSEs or equivalents) including English and mathematics at a faster rate of 7.8 percentage points compared to other schools, which improved by 4.5 percentage points.” Government comment on schools’ results in the EBacc focused, then, on the 16 per cent figure for schools as a whole, which was seen as low and might underscore, in the public mind, a view that radical change – including academy status – was needed. In the press release, this seemed to be underlined by the inclusion of comments from two academy managers without, again, reference to academies’ EBacc results.

Many have said, and might argue here, that the retrospective EBacc measure is unfair on schools. But it seems to me that someone introducing this measure simply out of a desire to highlight the performance of schools in the subjects it contains, without prejudice towards any particular type of school, would have used the EBacc results to include at least a heavy element of caveat in what was being said about academy results overall. Yet the spin – or a desire to present academies as always better than other state schools – seemed to be taking over.

In recent weeks, the TES has been following these results up with stories: first, that only six per cent of pupils in schools run by the government’s three favourite academy companies achieved the English Baccalaureate this year, compared to the national average of 16 per cent.

The government, or backers of academies, including those teaching in them, might respond here by saying that this comparison is unfair. Labour’s academies, which were the only ones open and able to provide the GCSE figures on which these statistics are based, were set up mostly in disadvantaged areas, with challenging intakes, so it would not be right to try to compare them to the national average, which will include schools with more middle-class pupils.

But that, as Mr Gove’s comments to Question Time should make clear, is not a defence open to him if he wants to say what is going on in academies is not significant. For he has argued that we need to have high aspirations for good academic achievement for children from disadvantaged backgrounds. All schools should be getting good results according to his new benchmark, and the results of academies were especially concerning, you would expect him to say if he was being consistent.

Even more damningly, the TES also produced statistics claiming that academies actually fared worse, not just than the national average English Baccalaureate figure, but when compared to non-academy schools with similar intakes.

“No pupils gained the English Baccalaureate in 31 per cent of the academies that entered pupils for GCSEs and their ‘equivalents’ last year,” it said.

“But only 17 per cent of non-academy comprehensives and secondary moderns with the same proportions of pupils on free school meals with special educational needs completely failed to score on the EBac.” [My italics].

Going slightly further up the league tables, the TES found that 73 per cent of academies achieved less than five per cent on the EBacc measure, compared to only 55 per cent of non-academy comprehensives and secondary moderns with comparable numbers of special educational needs and free school meals.

Not once, though, as far as I am aware, has Mr Gove made any comment about this disparity. It raises the obvious question: is something peculiar going on in academies which is producing these numbers? Mr Gove doesn’t appear to have looked very hard for an answer.

Indeed, in a letter to academy principals last month, after the EBacc results had been made public, Mr Gove began: “The Academy programme has already proved itself an exciting, powerful and dynamic force for higher standards in our schools.”

He added: “Sponsorship has been key to transforming some of our most challenging schools bringing added drive, vision, resources and expertise, to create a culture of higher aspiration.”

Although the letter talked about the importance of getting all schools above a new “floor standard” of 35 per cent or more of pupils achieving five or more GCSE A*-Cs including English and maths, there was no mention anywhere within it of academies’ results in the EBacc.

Now, if Mr Gove were really concerned to look without prejudice at the effects of government policy in particular types of schools, he might also have wanted to consider a report produced in 2009 by the Historical Association, which contained some very interesting statistics, relevant to academies and other schools, on the exposure of pupils to one academic subject Mr Gove has been very concerned to emphasise.

This report, based on a survey of 644 schools including 23 academies, found that only 59 per cent of academies taught history as a discrete subject in year seven, the lowest of any of the four categories of schools. (The others were non-academy comprehensives, grammars and independents). Some nine per cent of academies had a two-year key stage 3 curriculum, allowing pupils to drop history at 13, compared to six per cent of comprehensives, three per cent of grammar schools and one per cent of the private sector.

Nearly 48 per cent of academies reported that year seven pupils spent an hour a week or less on history, compared to 30 per cent in comprehensives, 12 per cent in grammars and seven per cent in independents. There was a greater spread of teaching time in comprehensives, however, with 38 per cent likely to devote more than 90 minutes to history a week, a higher figure than for grammar and fee-charging schools. “The academies remain the least likely to give such generous allocations,” said the report. “Less than 20 per cent of them thought it worth investing more than 90 minutes a week in the subject.”

Academies also seemed to be reducing the time allocated to the subject faster than other types of school. More than half reported that the time devoted to it in year seven had dropped since the previous year, compared to one third of comprehensives, while time reductions in years 8 and 9 were also most widely reported in academies (35 per cent, compared to 20 per cent for comprehensives).

In terms of GCSE history numbers, academies were the only type of institution where greater numbers reported a decrease in entries for the subject (33 per cent of academies, compared to 17 per cent of comprehensives) than an increase (19 per cent of academies, compared to 27 per cent of comprehensives).

The report also found extensive evidence, with no particular type of school mentioned, of history struggling for GCSE numbers in the face of competition from other subjects, with vocational qualifications “which in many cases lower-attaining students were being compelled to take” mentioned in a quarter of cases.

The report includes the following quotation, not mentioning what type of institution was involved. “Students have been deliberately denied an opportunity to study history by forcing them down vocational or academic pathways. GCSE students have also been taken off courses against their wishes to do BTEC qualifications in six months so that the school can boost its position in the league tables. This has happened to students who were otherwise on target for a C/B in history but were doing badly on their other optional subject.”

 It quotes a teacher, from an academy, saying: “History is seen to be too academic! Entrance to the course is based on Fischer Family Trust predictions, and students who are predicted lower than a B are not allowed to study the course…We are also not allowed to run ‘entry level’ GCSE courses for students with specific needs, as that is not thought to be meeting the attainment targets for the academy.”

Education by Numbers indeed, in both cases, and seemingly classic examples of the need of the institution to raise its statistics being put above individual student concerns, at least as these history teachers see it.

Ofsted’s report on history teaching in primary and secondary schools, published this month, also documented lower numbers taking history in academies. It found: “Entries for GCSE history from academies were significantly lower than for maintained schools overall,” at 20 per cent of students in academies compared to 30 per cent for non-academy state schools (and 48 per cent in fee-charging independent schools).

In 2009, the TES reported on a study by academics at the universities of East Anglia and Southampton, which also found results pressures to be a heavy influence on schools’ decisions over history.

The academics are quoted as saying, with no particular type of school identified: “Pupils’ interests were not necessarily put first. For the senior leadership team in some schools, the first priority was the school’s examination profile.”

Beneath the TES story, there was the following comment:

“I used to work in an academy in London, and as I was leaving I had to rank every pupil in year 8 as an A, B or a C. A means that they could get an A or a B at GCSE. Therefore history appeared in their option forms. The B category were pupils who were borderline C/D. The C meant that they were predicted grades G to D. Neither categories B or C had history on their option forms! They were encouraged to take other less rigorous subjects.

“Even though I had known students previously predicted Ds and Es get outstanding results, who went on to do exceptionally well at A-level, and some even went on to do history at university.

“What was most upsetting was the case of one student, with a range of learning difficulties. He loved history, and orally he was phenomenal. He was put in category C, and was therefore being guided down a different pathway. He was devastated that he would not be able to take history in year 9-11. His mother rang the school, and explained that it was likely whatever course he was entered into, he would be unlikely to either pass or do very well in, so why couldn’t he at least take a subject he enjoyed?

“The plea fell on deaf ears and the boy was placed in some random BTEC or GNVQ course taught by some bland paper pushing academy drone who was being shipped in to ‘sort’ the school out of failing pupils and failing teachers.”

The notion of pupils being forced into taking subjects for the good of the school’s statistics reminded me of a conversation I had in late 2009 with the parent of a child at a school which had just converted to be run by the Harris Federation of South London Schools.

I was following up on a story in the local paper on the anger of the mother, Moira Macdonald, that her daughter, studying at Harris Academy Purley, near Croydon, had been forced to take a sports BTEC worth two GCSEs.

The academy replaced Haling Manor school, which was under pressure because its headline results were below the Labour government’s “floor targets”, at short notice in September 2009. In May of that year, Dan Moynihan, chief executive of the Harris Federation, wrote to parents saying that students would be “required” to do a BTEC in sport.

In the old school, French had been compulsory, but it became an option at the academy. The new academy’s options structure allowed pupils to take up to two optional GCSEs alongside English, maths, science, enterprise, religious studies and the BTEC sports course. The BTEC sports course (worth two GCSEs) would be taught in only three periods a week, said its options booklet, rather than the five the school was devoting to maths GCSE, which is worth one.

The parent, Moira Macdonald, told me her daughter had opted for geography and history, and had therefore had to drop French, even though she would rather have kept it than take the sports BTEC, because she had no interest in pursuing a career in sport.

Ms Macdonald said: “The academy is promising massively improved results and I am not surprised considering they are making soft subjects compulsory and dumping hard-earned GCSEs.

“The Harris Academy overrode the GCSE core subjects set for my daughter and her colleagues before the takeover, in order to improve their league table results.

“This is no way to educate kids – they need to be taught proper subjects and come away with proper qualifications.”

These quotations were featured in a report by the Civitas think tank in 2009, which tried to look behind the secrets of academies’ success. It asked the question which I think anyone should ask when confronted with statistics showing rapid improvements: how were they being achieved? It offers substantial evidence of what it says are some of the approaches within academies, of pupils being pushed towards “less challenging” [Civitas’s words] subjects and qualifications “to drive up headline results”. So this investigation, asking if there was a specific “academies effect” at play behind their generally improved headline results – ie searching for reasons behind it, rather than settling for the often-cited but too-vague “sponsors’ ethos” claims – was available to Mr Gove, but I know of no detailed reaction to it.

Now, I thought I’d say here what I think has been going on in academies. Perhaps it would be better to start with a question: if they have lower numbers of pupils taking academic subjects such as history, why is this? Well, I guess there are two responses.

The first is to say that, as mentioned above, the original academies set up under Labour tended to serve – though were not exclusively confined to – disadvantaged communities. All other things being equal, it could be argued that one might expect these schools to struggle to recruit pupils to the traditional academic subjects such as history and languages that Mr Gove now focuses on through the English Baccalaureate.

There is likely to be some truth in this. However, the TES figures suggest that academies have lower results on the EBacc measure not only when compared against the national average for all other schools – driven partly, one would assume, by academies’ possibly lower take-up of EBacc subjects – but also when compared to schools with similar intakes.

Further, as suggested above, this is not a defence open to Mr Gove if he is truly to be seen as a champion of academic education for the vast majority of pupils. If he really cared about that, he would be speaking out passionately against some of this practice.

To me, a second response suggests itself. It is this: the practice in academies could be seen as a kind of “Education by Numbers” squared, or “Education by Numbers” amplified. The institutional pressures on them to raise results on the published indicators are so large that the kind of practices documented above would be expected to have occurred, perhaps to a greater degree than elsewhere in the maintained sector.

Of course, there is no evidence that these approaches, including pushing pupils away from the academic subjects now included in the EBacc towards those which have up to now carried high weight in league table calculations, were and are going on in all academies. But I do think there is likely to have been a general tendency in this direction, based on a few facts about academies, which run as follows.

Academies were usually set up, under Labour, in response to perceived problems of low attainment: the results of the schools they replaced were said not to be good enough. I have observed, for example, how raw exam statistics, in relation to some schools, were virtually the only evidence put forward as the rationale for the extensive restructuring and investment that comes with setting up an academy.

In this context, almost the raison d’etre of these new schools was to improve headline GCSE statistics. If they didn’t do so, one could ask why the change to academy status, which was often controversial and which had been backed with often tens of millions of pounds of investment in new buildings for individual schools, had come about. I suspect, also, though I have never seen evidence other than a reference to the odd “performance related bonus” in job adverts, that academy leaders have had performance pay tied to raising published exam numbers.

The published results are not just high-stakes locally for individual schools, of course. They are also important for academy chains, whose reputations – in a system which really does not look very hard for alternative evidence – rest on them.

And politically, at a national level, of course, the success or failure of the academies scheme was seen to be judged almost exclusively on whether one or at most two numbers rose: the central indicator of the proportion of pupils achieving five or more GCSEs or vocational equivalent at A*-C including English and maths, and arguably the old measure without the English and maths stipulation.

In this kind of atmosphere, academies will have been under even more pressure, I believe, to game the system in obvious ways – such as a very sharp focus on C/D borderline pupils and use of alternative qualifications – in order to deliver those results, against the context of them being demanded quickly from, often, very challenging pupil cohorts.*

But the impact on individual pupils in the chase for better results – in terms of denying them precisely the kind of curriculum Mr Gove claims now to want for all schools – can be large.

I should say here – and some will no doubt challenge this – that I don’t want to criticise non-GCSE qualifications. The idea that schools, pupils and parents should be free to choose the courses they think are right for the pupil, with a full understanding of the likely benefits to the individual in the long run, is powerful. My concern is that our current system, including the lack of scrutiny of what has been going on beyond statistics which have placed a surely-too-high weight on some non-academic courses, has pushed schools to take decisions based on the worth of a course to the institution, rather than to the individual.

It was revealing, I think, that in a recent exchange I had on Twitter on this subject with Sam Freedman, Mr Gove’s adviser, Sam predicted that academies’ results would improve quickly on the EBacc indicator now that it has been introduced. But this only seemed to confirm, in my mind, that academies have been exceptionally focused on league table ranking metrics, ie on the results for the institution. Mr Freedman may suggest that this is OK now, since this government has sorted things out so that the metrics are better aligned with pupils’ interests. I think that is a very optimistic reading. It also implies a lack of interest in what has happened under the old system which, if you follow Mr Gove’s logic, has resulted in disadvantaged pupils being wrongly pushed towards courses which were not in their long-term interests.

In investigating school results and the impact of non-GCSE qualifications on league table rankings, I have been in contact for several years now with Roger Titcombe, the former head of a community comprehensive whose school eventually was turned into an academy.

Roger’s argument throughout was that he passionately believed in what he saw as an important strand of the comprehensive ideal. This was the right of all pupils – from whatever background – to pursue a broad liberal education, in which all would have access to a range of academic subjects.

He saw that ideal as coming under threat from the academies movement: because these new schools were so desperate for better results, some would sacrifice it by pushing children towards qualifications mainly because they would help the school’s data.

Not just through academies, but throughout the schools system, a new class divide was at risk of emerging, he thought, with those from better-off families concentrating on academic courses and the rest pushed towards non-GCSEs in their options. But he believed, and the evidence presented above would suggest, that there is an “academies effect”, which makes them particularly susceptible to the type of behaviour described here.

I think, actually, that Roger’s ideals in this respect are very similar to those put forward by Mr Gove. However, in the absence of any other explanation, it seems the Education Secretary’s desire never to be seen to criticise the actions of academies overwhelms his stated commitment to speaking out for the options of those from disadvantaged backgrounds. Ideology, then, in this new government, is king.

* Remember, also, that no school can have complete control over the results a child achieves after sitting at a desk to complete exam papers, so institutions under pressure seem to me to be particularly incentivised to go in for strategic approaches that afford them greater control.

- Warwick Mansell

posted on March 30th, 2011

Friday, March 25th, 2011

Warwick Mansell

I have been interested in two debates in English education for several years now.

One starts along the lines: “Standards are not high enough. We need to hold our schools to account properly so that they improve exam results for all young people, who so desperately need better grades. We also need to use results data to target our efforts to help pupils do better.”

The other says: “Schooling driven by performance indicators is creating a whole host of negative consequences, which go to the heart of pupils’ educational experiences.”

I often feel like these debates take place almost in parallel, with very little acknowledgement of how they are inextricably linked, and little attempt at communication between the two. Frequently, it seems as if people accept the force of either argument, without realising that they are so closely related as to be almost two sides of the same coin.

This has been going through my head again in recent weeks, as I come across yet more evidence supporting the second of those statements. As always, it’s not necessary to look very hard for this stuff; it just keeps coming.

Exhibit one was a report on history teaching in English schools by Ofsted.  

Now, it’s important to get the context right here: the Ofsted report went out under a press release headline: “History a successful subject”, and offered plenty of support for the way the subject is taught.

For example, it said “history teaching was good or better in most primary schools” among those inspected in this programme from 2007 to 2010, and that “history was successful in most of the secondary schools visited because it was well taught, notably in examination classes at GCSE and A level”.

In secondary schools, it said: “the large majority of…history teachers were very well-qualified. In the large majority of the schools visited, the quality of the provision also reflected the strong leadership of the history departments”.

It added: “The subject knowledge of the specialist history teachers in the secondary schools visited was almost always good, often it was outstanding and, occasionally, it was encyclopaedic. Inspectors found so much good and outstanding teaching because the teachers knew their subject well.”

These central findings might have been lost on readers of some media coverage when the report came out.

So we are talking about a generally successful subject, taught by enthusiastic professionals. However, within that generally positive context, there were several instances offering more evidence underlining the dangers of “Education by Numbers” – ie exam-results-oriented practice being adopted which, the report would suggest, is bad for underlying learning.

First, probably the starkest negative finding in the report related to exam-endorsed textbooks. Ofsted said:

“In recent years, more textbooks have been written specifically for the examination course specification, both at GCSE and A level. The textbooks, often written by the chief examiners for the courses, are generally endorsed by examination boards and gain the status of a ‘set text’. The history teachers in the schools visited were well practised in supplementing these with additional materials as necessary. However, it was clear that, at A level, the mushrooming of course-endorsed and linked textbooks was having a negative impact. They stultified teachers’ thinking and restricted students’ progress. The weaker students relied on the textbook as being sufficient preparation for the external examinations and were less willing to read beyond the ‘set textbook’. Their written and oral work revealed how their understanding of the topics they studied was narrowed. It also meant that students were not as well prepared to meet the challenges of higher education where independent learning and extensive reading were required.”

Damning stuff, I thought. To put this in almost-punchy soundbite terms: “Exam-endorsed textbooks – in the way they are sometimes used – are stultifying teachers’ thinking; restricting students’ progress; narrowing their understanding”.

Second, the report offered more evidence of pupils being steered away from history because of “league table” (I put it in quotes here because this is often a shorthand, I think, for wider hyper-accountability/results pressures) concerns. It said: “In some of the schools visited the students were restricted in their subject options at GCSE and some had been steered towards subjects which were seen to be less demanding than history.”

It continued: “Entry level…is intended for students who find GCSE too demanding. However, the declining number of students taking this examination reflects not only a lack of confidence that entry level meets the needs of those for whom it was intended, but also decisions by curriculum leaders to avoid a course that does not contribute significantly towards their school’s attainment profile”. [my italics].

Ah, OK, so this could be summarised: “Pupils steered towards certain subjects because of a school’s need to improve its figures”.

Third, the report offers criticisms of the move towards completing key stage 3 in two years, rather than three. I think this is related to the themes of “Education by Numbers” because I believe some of the calculation of schools in making this move is to increase the time they have to focus on raising GCSE performance. That, of course, can be seen as a rational move to make, from the individual pupil’s point of view, if it helps secure better grades. But hyper-accountability/“league table” calculations are also likely to be a factor. This means, of course, that a child not going on to study history GCSE will stop studying the subject at 13, rather than 14. And Ofsted’s criticism of this is on educational grounds, although admittedly from a subject-specific point of view, that of a specialist history inspector.

Among the quotes on the report on this are: “The national curriculum orders and programmes of study in Key Stage 3 have led to much high-quality teaching and learning in history. However, in one in five of the secondary schools visited, curriculum changes, such as the introduction of a two-year Key Stage 3 that allowed some students to give up history before the age of 14, and thematic approaches to the curriculum, were associated with teaching and learning that was no more than satisfactory.” [I know the use of “no more than” satisfactory will jar, at least to a teacher audience, since my dictionary defines satisfactory as “fulfilling expectations or needs”, but you get the point].

“In 14 of the 58 secondary schools visited…whole-school curriculum changes [including a two-year KS3] were having a negative impact on teaching and learning in history at Key Stage 3.”

It goes on: “In England, history is currently not compulsory for students beyond the age of 14 and those in schools offering a two-year Key Stage 3 course can stop studying history at the age of 13. England is unique in Europe in this respect. In almost all the countries of the European Union, it is compulsory to study history in some form in school until at least the ages of 15 or 16.”

The report also points out that children do not get access to specialist history teachers in primary school, meaning that some will only have specialist teaching in the subject for two years of their school careers.

So that would be: “some pupils are only getting two years’ history teaching in secondary school, and this is not a good thing”.

Fourth, there were problems with teachers’ professional development – because of an over-reliance on exam-specific training – in some schools. “Access to training for history was an increasing concern for all teachers…In 28 of the 64 secondary schools visited in which this aspect was specifically inspected, access to subject training was only satisfactory and in 10 of the schools it was inadequate. In one in every five schools visited, training by the examination board was, and had been for several years, the only type of out-of-school subject-specific professional development for history teachers.” [my italics]

It goes on to say that this training, in schools doing it well, was only one of a number of approaches.

So that would be: “In one in five schools, teachers’ only professional development is geared to teaching towards particular exams”.

Fifth, the inspectors found that, in “a minority” of primary schools, foundation subjects such as history had been “squeezed”. “In year 6 in particular, teachers said to inspectors that the foundation subjects were ‘not a priority’”. Year 6, of course, embraces the run-up to key stage 2 tests.

So I think it’s fair to infer the following statement from this: “Some primary schools neglecting subjects such as history in drive to raise test scores in English, maths and science.” Ofsted points out, of course, that some schools don’t do that and still get good results, but I don’t think the statement above is unfair.

Ok, that’s probably enough for now, from this document.

Exhibit two is discussions at last week’s Advisory Committee on Mathematics Education. Andrew Hall, director general of the AQA exam board (England’s largest GCSE and A-level board) worried about a sixth issue which I think is related to “Education by Numbers”: pupils being entered early for GCSE subjects.

This is an issue on which Mr Hall has spoken of having concerns before. At the conference, he documented a rise in the number of maths GCSEs taken by “15-year-olds and younger” [I take it this means those in years 10 and below] from 32,908 in 2008 to 83,179 in 2010. The latter figure represented some 11 per cent of the total entry for maths last year, he said.

He said: “That’s an almost three-fold increase in three years. I absolutely expect, from what I’m seeing in entry patterns this year, to see a significant increase again in 2011. Not just in maths; English has seen the same pattern.”

“I think there are some serious causes for concern here…For some students, early entry may be a really good thing. Those students who are particularly strong performers, and who will continue to get good maths education, may benefit. But my question is: what about the others?

“Are the pressures of league tables pressurising teachers to enter students early to bank a grade C, in order then to focus on those who did not get there? Is this going to impact on students post-16? I venture to suggest it will, but we need to get the evidence.”

He added: “This is not just about maths. Across a whole range of specifications, we are seeing students entered early. It may be the result of modularisation.” [More GCSEs have become modular in the last few years].

One audience member, who said he worked with “gifted mathematicians”, said some were put off persisting with the subject if they took it early and then had a gap without maths before entering the sixth form.

To be clear, I don’t think I’m qualified to judge whether early entry is always a good, or a bad, thing for a pupil. I took maths exams a year early myself throughout the latter years of secondary school, but that was a judgement made by my school on its merits, for me, since league tables and other institutional results pressures did not exist in the 1980s.

My concern, which clearly is shared by Mr Hall, is that results pressures for the institution, rather than the long-term learning needs of the pupil, may be playing a large part in decisions. This calls to mind an article I wrote last year in which there was evidence of a school entering pupils for GCSE maths and English early in order to attempt to “bank” a C grade, then removing them from that class if they achieved it in order to concentrate on achieving a C in other subjects. This, a former member of staff at the school told me, simply neglected to consider any need to give the pupil a chance to score higher than the C in the summer exam, since the C grade was all that mattered to the school’s headline statistics.

I also remember a head telling me, a few years back, that modular exams were far better because they afforded the school greater control over the eventual result. She could not, she said, take the risk of any surprises in the form of children underperforming on the big day and thus dragging the school’s published numbers down. And this was from someone highly sceptical about results-driven hyper-accountability. Again, this is not an argument for or against modules, but a suggestion that results considerations for the school are influencing decision-making.

So, Mr Hall’s concern could be summarised: “Results pressures on schools may be helping to push pupils towards being entered for exams early. I’m worried it’s not in all of their long-term interests.”

Mr Hall was also pressed at the conference, by an academic from Southampton University, on two points also mentioned in Ofsted’s history teaching report. Specifically, was he concerned about exam board-endorsed textbooks, and what about teachers only getting professional development through courses run by exam boards targeted at improving pupil performance in particular exams?

On textbooks, Mr Hall seemed, I think, to be acknowledging the issue and suggesting there might be ways of addressing it by seeking to employ senior examiners in-house at the awarding body. (Currently, all examiners, I think, are employed on a freelance basis. Mr Hall seemed to be suggesting that part of the issue was that they needed to supplement this income by publishing textbooks).

 He said: “There is this thing called restraint of trade. Can we prevent someone who is an examiner, off their own bat, writing a textbook?”

He added: “I think there are some issues around the ways in which awarding bodies choose to engage the people who work as examiners. We are looking at what’s the right mix for us. Do we want to move some of those people into the organisation so that we can reward them more appropriately so that they do not need to do those things?”

On the training courses, he said: “One of the things I believe very strongly, as an organisation, is that we should not just offer help to teachers for our own specifications. We are using our charitable status to try to offer developmental opportunities across a broader spectrum.”

One final exhibit would be last month’s Wolf review of 14-19 vocational education, which suggested that schools were being pushed towards non-GCSE exams because of the worth of some of these qualifications to the institution for league table (ie league tables and the rest of the results apparatus) purposes. I’ve blogged about that here.

 A summary of one of Wolf’s concerns might be: “Pupils pushed towards GCSE-equivalent qualifications which might not help them in the long term because of the weight these qualifications are given in league table indicators.”

Right, to sum up, then, I would come back to the original two debates, mentioned at the top of the piece. Because these two competing priorities aren’t often enough, I think, expressed directly against each other, it would be interesting to imagine a hypothetical conversation between two people – each supporting one of those viewpoints – and using evidence uncovered here.

The conversation, I think, might go something like this:

First person: “You know, we really need results pressures on schools of the current type because, without them, teachers would just let down pupils with poor teaching.”

Second person: “But look, you can see evidence here that teaching to exam-endorsed textbooks is damaging at least some pupils’ history lessons and leaving them underprepared for higher education.”

1st person: “I know, but we need these results pressures in schools.”

2nd person: “But we have evidence that some pupils are being steered away from taking history, even though they want to take the subject, because of results pressures in schools.”

1: “I know, but we need these results pressures in schools.”

2: “But schools are moving towards a two-year KS3, partly because of the pressure on them to improve their GCSE results, and inspectors of history, [and the Historical Association, as it happens], think this is limiting many pupils’ experience of the subject in a way that does not happen in other countries.”

1: “I know, but we need these results pressures in schools.”

2: “But we know that in one in five schools visited, teachers’ only professional development is offered by exam boards in relation to particular exams.”

1: “I know, but we need these results pressures in schools.”

2: “But we know that in some primary schools, subjects such as history and geography are marginalised in the year leading up to the national tests.”

1: “I know, but we need these results pressures in schools.”

2: “But we know that hundreds of thousands of pupils are being entered early for GCSEs, and there are worries that this might be educationally less than ideal, but informed by results calculations on the part of schools.”

1: “I know, but we need these results pressures in schools.”

Results pressures seem to be the rock on which our education system is now built, with any other consideration seemingly having to negotiate its way around them.

OK, I caricature this debate a bit, though not that much. Ministers of both this and the previous government will say that actions proposed through, for example, the Wolf review or Labour’s 2009 assessment inquiry, which scrapped KS2 science tests, have offered a case-by-case approach to mitigating the problems, and Mr Hall clearly appears to be taking them seriously.

Some will say that the three “exhibits” reported on above are not the only perspectives worth considering, either. (I want to look a bit more closely at what I think are contradictions in Ofsted’s own approaches on some of these issues, but that will have to wait a bit).

As ever, I’m not interested in blaming schools for what might be perceived as the more negative aspects highlighted by Ofsted, Mr Hall and the Wolf review, but just questioning whether this system as a whole – including the signals it sends to teachers about the overwhelming emphasis to be placed upon exam success on the measured indicators – is supporting good learning, or not.

Given the number of issues documented in just a few weeks, I wonder why we still aren’t linking these two debates properly, and looking more fundamentally at the real impact, for good or ill, of statistics-driven schooling.

- Warwick Mansell

No Comments
posted on March 25th, 2011

Wednesday, March 15th, 2011

England’s secondary maths curriculum is likely to become “more challenging” for pupils from 2013, one of the government’s leading civil servants said today.

Jon Coles, who has a key role in the national curriculum review which was launched in January, suggested that while the primary maths curriculum in this country was quite similar to that of “top-performing” countries internationally, this was not the case from the age of 11 onwards.

He set out the thinking behind the review and – perhaps boldly, given that the review is only just over six weeks old – offered a taste of what some conclusions with regard to maths might be during a talk to the annual conference of the Advisory Committee on Mathematics Education at the Royal Society in London this morning.

Mr Coles, director general for education standards at the Department for Education, said: “What I think [the review] will mean, from the early evidence beginning to come through in maths, is that it will probably mean some increasing challenge, especially in the secondary phase.

“There’s a great deal of commonality between the national curriculum in primary schools in this country and in the highest-performing jurisdictions.

“There are some differences, in primary, in timing and sequencing [of when things are taught], and we do have one area where we do a great deal more than other countries, which is data handling, in which we are quite unusual in this country.”

However, in general, he said, there were not huge differences between what was taught before the age of 11 here and good practice elsewhere.

But he added: “At secondary level, we will see a pushing up of challenge and expectation. That would be my guess on the basis of what the review has seen so far.”

On the wider thinking behind the review of the curriculum for 5- to 16-year-olds, for first teaching in 2013, Mr Coles set out the idea behind it of trying to learn from what happened in other countries which do well in international studies such as the OECD’s “PISA” tests and TIMSS, the Trends in International Mathematics and Science Study.

He said: “The review team are looking extremely systematically at what happens in the top-performing jurisdictions in the world.

“Specifically, they are interested in what is put in the curriculum at what age, what is the sequencing, what leads to progression and high performance in these systems, what can we learn from them and what should we transfer into our system?

“The overall aim of the review is to be much less prescriptive…to reassert the balance between the national curriculum and the school curriculum.”

He implied that the idea was to strip down the amount of time taken up by the national curriculum in schools.

He said: “The national curriculum…should be a specification of the core knowledge and principles needed to progress, not a complete specification of everything that schools teach.”

The government would not be specifying how teachers should teach (a move back to the days before Labour entered into the world of prescription over pedagogy with the national literacy and numeracy strategies). And he suggested, I think, that schools would need help adapting to this new world, saying the government would “need to support schools” in doing so.

Summing up, he said: “The task that the review team are undertaking is to come up with a pretty spare, pretty knowledge-focused national curriculum, based on the best international evidence.”

The timescale looks challenging, I think, with curriculum materials due in schools by September next year.  

But Mr Coles added:  “I think what we will see in this review is trying to get draft Programmes of Study out much earlier in the process than has been the case in previous reviews.

“That’s a good and important thing to do.”

Mr Coles also had some interesting things to say about funding. Asked a question about funding for a particular initiative, he said: “Our budgets are under a great deal of pressure.

“It’s true that the DfE has done rather better than many other departments. But we are experiencing a significant change [from spending under the previous regime, I guess] and what we are trying to do is prioritise front-line budgets.

“The most important thing to do is to prioritise schools and colleges and early years budgets. Doing that, when this was already 80 per cent of our budget, means that it will become 90 per cent of our budget.

“So the rest of our budget has halved. Given that within [that part of] the budget are some very big things, like initial teacher training – which I suspect many of you in this room would advise us not to cut – we do not have lots of pots of money.

“That’s a direct result of what’s being done to reduce the budget deficit.

“In the next few years, do not expect us to come up with pots of money for good new ideas. We will have to prioritise, and respond to good ideas, but I suspect not with new money. That’s the situation that the whole of the public sector is going to face.”

Ok, this has largely been a blog without comment from me, because I thought readers might be interested in these words as they stood. I will just add a final comment on Mr Coles’s speech, however, in relation to what he had to say about progress in maths in this country over the past 10 years or so.

Actually, I wondered what Mr Coles would say on this, as he has been at the department a while. Under Labour, he led the 14-19 programme which introduced the now beleaguered diploma qualification. Could his assessment of how things stand on school standards possibly be as bleak as that of his political boss, Michael Gove, who often seems reluctant to offer any sense that things might have improved in any way since 1997, I wondered.

Well, actually Mr Coles, perhaps unsurprisingly but interestingly, offered a more balanced view of matters with regard to maths education.

He told the conference: “Over the last 10 years, just looking back at the figures, we have an awful lot to look at that suggests progress, and that’s good and positive.”

The number of people coming into maths teaching over the last 10 years had doubled, he said, while the number passing* the subject at GCSE had grown very significantly, the numbers taking maths A-level had grown by 50 per cent since a low-point of participation was reached in 2002, and the take-up of further maths A-level had also increased markedly in recent years.

He said international evidence presented challenges. But even here – and I almost choked on hearing the next bit – there were some chinks of light.

He said: “There are some positives. We are the most improved nation in TIMSS, in the international comparisons.”

I nearly choked because, of course, as I have written here, somehow Mr Gove never seems to find the time to mention England’s TIMSS results in major speeches setting out why he thinks our schools need radical reform. Last November’s white paper is also free of the statistics on maths which Mr Coles presented.

Of course, he did go on to set out the agenda which has been put forward by the Government, saying: “Actually, PISA does present some very large challenges.  Our 15- and 16-year-olds are doing significantly less well than they are in some other countries.

“Shanghai’s performance on the last PISA tests put it two whole years on the PISA scale ahead of us, and that gives us a real challenge.”

He also highlighted a recent report for the Nuffield Foundation which documented the low proportions of young people in England, Wales and Northern Ireland persisting with maths after the age of 16.

But, overall, Mr Coles offered a more balanced view of the evidence than was presented either by Mr Gove, in launching the recent education bill (Gove: “I would love to be able to celebrate a greater level of achievement, but I am afraid that this is the dreadful inheritance that our children face”), or in the white paper on which it was based. I wonder whether those TIMSS figures will ever get a look-in in official documents and government speeches in future. I’m not holding my breath.

This conference also had plenty to debate about the influence of results pressures in schools, which as you would expect I will be writing about in the coming days.

*[I note that there was no reference to this equating to pupils achieving a C grade or better at GCSE. A C grade is not, of course, formally, the cut-off for a “pass”. GCSEs were introduced with a passing scale of grades A-G. It is surprising that even officials are now saying a C grade is a pass, when this is not how the grading system works. I heard the conference chair, Professor Dame Julia Higgins, also equate a C grade with a “pass”, and it featured in the Wolf review earlier this month. I understand why it’s happening, but I still find it strange that we have a technical language which defines a pass in one way, while everyone else now seems to define it in another.]

- Warwick Mansell

1 Comment
posted on March 15th, 2011


Wednesday, March 9th, 2011

Ok, I’ve decided to do something slightly different, here, in the form of a blog largely not written by me, but based on two emails I’ve received in recent months on the vexed and often technical issue of data analysis systems and target-setting.

This may be overly technical for some non-teacher readers of this blog, but I thought I’d put it up here as I get occasional inquiries about the Fischer Family Trust system in particular, and am interested in the implications of how these systems work in the classroom.

What follows are the more-or-less verbatim contents of two emails (reproduced here anonymously but with the authors’ consent) I received re data analysis systems, one from a teacher who seems reasonably positive/pragmatic about the whole experience, and the second from one who, as you will see, has concerns.

So here is the first teacher, who is a senior leader.

“I have always liked my schools to use two data sources, past performance and CEM Centre (MiDYIS, YELLIS and ALIS), although my current school uses CATs. 

“Raw data informs me as the teacher, but I adjust the targets that I give to students (no student in my GCSE classes is told that they will achieve less than a C, because all can easily achieve that and most can surpass it).  Data, as I tell staff, only provides questions and never answers.  It informs good teaching, but doesn’t make a good teacher.

“I then also tell staff the most important analysis of exam performance is comparing how students did in your class in comparison to other subjects in school. Did they do better with you or elsewhere? Then if they are below the data targets you need to take the mirror test. Do you feel that you did everything to help that student do better (look yourself in the mirror). If you are happy with what you did, move on, but ask ‘can we make adjustments to next years interventions?’

“I do recognise that these sets of data are not perfect and they can only ever be an indicator.  For FFT the worry is because of the inflation or deflation of scores at KS2 because of brilliant or poor teaching. In CEM and CATs students can do worse than they are capable of because of all the factors that can suppress test performance.  However, overall they do produce part of a useful guide and highlight possible underperformance to all staff.

“I’m happy to discuss any of this.  I am no way a zealot, just want all my students to progress so constantly looking for things to improve what I do. I think teachers’ fear of data comes from poor leadership as to how to use it.”

Here is the second email, reproduced verbatim from the start:

“Dear Mr Mansell,

“Thank you very much for your work on testing, hyper-accountability and the many problems in education today. I found your book Education By Numbers to be very thought-provoking, my copy is full of highlights where I was almost shouting out in agreement with many of the points you made.

“I have been teaching maths in the same high school for thirty years, and I find the current obsession with getting the best results for the school very dispiriting.

“I have tried to talk to my head of department and Head Master about improving learning and understanding, but it is a waste of time. They want to meet the targets, so pressure staff and pupils, force pupils to attend extra classes after school or instead of attending morning tutor meetings, but do not consider real educational improvements; too risky?

“Also, the pupils who will never achieve a C are effectively written off by the RAP (raising achievement process), which only targets D to C or C to B, and some troublesome pupils who the FFT say should achieve seem to move on elsewhere so they do not drag the results down.

“The RAP system is being extended to years 7, 8 and 9, not to improve education, but to achieve the magical FFT 5th percentile. Regular testing, split levels i.e. 3a, 3b, 3c etc, when it is doubtful if any teacher can reliably say ‘Jonnie is working at level 3 in algebra’. Levels may be estimated plus or minus one, sublevels are a nonsense, also the use of numbers as labels for ‘levels’ is misleading as the levels are descriptive, categorical data not measurements on a scale.

“One thing that I want to say to you is that the message that is regularly given about 5 or more grade Cs at GCSE being ‘good’ is a disaster for some bright pupils. I have had a few say ‘as long as I get 5 Cs I am doing well’.

“For able pupils C is poor, to get the message across to year 10 and 11 pupils I have bluntly said that for them ‘C means crap’, what they should be getting are 7 or more A*, A, B grades.

“The obsession with grades and levels for the benefit of the institution, instead of a focus on helping pupils to achieve the best for themselves, is a cancer in the education system”.

- Warwick Mansell

posted on March 9th, 2011


Wednesday, March 2nd, 2011

I had an interesting chat yesterday with the Ofsted press office. A press officer called me after I wrote an article for the Financial Times*, which was published on Saturday, on the effects of results pressures in schools.

This included the following paragraph:

“Ofsted inspections have, in recent years, focused heavily on statistical indicators of school quality that are largely based on exam performance.”

Ofsted’s argument was that inspections aren’t now as dependent on test/exam data as is commonly perceived. Particularly since the introduction of the latest version of the Ofsted framework, in September 2009, more emphasis is being placed on lesson observation, it was stressed to me. It is also not the case, as is sometimes thought, that schools are being pre-judged, before inspection visits, on the basis of their results.

I have been promised more information on this from Ofsted, and will update this blog on this subject when I receive it. In the meantime, anyone with thoughts or experiences on the current inspection process is welcome, as ever, to leave a comment or get in touch with me at

By the way, I wrote the FT article for a supplement which included league tables of school results. Another article in the supplement included some comments from a university admissions tutor on themes familiar to readers of Education by Numbers and this website.

The piece, by Liz Lightfoot, included a quotation from Richard Austen-Baker, a law admissions tutor at the University of Lancaster. He said: “The exam boards compete for customers – teachers and students – and what they want are the best possible grades, especially with the pressure from school league tables.”

Dr Austen-Baker added, of exam preparation in schools: “I have been told by teachers that they discourage students from wider reading because there is a danger it might introduce them to material which is not in the syllabus and if they use that in their exams instead of material from the exam specification, they will lose marks.”

Anecdotal stuff, of course, from one individual, but I thought I’d mention it as this website is supposed to be documenting views alongside research evidence of the effects of the current system.

 *I think you’ll need to register with the FT site to read this piece, and the one by Liz Lightfoot.

- Warwick Mansell

No Comments
posted on March 2nd, 2011