Education: Can a flawed system offer parents a claim to added value?

How Government league table plans will damage below-average schools.

Carol Fitzgibbon
Thursday 26 November 1998 00:02 GMT

So the progress of pupils in secondary schools in England will not be ranked from A to E in the league tables next week. Only schools making above-average progress will be identified after protests from head teachers about the inaccuracy and unfairness of the Government's controversial proposal.

The Government's decision to listen, at least in part, to the very serious objections of education professionals is welcome, but we are still left with a flawed system which provides bad information to the consumers it purportedly wishes to serve. Schools should not be judged by this year's progress measure from the Department for Education and Employment.

Value-added measures are generally seen as a way to make school league tables somewhat fairer by taking some account of each school's intake. They tell you whether each pupil has made more or less progress than similar pupils in other schools.

This is fascinating information that can prompt useful discussion about why certain pupils progressed exceptionally well (positive value-added scores), which kept pace with the average (zero value-added scores) and which fell behind (negative value-added scores). When all the value-added scores for a subject are averaged, the result shows the general progress of the class.

By comparing the results of pupils who had similar prior achievements, we compare like with like. For example, by setting the results of pupils in the tests taken at 14, at the end of key stage three (KS3), against their GCSE results this year, we get fairer indicators of a school's contribution than the raw league table results provide. In other words, each pupil's KS3 average should be considered when evaluating that pupil's GCSE score in any subject.
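For readers who want the arithmetic, here is a minimal sketch of a pupil-level calculation, using invented figures rather than any real school's results or the DfEE's actual method: each pupil's GCSE points are compared with the points predicted from their KS3 average by a simple regression line fitted over pupils taking the same subject, and the difference is that pupil's value-added score.

```python
# A sketch of a pupil-level value-added calculation with invented data.
from statistics import mean

# (KS3 average, GCSE points) for pupils taking one subject -- hypothetical
pupils = [(4.2, 5.0), (5.1, 6.0), (3.8, 4.0), (6.0, 7.5), (4.9, 5.5)]

xs = [ks3 for ks3, _ in pupils]
ys = [gcse for _, gcse in pupils]
x_bar, y_bar = mean(xs), mean(ys)

# Ordinary least-squares line predicting GCSE points from KS3 average.
slope = sum((x - x_bar) * (y - y_bar) for x, y in pupils) / \
        sum((x - x_bar) ** 2 for x in xs)
intercept = y_bar - slope * x_bar

# Value-added: actual GCSE result minus the result predicted from KS3.
value_added = [y - (intercept + slope * x) for x, y in pupils]

for (ks3, gcse), va in zip(pupils, value_added):
    print(f"KS3 {ks3:.1f} -> GCSE {gcse:.1f}, value-added {va:+.2f}")

# Averaging the residuals gives the subject-level value-added score
# discussed in the article.
print(f"average value-added: {mean(value_added):+.2f}")
```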

Unfortunately, the Department for Education and Employment has computed a measure that provides only an approximate indicator of value-added (the "progress" measure recently sent to schools). For GCSE results, the DfEE measures are based not on the progress of each pupil but on comparing the average score for the whole school at KS3 with the average score for the whole school two years later at GCSE. These average scores may be based on different groups of pupils, since some pupils will have left and others will have joined during the two years. Schools with a proper pupil-level value-added system will set more store by that than by the DfEE's unfortunate approximations. Single numbers representing the complexity of a whole school are of little use except for generating league tables that help sell newspapers: there is so much variation in progress within each school that a single number is a poor representation.
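The effect of that churn on a whole-school comparison is easy to demonstrate with invented figures. In this sketch three pupils sat the KS3 tests; one then left and a different pupil joined before GCSE, so the two averages describe different groups of children:

```python
# A sketch of why the school-level approximation can mislead; the
# names and scores are invented for illustration.
from statistics import mean

ks3_scores  = {"Ann": 6.0, "Ben": 4.0, "Cal": 5.0}   # cohort at KS3
gcse_scores = {"Ann": 7.0, "Ben": 5.0, "Dee": 3.0}   # Cal left, Dee joined

# A DfEE-style measure compares whole-school averages, ignoring churn.
school_level = mean(gcse_scores.values()) - mean(ks3_scores.values())

# A pupil-level measure tracks only pupils present at both points.
matched = set(ks3_scores) & set(gcse_scores)
pupil_level = mean(gcse_scores[p] - ks3_scores[p] for p in matched)

print(f"school-level 'progress': {school_level:+.2f}")  # 0.00: no progress?
print(f"matched-pupil progress:  {pupil_level:+.2f}")   # +1.00 per pupil
```

The whole-school comparison reports no progress at all, while every pupil who actually sat both sets of examinations improved by a full point.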

Our advice to the Qualifications and Curriculum Authority was that value-added scores be published by curriculum area, if at all. Furthermore, if value-added scores are used simplistically to rank schools and put them into categories, about half of all schools will - surprise, surprise - be below average.

This is inevitable, since that is how averages, such as value-added indicators, are calculated. Furthermore, from their own experience with proper pupil-by-pupil value-added systems, thousands of schools already know full well that value-added varies from year to year.
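How much year-to-year variation should be expected follows from elementary statistics: the standard error of a cohort's average falls with the square root of the cohort size. A rough illustration, assuming (purely for the sake of the example) that individual value-added scores spread by about one grade point:

```python
# A sketch of the "sampling variation" point: how much a school's average
# value-added score would wobble from year to year purely by chance.
import math

pupil_sd = 1.0  # assumed spread of individual scores (illustrative figure)

for cohort in (30, 60, 120, 240):
    # Standard error of the cohort mean falls with sqrt(cohort size).
    se = pupil_sd / math.sqrt(cohort)
    print(f"cohort of {cohort:3d}: chance wobble of roughly "
          f"+/-{2 * se:.2f} (about two standard errors)")
```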

This variation must be expected - indeed it is predictable; it would happen even if nothing in the school changed apart from the new group of pupils being taken in each year. The amount of this natural "sampling variation" can even be calculated from the size of the cohort, as the sketch above illustrates. Only in the context of these only-to-be-expected variations should departments and schools be evaluated.

To some extent these problems can be solved by a proper pupil-by-pupil value-added system, but there are other problems - more subtle but perhaps more serious - and they relate to social exclusion and the costs to society of driving schools to drive out pupils. School exclusions have gone up about 600 per cent since the performance tables were first published.

Since no one knows exactly what works in raising achievement, the surest way to improve a school's position is to get rid of pupils who are not going to make good progress. This consequence of focusing on examinations, examinations, examinations instead of education, education, education is extremely serious and must be the subject of consultation. Schools need to be allowed to work with difficult pupils rather than exclude them, but in order to do so, those schools need to be assured that they will not suffer in the tables.

Maybe schools could be allowed not to count 5 per cent of pupils in the performance tables. This 5 per cent of leeway might be a way to avoid "sink schools" and to reduce social exclusion. Alternatively, schools might discount a larger percentage but would need to justify each case to inspectors. There urgently needs to be consultation with schools on what kind of arrangements would work most fairly and reduce exclusions.

The main use of a pupil-by-pupil value-added system is to enable a school to undertake self-evaluation and to monitor the performance of every department. Schools using value-added systems look at the fluctuation from department to department and from year to year. This profile of a school is far more informative for staff and governors than a single index.

Speaking at a recent National Union of Teachers/DEMOS conference, Charles Clarke MP said that schools with good self-evaluation systems could be subject to lighter inspections. This would be a very welcome development indeed, and would bring school inspections more in line with those applied to colleges by the Further Education Funding Council.

Another use of detailed pupil-level data lies in monitoring how the examination system is working on a national scale. For example, for the last two years it appears that Single Science may have been too severely graded. This finding illustrates that a proper value-added system must calculate pupil progress only in comparison with pupils taking the same syllabus. Only within the framework of the same syllabus, the same examining and standard-setting process, can fair comparisons be made.

The writer is Professor of Education and Director of the Curriculum, Evaluation and Management Centre at the University of Durham
