Don't trust exam results, says marking expert

A-levels and GCSEs are only a 'rough guide' to students' ability

Richard Garner, Education Editor
Thursday 22 April 2010 00:00 BST

The day of reckoning is getting closer for nearly a million teenagers due to sit A-level and GCSE exams this summer. As their parents wait anxiously, though, a leading expert on assessment offers some comfort for those pupils facing disappointment on the results days in August.

Exam grades should be treated as only a rough guide to pupils' ability, according to Professor Roger Murphy from the School of Education at Nottingham University. In a paper for a conference on exam standards next week, he argues that A-level and GCSE results can be "dangerous" if we invest too much faith in them and that they are "susceptible to misinterpretation".

"Most examinations require the person taking them to undertake a sample of tasks and do not attempt to provide a comprehensive examination of the area under assessment.

"Give the candidate a different sample of tasks and almost certainly they will produce a different performance. Give them the tasks on a different day and their performance may vary again. Give their responses to a range of different examiners to mark and again you will probably get different judgements about the quality of their performance," says Professor Murphy.

Next week's conference – organised by Cambridge Assessment, the parent group of the Oxford, Cambridge and Royal Society of Arts (OCR) exam board – comes at a time of heightened frenzy over the importance of exam results.

Pressure to succeed has reached an all-time high with record numbers of pupils applying to go to university as a result of the UK's current economic woes.

In addition, many teenagers are stranded abroad because of the fallout from the Icelandic volcano, just as some modern languages oral exams and practical tests are due to start within the next week.

The Joint Council for Qualifications, which represents all the major exam boards, has already moved to reassure them that they can put in a claim for special consideration if they believe their preparations have been disrupted in the run-up to examinations.

Professor Murphy, the author of several books on assessment, may be cautious about putting too much reliance on individual grades, but he doubts whether there is any better way of obtaining more accurate information. He simply warns against reading too much into the results.

With the ever-increasing pass rate at A-level usually greeted with claims that the exams have been "dumbed down", Professor Murphy says: "There is much in life that we might like to compare 'scientifically' and can't.

"It is generally not possible to come up with the precise answer to questions such as: 'Was Fred Perry a better tennis player relatively speaking in the 1930s than Andy Murray is now?'. In the same way, it doesn't make a lot of sense to look at trends in national public examination statistics over periods of time to try to find an easy answer to questions about whether exam standards are going up or going down."

Because the curriculum changes year on year, "there is little chance that minor fluctuations in national statistics will reveal great insights for those avidly wanting to know if standards are going up or down," he says.

Professor Murphy is one of five key speakers at the conference to be held in London next Thursday. His contribution coincides with a call from Simon Lebus, group chief executive of Cambridge Assessment, for the Government to take a more "hands-off" approach to the content of exams.

"Since the creation of the national curriculum in 1988, the British state has taken upon itself an ever-increasing role in mediating between subject communities, higher education, professional societies, employers, teachers and examination designers in defining the content of syllabuses and their examinations," he says. As a result, the "users" of the qualifications no longer talk to the "producers" – the exam boards, according to Mr Lebus.

If they have complaints, they can only raise them through official government regulatory bodies. "The only way to maintain standards is for the Government to stand aside and let higher education, employers and subject specialists talk directly to the exam boards once again," he concludes.

John Bangs, the assistant secretary of the National Union of Teachers and another contributor to the same seminar, raises the spectre of rival exam boards making their questions easier to answer in order to boost their take-up by schools.

"Do exam boards themselves lower standards in order to make their products more attractive?" he asks. "I'm sure they do not but it is a question that needs answering and nailing. It also begs the question about whether we need a single examination board for England."

Exams: A brief history lesson

The first exam to be introduced by any of the existing examination boards dates back to 1858 (see box below).

At that time examiners, wearing full academic dress, turned up at the highly selective schools taking the exam to ensure it was administered properly.

In 1919 the School Certificate was introduced to select those who would go on to university. Its content varied from region to region.

It was only after the 1944 Education Act, which ushered in universal state education up to the age of 14, that the first real national system of examinations came in.

A-levels were introduced in 1951 but O-levels did not take off until the early 1960s – with CSEs, their equivalent for the less academic, becoming widespread by the early 1970s. Many children still left school at 14 and went straight into employment without qualifications.

The rise of comprehensive schools, teaching all abilities, led to a clamour for one examination for all pupils at the age of 16, and in 1988 O-levels and CSEs were replaced by the GCSE.

Since then, A-levels have been reformed with the introduction of the AS-level (worth half an A-level and taken at the end of the first year of the sixth form) in 2001. The latest qualification to come on stream is the Government's flagship diploma, which will be completed by pupils for the first time this summer.

Although exam results have always been important for individual students, the media frenzy that accompanies both A-level and GCSE results really only took off in the early 1990s – probably because schools were being measured on them for the first time, with results being published in school league tables.

The pressure on schools to do well translated into pressure on pupils to do well, too.

Past paper: Questions from the first exam, set in 1858

1. Divide 2,322,500 by 1858

2. Describe accurately the situation of the following places: Genoa, Londonderry, Mecca, Rio de Janeiro, Singapore

3. Name in order the wives and children of Henry VIII.

4. In what three ways was our Lord tempted in the wilderness?

Answers: 1. 1,250; 2. Italy, Northern Ireland, Saudi Arabia, Brazil, South-east Asia; 3. Catherine of Aragon, Anne Boleyn, Jane Seymour, Anne of Cleves, Kathryn Howard, Katherine Parr. Children: Mary, Elizabeth, Edward; 4. Hunger, power and wealth, acclaim.
