
The Algorithm blame-game: was it a flawed consultation?

Yes, there WAS a consultation;

Admittedly, only for two weeks – from April 15th to 29th.

And a big majority of the 12,623 respondents agreed with Ofqual’s plan to ‘standardise’ teachers’ predictions, almost to the last detail.

So what went wrong?

First, be clear about the role of a consultation in situations like this. It is a complex, technical issue, and something of a mysterious craft for lay people. A Regulator, caught by an unprecedented and unanticipated crisis, devising the best formula it can to replace the examination results for vast numbers of A level and GCSE pupils. It assembles its experts, issues a Rubik-cube style solution … and launches a consultation.

In such circumstances, one is not seeking ideas. In the main, this is not a consultation along the lines of “Have you any suggestions?” It is a failsafe. An opportunity to expose the plan to a wider range of stakeholders in the hope that the consultor learns more about any dangers, difficulties or adverse impacts before proceeding.

Targeting is important. Omit an important group of consultees, and you might miss vital intelligence. In this consultation, over 1,000 schools/colleges responded; over 3,000 teachers in their personal capacity, and over 3,000 parents or carers. There is no real problem with the sample, though one might have liked to hear more ‘seldom-heard’ voices.

The consultation paper itself[1] is inevitably dense and contains 43 questions, mostly attracting 10,500 or so responses. Intriguingly, fewer gave a response to the questions on the Equality Impact Analysis – between 2,000 and 3,000 – though fatigue often sets in as people work through lengthy consultation documents. This one was 67 pages and not exactly an easy read.

The four key questions were:

  • The fundamental decision to request schools, not just to provide their grade assessments but also to rank pupils (Q1). 11,024 responded; 82% agreed or strongly agreed with Ofqual.
  • Statistical standardisation which emphasised historical evidence of schools’ performance (Q14). 10,797 responded but only 54% expressed agreement. 33% disagreed.
  • To disregard any bias re protected categories or socio-economic backgrounds (Q16). 10,469 responded with 64% agreement with Ofqual’s preference.
  • To consider appeals from schools but not from individual pupils (Q23). 10,697 responded but only 47% agreed with this approach. 42% disagreed.

There is a wealth of data in the 179 pages of the published analysis[2]. In the meantime, Ofqual took decisions based upon its consultation and in May published a 27-page rationale for proceeding with the vast majority of its proposals.[3] By most standards this consultation has comprehensive documentation, with at least some evidence that, on less pivotal items, officials took consultee views seriously, amended some details, and modified its plans here and there. The Consultation Decisions document follows best practice insofar as it summarises some of the objections and tries to provide an explanation for those decisions.

Here are two examples of the kind of policy choices that led to such an outcry:

  • Re standardisation: “We have decided to adopt the proposed aims but we have decided to reorder them such that aim iii, regarding the method’s transparency and simplicity, appears at the end of the list so as to not overstate its importance.” (at page 8)
  • Re correcting for potential bias: “We have decided to adopt our proposal that the individual rank orders provided by centres should not be modified to account for potential bias regarding different students according to their particular protected characteristics or their socio-economic backgrounds.” (at page 11)

The consultation was not flawless. The phraseology of the questions (typically ‘To what extent do you agree or disagree that …’, followed by Ofqual’s preferred answer) is not particularly good, though we have seen worse. And sometimes they come after a ‘hard-sell’ narrative that rather too obviously promotes the preferred answer. Maybe this reinforced the groupthink that led to a widespread consensus that the proposed methodology was generally acceptable. Alternatively, it is quite possible that this consensus was indeed the balance of stakeholder opinion.

However, over the years, experienced analysts of public consultations have learnt not to be over-influenced by the numbers. In The Politics of Consultation[4], Elizabeth Gammell and I explain how the most astute political operators use consultations essentially as an early-warning mechanism to identify issues and impacts that might not otherwise be spotted. They are naturally glad to see support for their proposals, but just occasionally someone makes a comment that gives them pause for thought. The Institute’s long-standing Fellow in Scotland, David Jones, calls this the ‘wee nugget’. He urges all consultors to scour consultation responses for something that might make an administratively-elegant solution a serious political problem. It is often a case of joining the dots; some officials or politicians have second sight and can see the picture before others.

Such a case occurred two years ago, when the Mayor of London and the Metropolitan Police lost a high-profile judicial review into a decision to close Wimbledon Police Station. A consultee had become well-known for having survived a serious assault, and his life was apparently saved by officers from Wimbledon. He made a significant submission to the consultation and because Sophie Linden, the Deputy Mayor had failed to give it ‘conscientious consideration’ the decision was declared unlawful.[5]  So the question is whether there were arguments of similar force expressed in the Ofqual consultation.

To judge for yourselves, and from a large number of similar contributions, here is a selection of responses as published (complete with typing errors etc) in the Analysis Report:

  • Re Rankings; “I think that asking schools and colleges to rank students is impossible when there are 200 students or more to be ranked within subjects and I don’t believe that anybody has got the necessary overview to be able to distinguish between them particularly at a level when the head of department may have no personal knowledge of individuals.” (Parent at page 11)
  • Re use of historical data “Using centre previous performance is not valid for SEND settings as we have varying cohorts each year. With such small numbers taking GCSE exams (around 10 per year), we can have vastly varied results year to year dependent on the individual needs of the students.” (Teacher – responding in a personal capacity at page 46)
  • “It is a flawed system. Using historical patterns is disingenuous, dangerous and fundamentally unfair; this approach assumes far too many things. As a consequence the winners will be those already advantaged but the current system; and the losers will be those already disadvantaged by the current system.” (CEO of a Trust at page 48)
  • Re Improving schools. “Relying too heavily on historical data will disadvantage schools on “turnaround” journeys. These are, almost by their nature, schools serving deprived communities and their students could be further disadvantaged by Ofqual’s approach to the use of historical data to standardise grades.” (Teacher representative group or union at Page 47)
  • “For schools which have struggled in the past and are predicting a steep improvement in results, it would be devastating to use historical results.” (School or college at page 47)
  • Re correction of bias re protected characteristics: “There is a strong body of literature that indicates that those from BAME backgrounds, religious minorities, and lower socio-economic backgrounds would systematically be ranked lower than other students. We urge Ofqual to consider implementing the provision of mitigating circumstances declaration forms that students and parents can submit alongside training to mitigate for unconscious/conscious biases undertaken by all teachers involved in the grade predictions process at each centre.” (Other representative or interest group at page 56)
  • Re Appeals only from Schools: “…a blanket “no” approach to appeal by students is unfair, students may have good reason and supporting evidence to support their “the grades are wrong” claim – students should have the opportunity to have their evidence submitted and their appeal heard.” (Parent at page 74)
  • “It appears Ofqual are proposing, yet again, only examination centre heads can commence an appeal, rather individual students. This is wrong in any year, but even more wrong in a year when some arbitrary algorithm making standardised adjustments, and a student ranking system is involved. It is absolutely vital that students should be able to lodge their own appeals against grades awarded, and should not be left to an examination centre head being willing to do this.” (Parent or carer at page 77)

Of some solace to the embattled Secretary of State might be this rather unusual response from a parent:

  • “Your consultation proposals do not meet the terms of the ministerial direction. This requires you to: “It is important that students should have access to a right of appeal if they believe the process was not followed correctly in their case.” The key part of this direction is STUDENTS having access to a right of appeal. You propose that only exam centres can appeal! Do what you were ORDERED to do by the minister!” (at page 75)

Conclusions

Heads are rolling as this piece is written. And more may follow.

The Institute has no particular view of the merits of this or other policy matters. But it is concerned with the value, use and integrity of the consultative process. It found it strange that as the grade results controversy raged, no-one had exclaimed “Hang on, there was a consultation and most people approved Ofqual’s methodology”.  It prompted the thought that maybe there was some irredeemable flaw in it.

Having looked at it, the answer is that, by most standards, this was an acceptable exercise. It is far from perfect. But it did its job, which was to give decision-makers a broader range of views on their proposals. No-one who ploughed through the Analysis document could have been under any misapprehension about the issues. It may be that critics of Ofqual’s proposals were in a minority, but their words – illustrated in this paper – should have rung alarm bells for anyone with political nous.

It is often the case that officials, close to the subject, pressed for time and with little perceived mind-space to consider anything remotely deviant from their bosses’ wishes, will not welcome the ‘ifs and buts’ that appear in consultation exercises. But this is where political masters must take responsibility. One may speculate whether the Secretary of State’s political advisers really read the Analysis Report and weighed up the arguments, or whether they relied upon Ofqual’s contemporaneous 27-page Decisions document and assumed that all the necessary ‘conscientious consideration’ had been done. It is yet another example to support the Institute’s policy that Output reports (This is what you’ve said) should be published in advance of the Outcome report (This is what we have decided).

Time was short in this case, but had the Output report (i.e. the Analysis) appeared first, there is at least the chance that it would have been studied by more people in the field, and, just possibly, the red flags that appear in many parts of the document might have been spotted and a significant political embarrassment ameliorated.

One last observation is to wonder what might have happened if the Government in England had not been forced into the U-turn and adopted the same stance as the administrations in Cardiff, Belfast and Edinburgh. Had they soldiered on, defending the Ofqual standardisation, there might well have been a legal challenge. Who might have won?

Based upon the Kohler case, there is a strong possibility that the Secretary of State would have lost, on the grounds that sufficient conscientious consideration had not been given to the consultation.

Future consultations remotely near an algorithm, we suspect, will be accompanied by greater care, possibly an Institute quality assurance, and a larger dose of political attention.

 

Rhion H Jones LL.B

Founder Director

The Consultation Institute, 26 August 2020

 

[1] Exceptional arrangements for exam grading and assessment in 2020:

Consultation on specified general qualifications – GCSEs, AS, A levels, Extended Project Qualifications and the Advanced Extension Award

[2]https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/886555/Analysis_of_consultation_responses_21MAY2020.pdf

[3]https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/887048/Summer_2020_Awarding_GCSEs_A_levels_-_Consultation_decisions_22MAY2020.pdf

[4] The Politics of Consultation (2018) by Elizabeth Gammell and Rhion Jones – available through The Consultation Institute https://www.consultationinstitute.org/publications/

[5]  R (ex parte Kohler) v The Mayor’s Office for Policing and Crime [2018] EWHC 1881.

 

About the Author

Rhion Jones is considered a leading authority on Public Engagement and Consultation. A founding Director of the Consultation Institute, he is co-author of “The Art of Consultation” (2009) and “The Politics of Consultation” (2018). He has delivered over 500 training courses and Masterclasses and is a prolific writer on the subject, having written over 350 different Topic papers and over 50 full Briefing Papers for the Institute. Since 2003 over 15,000 person-days of training based on courses he invented have been delivered. Rhion is in demand as an entertaining Keynote Speaker and Special Adviser, particularly on the Law of Consultation, and its implications for Government and other Public Bodies. In 2017, he was awarded the ‘Lifetime Achievement Award’.

