An Adjudicator’s Toolkit

AI as a tool - "jolly useful or jolly dangerous"?

Sir Geoffrey Vos on the questions we should be asking about AI and justice, plus a note on managing backlogs

Ian R. Mackenzie
Oct 17, 2025

As I mentioned in my last newsletter, I am preparing for some presentations in November about Artificial Intelligence (AI) and administrative justice. In this edition, I will set the scene for those presentations through a discussion about a recent speech from the United Kingdom.

The view from my office

Sir Geoffrey Vos, the Master of the Rolls in the United Kingdom, gave a speech this week about AI. His view is that AI is just a tool like any other: “Like a helicopter or a chainsaw, in the right hands can be jolly useful, in the wrong hands jolly dangerous”.

In his speech, Vos accepts the use of AI for legal research (with appropriate precautions and review of the final result). What he raised as an ethical issue is the use of AI in judicial decision-making, noting that the answer to that question is “truly difficult and potentially troubling”.

After acknowledging that there may be some judicial decisions that people might want to be made by machines (personal injury damages, for example), he provides three reasons why we should be cautious about allowing that to happen:

First, judicial decisions are the last resort for everyone in our society. If the decision is wrong, at least after an appeal, nothing can be done about it in most cases – Parliament is unlikely to change the law to reverse a run-of-the-mill AI-generated judicial decision made by a machine as to personal injury damages.

Secondly, machines, even those sporting the much-vaunted artificial general intelligence when it comes, will arguably never be able completely satisfactorily to mimic a human’s emotion, idiosyncrasy, empathy and insight.

Thirdly, with an AI judicial decision, you will be getting something generated from the state of intelligence at a given point in time, without the application of developing human thought. That may be fine for a while, but where will it leave us in generations to come? … it might be very difficult for human thought processes to influence the law of the future in the way that many people might think remained appropriate.

Vos called for a “serious debate” before it is too late to consider: (a) what human rights people should have in the light of ever more capable AI, and (b) what the consensus is on what people want human judges rather than machines to decide in the future.

On the first question, Vos asked whether a machine-made decision “can ever be properly regarded as having been made by an ‘independent and impartial tribunal established by law’ under article 6 of the European Convention on Human Rights and Fundamental Freedoms”.

On the second question, he asked “what do we, as humans, want human judges rather than machines, to decide in the future? What do we, as a society, want machines to decide about our lives in preference to human judges, and ought we to have a choice”:

Do we want judges to feed the facts of our cases into an AI tool, to see what an AI tool, or even a range of AI tools, think the answer should be? Or would we rather stick with the grumpy old judge – or even – the vibrant young judge – whose experiences may differ one from another, and whose idiosyncrasies we cannot predict, and only the Court of Appeal can correct.

He proposed no answers - but I think those are questions that need to be discussed. Fundamentally, it comes down to people’s attitudes toward discretion. AI is not human and is therefore not capable of exercising discretion. Many people (often politicians) do not appreciate the important role of discretion in the administration of justice. Yet the exercise of discretion can be an act of compassion and empathy - part of the essence of what it means to be human.


Approaches to managing backlogs

Two recent articles illustrate some of the ways that tribunals or courts can address a large influx of cases: approaches that involve more than just additional resources.
