
38 UK cases involving hallucinations (AI or otherwise), judicial caution in the Family Court around AI-assisted witness evidence, and the future of legal training.

In this legal article, I reflect first on a series of thoughtful discussions I have had over the past few weeks with law students who have been engaging closely with AI and the law.

Those conversations have given me pause, particularly when considered alongside a statement, made in a different context, that has been widely shared on social media and has prompted debate within the legal profession:

“…If I was to talk to a class of undergrads right now, I would be telling them to get really unbelievably proficient with these tools. I think to the extent that even those of us building it, we’re so busy building, it’s hard to have also time to really explore the almost the capability overhang even today’s models and products have, let alone tomorrow’s. And I think that can be maybe better than a traditional internship would have been, in terms of leapfrogging yourself to be useful in a profession…”

Demis Hassabis (World Economic Forum Annual Meeting 2026)

I also consider the thirty-eighth UK hallucination case that I have identified, this time from the Employment Tribunal. It is not the thirty-eighth chronologically, but the thirty-eighth I have found. The case is Chandra v Royal Mail Group, and it adds another important example to the developing picture of how these issues are arising in practice.

The article then turns to a recent Family Court decision, M v F (Fact Finding Hearing) [2026] EWFC 22. In that case, a witness had been assisted by AI in creating a witness statement and was questioned in detail about how the tool had been used. Despite that scrutiny, the judge remained unclear about the precise process involved. The judgment reflects genuine concern and a notably careful approach to the weight that could properly be placed on that evidence.

Finally, I examine a rare and sobering decision from the United States in which the court concluded that misconduct warranted terminal sanctions following submissions containing false legal citations. The case, Flycatcher Corp v Affable Avenue LLC, et al, demonstrates how seriously courts are prepared to respond when professional obligations are breached in this context.

My full legal article is available here.

"My current view is that it is important for pupils, trainees and all of us as professionals to become proficient with these tools. That is because we need to understand what court users are relying on and in some cases what judges themselves may be encountering, if we are to represent our clients properly and avoid being blindsided by well deployed AI. That proficiency matters not only in assisting clients and courts, but also in identifying misuse, error and overreach when it arises."