
Ayinde Principles and AI Hallucinations at the UKIPO

This is the first UK decision I’ve seen that cites Ayinde and applies its principles to another incident involving false case citations or AI hallucinations.

This case stands out from many of the previously tracked examples because it predominantly involves the more subtle, and in my view potentially more dangerous, hallucination types (likely Hallucination Types 6 or 7).


The decision identifies:

Grounds with a list of cases: “cases were all real, but three of the purported “quotes” were not found in the decisions cited…”

A skeleton which “included two cases with complex (but incorrect) references, and four cases with the correct reference. The cases were all followed by a summary of a few words to a few lines setting out what the case decided. In three of the cases, the summary misrepresented the case substantially.”

And another skeleton which “… included three cases which existed and were correctly cited. But it was unclear to me the cases cited stand for the propositions claimed…”

My full case analysis can be read here.
 
