The UK court’s position on Generative AI (GenAI)
Reality vs hallucination
General enquiries: 0800 064 0204
eDisclosure: 020 7242 9601
Reprographics: 020 7405 9178
57 Carey Street, London, WC2A 2JB
Email: info@legastat.co.uk
The Cambridge Dictionary’s word of the year for 2023 is “hallucinate”. You may think that hallucination is something only humans do, but in this context it refers to mistakes made by computers, specifically large language models (LLMs), which can generate completely false answers, often supported by fictitious citations.
The UK government’s 2023 White Paper introduced five core principles for the deployment of safe and ethical AI. In March 2025, a revised Artificial Intelligence (Regulation) Bill was reintroduced in the House of Lords. However, as it currently stands, the interpretation and application of the AI principles are left to individual sectors and their regulators.
Closer to home, the Ministry of Justice and the UK Jurisdiction Taskforce are working on a statement to clarify how the laws of England and Wales regard AI-related harms and liability.
On the judicial side, there is currently no definitive UK case law regarding the use of generative AI in dispute resolution.
eDiscovery providers have increasingly used GenAI to interrogate large data sets, create reports on given issues, or summarise key documents. GenAI is also useful in investigations, helping clients discover issues they were not initially aware of, assess the legitimacy of claims, and evaluate whether to proceed with disclosure or settlement.
In the absence of specific court guidance, we utilise GenAI on a case-by-case basis. Until legislation and case law establish clear frameworks for its application, it is the duty of eDiscovery professionals to clearly communicate the benefits and limitations of these tools, enabling clients to make informed decisions.
Despite the regulatory challenges, providers can implement measures to address situations where LLMs are prone to hallucinating and generating false answers.
In summary, as legislation and case law regarding GenAI continue to develop, eDiscovery professionals play a key role in creating workflows and quality assurance practices that will naturally influence industry standards and court acceptance.
If you would like insights on how GenAI could be used in your case, please do not hesitate to contact us.