Greg Stewart · 20 December 2022
The European Union is considering a clampdown on artificial intelligence (AI) systems deemed a high risk to people’s rights or safety.
The Artificial Intelligence Act, which is set to become binding law in 2023, carries proposed fines of up to €30m (£25.75m) or 6% of global revenue for breaches. The draft act is the world’s first attempt to create a region-wide, horizontal legal framework for ethical AI applications. It divides AI systems into four categories: unacceptable, high-risk, limited risk and minimal risk. High-risk systems would be subject to a conformity assessment, and their providers would need to establish a risk management system.
Among the act’s key requirements, companies will have to test their high-risk AI systems for bias at every stage of development, and providers will have to supply documentation on each system’s performance and compliance.
It is unclear how the EU’s Artificial Intelligence Act will specifically affect the use of AI in eDisclosure in the post-Brexit UK. However, the UK may choose to adopt similar regulations to ensure that AI systems used in eDisclosure are fair and unbiased. That could require electronic disclosure providers in the UK to invest in AI systems designed and proven to be free of bias, and to implement processes to regularly test and monitor those systems for bias, along the lines of the sketch below. It may also require eDisclosure providers to supply documentation on the performance and compliance of their AI systems.
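To make the idea of regularly testing and monitoring an AI system for bias more concrete, the short Python sketch below shows one way a provider might compare a review model’s recall across two hypothetical document populations. The function names, the sample data and the 0.05 tolerance are illustrative assumptions only; neither the act nor any particular eDisclosure platform prescribes this specific test.

```python
# Illustrative sketch only: a possible fairness check for a document-review
# model, comparing recall across two hypothetical custodian groups.
# All names, data and the 0.05 tolerance are assumptions for demonstration.

def recall(predictions, labels):
    """Share of truly relevant documents the model marked as relevant."""
    flagged = [p for p, l in zip(predictions, labels) if l == 1]
    return sum(flagged) / len(flagged) if flagged else 0.0

def recall_gap(group_a, group_b):
    """Absolute difference in recall between two document groups."""
    return abs(recall(*group_a) - recall(*group_b))

if __name__ == "__main__":
    # Hypothetical model outputs (1 = predicted relevant) and ground-truth
    # labels from a sampled review set, split by custodian group.
    group_a = ([1, 1, 0, 1, 0, 1], [1, 1, 1, 1, 0, 1])
    group_b = ([1, 0, 0, 0, 1, 0], [1, 1, 0, 1, 1, 0])

    gap = recall_gap(group_a, group_b)
    print(f"Recall gap between groups: {gap:.2f}")
    if gap > 0.05:  # assumed internal tolerance, not a statutory figure
        print("Gap exceeds tolerance - flag the model for review and re-training.")
```

Run periodically over sampled review sets, a check like this would also generate the kind of performance record that documentation requirements under the act appear likely to demand.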
At Legastat, we routinely use the array of AI technology within the Axcelerate suite to drive efficiency and value in our customers’ eDisclosure and Managed Legal Review projects. We continue to watch these developments with interest. In the meantime, please get in touch today if you want to know more about Legastat’s eDisclosure and other Litigation Support Services.