Legastat Blog

The EU’s Artificial Intelligence Act: What might it mean for eDisclosure providers?

Written by Greg Stewart | Dec 20, 2022 9:33:00 AM

The European Union is considering a clampdown on artificial intelligence (AI) systems deemed to pose a high risk to people’s rights or safety.

The Artificial Intelligence Act, which is expected to become binding law in 2023, carries proposed fines of up to €30m (£25.75m) or 6% of global revenue for breaches. The draft act is the world’s first attempt to create a region-wide, horizontal legal framework for ethical AI applications.

The key provisions of the proposed Artificial Intelligence Act are:

  1. AI systems will be divided into four risk categories: unacceptable, high-risk, limited risk and minimal risk.
  2. Systems that pose a high risk to people’s fundamental rights or safety will attract the strictest obligations.
  3. High-risk systems will be subject to a conformity assessment, with providers required to establish a risk management system.
  4. High-risk systems must be tested for bias at every stage of their development.
  5. Providers will be required to supply documentation on the AI system’s performance and compliance.

It is not yet clear how the EU’s Artificial Intelligence Act will affect the use of AI in eDisclosure in the UK’s post-Brexit world. The UK may, however, choose to adopt similar rules to ensure that AI systems used in eDisclosure are fair and unbiased. That could require UK eDisclosure providers to invest in AI systems that are designed, and demonstrated, to be free of bias, to put in place processes to regularly test and monitor those systems for bias, and to produce documentation on their performance and compliance.
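To make that idea of routine bias testing a little more concrete, the sketch below shows one way a provider might compare a review model’s recall across document subpopulations and keep the figures as part of its documentation trail. It is purely illustrative: the groups, numbers and threshold are invented for the example and are not drawn from the Act or from any particular eDisclosure platform.

```python
# Purely illustrative sketch: spot-checking a hypothetical review model for
# uneven performance across document subpopulations. All names, figures and
# thresholds below are invented for the example.

from collections import defaultdict

# Hypothetical validation sample: (group, human_label, model_prediction),
# where 1 = relevant, 0 = not relevant. "Group" could be a document
# language, custodian or source system.
validation_sample = [
    ("english", 1, 1), ("english", 1, 1), ("english", 0, 0), ("english", 1, 0),
    ("german", 1, 0), ("german", 1, 1), ("german", 0, 0), ("german", 1, 0),
]

def recall_by_group(rows):
    """Recall (share of truly relevant documents the model found) per group."""
    found = defaultdict(int)
    relevant = defaultdict(int)
    for group, label, prediction in rows:
        if label == 1:
            relevant[group] += 1
            if prediction == 1:
                found[group] += 1
    return {g: found[g] / relevant[g] for g in relevant if relevant[g]}

recalls = recall_by_group(validation_sample)
for group, r in sorted(recalls.items()):
    print(f"{group}: recall = {r:.2f}")

# A simple, hypothetical monitoring rule: flag the run for review if recall
# differs between groups by more than 20 percentage points, and keep the
# figures as part of the project's documentation trail.
if max(recalls.values()) - min(recalls.values()) > 0.20:
    print("WARNING: recall gap between groups exceeds 0.20 - review the model.")
```

In practice a provider would work from much larger validation samples and whatever metrics its own risk management process specifies; the point is simply that the checks are repeatable and leave a written record.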

At Legastat, we routinely use the array of AI technology within the Axcelerate suite to drive efficiency and value in our customers’ eDisclosure and Managed Legal Review projects. We continue to watch these developments with interest. In the meantime, please get in touch today if you want to know more about Legastat’s eDisclosure and other Litigation Support Services.