AI regulation – UK Government thinks again?

It’s hard to keep up with the twists and turns of AI regulation at the best of times, but the suggestion this week that the UK Government may be looking again at its opposition to statutory AI regulation poses as many questions as it answers.

As recently as 22 March, in a Lords debate on Lord Holmes' private member's bill setting out a draft AI regulatory framework that was broadly assumed to be going nowhere, the Government's response couldn't have been clearer. A legislative approach was not on the table in the short term; indeed, "...legislating too soon could easily result in measures that are ineffective against the risks, are disproportionate or quickly become out of date."

Now we hear that officials at the Department for Science, Innovation and Technology (DSIT) may in fact be considering a statutory solution for some AI risks, including obligations to share algorithms and to carry out testing of high-risk models in certain circumstances. This follows hot on the heels of international agreements on AI safety reached by the UK with both the USA and South Korea, as well as warnings published by the Competition and Markets Authority that the concentration of the most capable AI foundation models in the hands of a small number of companies could have an adverse effect on competition in relevant markets.

There is, as yet, no clear understanding of exactly what the Government may have in mind, and according to the reports specific legislative proposals are not imminent. But with the current electoral cycle running out of road, and sector regulators already tasked with setting out their own responses to the (non-statutory) approach outlined in the recent AI White Paper update by the end of April, the Government doesn't have a lot of time to play with if it intends to put regulation in place during this Parliament.

Where that leaves businesses developing and deploying AI models in the UK (some of which have welcomed the prospect of regulation, if only to provide a degree of certainty) is unclear, but we'll be monitoring developments closely to keep track of any change in approach and its implications.

Disclaimer

This information is for general information purposes only and does not constitute legal advice. It is recommended that specific professional advice is sought before acting on any of the information given. Please contact us for specific advice on your circumstances. © Shoosmiths LLP 2024.
