UK’s AI bill to focus on ChatGPT-style models


UK tech secretary Peter Kyle has reassured major technology companies that a long-awaited artificial intelligence bill will be narrowly focused on the most advanced models and will not become a sprawling “Christmas tree bill” to regulate the nascent industry.

Kyle told leading tech companies that the AI bill expected later this year would focus exclusively on two things: making existing voluntary agreements between companies and the government legally binding, and turning the UK’s new AI Safety Institute into an arm’s length government body, according to people briefed on the discussions.

“It will not become a Christmas tree bill,” Kyle told executives from Google, Microsoft and Apple, in response to concerns raised that other regulation would be added to the bill during the legislative process.

Kyle and chancellor Rachel Reeves met executives from a number of leading tech companies and investors — also including Facebook owner Meta — on Wednesday morning to discuss how the new government can support the tech and AI sectors to boost UK growth. 

Sir Keir Starmer was expected to announce an AI bill in the King’s Speech earlier this month, but stopped short of including it among the 40 named pieces of legislation.

Instead, the King said the Labour government would “seek to establish the appropriate legislation to place requirements on those working to develop the most powerful artificial intelligence models”.

The bill, expected to be ready for its first reading by the end of the year, will exclusively focus on ChatGPT-style foundation models — large AI models made by a handful of companies that can parse and generate text and multimedia — according to senior government sources.

Sir Keir Starmer’s AI bill marks a departure from the strategy of former prime minister Rishi Sunak, who was reluctant to push too soon for legal interventions in the development and rollout of AI models, for fear that tough regulation might stymie industry growth.

“We are third only to the US and China in the size of our fast-growing technology sector, and we lead the world when it comes to AI safety,” Sunak told MPs.

Late last year, Sunak’s government launched its AI Safety Institute, which evaluates AI models for risks and vulnerabilities.

During the UK’s AI Safety Summit in November, leading companies including OpenAI, Google DeepMind, Anthropic, Amazon, Mistral, Microsoft and Meta signed a “landmark” but not legally binding agreement with governments including the UK, US and Singapore.

Under the deal, signatory governments would be able to test the companies’ latest and forthcoming models for risks and vulnerabilities before they were released to businesses and consumers.

These companies made further voluntary commitments in Seoul earlier this year, including committing “not to develop or deploy a model at all” if severe risks could not be mitigated.

Senior UK government officials believe there is an urgent need to make these voluntary agreements legally binding to ensure that companies already signed up to the agreements cannot renege on their obligations if it becomes commercially expedient to do so.

A consultation on the contents of the bill is expected to be launched in the next few weeks and should last about two months, according to senior officials.

Making the AISI an arm’s length body would strengthen its role as an independent body and reassure companies that it does not have the government “breathing down its neck”, according to one senior official.

Starmer’s government is keen for the AISI to take a leading role setting global standards for AI development that could be used by governments around the world.

Further regulation to protect against potential harms associated with AI, including the use of intellectual property to train models without permission or compensation, will be considered separately from this bill, they added.
