OpenAI joins opposition to California AI safety bill

OpenAI has hit out at a California bill aiming to ensure powerful artificial intelligence is deployed safely and suggested that new controls would threaten its growth in the state, joining a last-minute lobbying frenzy by investors and AI groups to block the legislation.

The bill, SB 1047, threatens “California’s unique status as the global leader in AI,” the company’s chief strategy officer Jason Kwon wrote in a letter to Scott Wiener, the California state senator spearheading the bill.

It could “slow the pace of innovation, and lead California’s world-class engineers and entrepreneurs to leave the state in search of greater opportunity elsewhere”, he added.

SB 1047 has divided Silicon Valley. While there is widespread acceptance of the need to curb the risks of super-powerful new AI models, critics have argued that Wiener’s proposals would stifle start-ups, benefit America’s rivals and undermine California’s position at the epicentre of a boom in AI.

OpenAI is the latest start-up to oppose elements of the bill, and the most prominent — thanks largely to the popularity of its ChatGPT chatbot and a $13bn commitment from its partner Microsoft.

OpenAI supports provisions to ensure AI systems are developed and deployed safely but argues in the letter, which was first reported by Bloomberg, that legislation should come from the federal government, not individual states.

In a response on Wednesday, Wiener said he agreed that the federal government should take the lead but was “sceptical” Congress would act. He also criticised the “tired argument” that tech start-ups would relocate if the bill was passed and said companies based outside the state would still need to comply with the bill to do business locally.

The California State Assembly is set to vote on the bill by the end of the month. If it passes, Governor Gavin Newsom will then decide whether to sign it into law or veto it.

Silicon Valley tech groups and investors, including Anthropic, Andreessen Horowitz and Y Combinator, have ramped up a lobbying campaign against Wiener’s proposals for a strict safety framework. Nancy Pelosi, the former House Speaker and California representative, also published a statement in opposition to the bill last week, dubbing it “well-intentioned but ill informed”.

Among the most contentious elements in the senator’s original proposals were demands that AI companies guarantee to a new state body that they will not develop models with “a hazardous capability”, and create a “kill switch” to turn off their powerful models.

Opponents claimed the bill focused on hypothetical risks and imposed an “extreme” liability risk on founders.

The bill was amended to soften some of those requirements last week — for example, limiting the civil liabilities that it had originally placed on AI developers and narrowing the scope of those who would need to adhere to the rules.

However, critics argue that the bill still burdens start-ups with onerous and sometimes unrealistic requirements. On Monday, US House members Anna Eshoo and Zoe Lofgren wrote in a letter to Robert Rivas, Speaker of the California State Assembly, that there were “still substantial problems with the underlying construct of the bill”, calling instead for “focus on federal rules to control physical tools needed to create these physical threats”.

Despite criticism from leading AI academics such as Stanford’s Fei-Fei Li and Andrew Ng, who led AI projects at Alphabet’s Google and China’s Baidu, the bill has found support among some of the “AI godfathers”, such as Geoffrey Hinton of the University of Toronto and Yoshua Bengio, a computer science professor at the University of Montreal.

“Bottom line: SB 1047 is a highly reasonable bill that asks large AI labs to do what they’ve already committed to doing, namely, test their large models for catastrophic safety risk,” Wiener wrote on Wednesday.
