Excessive regulation could ‘kill’ AI industry, JD Vance tells government leaders at Paris summit

CNN Business
02-11
London CNN —

The United States believes that overzealous rulemaking could “kill” the artificial intelligence industry, US Vice President JD Vance said Tuesday, taking Donald Trump’s fight against curbs on AI development to a global stage.

“We believe that excessive regulation of the AI sector could kill a transformative industry just as it’s taking off,” Vance told heads of state and CEOs gathered in Paris for the Artificial Intelligence Action Summit. “And I’d like to see that deregulatory flavor making its way into a lot of the conversations (at) this conference.”

Vance’s address followed the repeal by Trump last month of a sweeping executive order signed by former President Joe Biden that set out actions to manage AI’s national security risks and prevent discrimination by AI systems, among other goals.

“I’m not here this morning to talk about AI safety,” Vance said. “I’m here to talk about AI opportunity.”

While Vance stressed AI’s potential to improve people’s lives, the powerful technology is replete with risks that many experts believe need to be addressed through regulation. For example, AI can generate images, audio and videos that make it look like a person did or said something they didn’t. That, in turn, may be used to sway elections or create fake pornographic images.

Even greater dangers posed by the technology range from chatbots such as ChatGPT providing easy access to detailed information on how to commit crimes, to AI breaking free of human control.

AI can be used to make an autonomous weapon system “that can cause harm to the world,” Manoj Chaudhary, chief technology officer at Jitterbit, a US software firm, told CNN in November.

And in March, a report commissioned by the US State Department warned of “catastrophic” national security risks presented by the rapidly evolving technology, calling for “emergency” regulatory safeguards alongside other measures.

‘Lightning in a bottle’

Vance said the new US administration’s pro-innovation approach to AI didn’t mean that “all concerns about safety go out the window.”

“But focus matters, and we must focus now on the opportunity to catch lightning in a bottle, unleash our most brilliant innovators and use AI to improve the well-being of our nations and their peoples with great confidence. I can say it is an opportunity that the Trump administration will not squander.”

One way the new US government will do that is by making sure US schools teach students “how to manage, how to supervise and how to interact” with AI-enabled tools as they increasingly become a part of everyday life, Vance added.


“AI, I really believe, will … make people more productive. It is not going to replace human beings,” he said.

Vance warned against regulation that “strangles” AI, calling for more “optimism” about the technology on Europe’s part in particular.

The European Union is home to a landmark new law on AI, which imposes blanket bans on some “unacceptable” uses of the technology while enacting stiff guardrails for other applications deemed “high-risk.” For instance, the EU AI Act outlaws any biometric-based tools used to guess a person’s race, political leanings or sexual orientation.

Vance also mentioned other touchstones of Trump’s administration, saying that, “to safeguard America’s advantage,” it will ensure that the most powerful AI systems are built in the US with American-designed and manufactured chips.

The new government will also ensure that AI systems developed in America are free from “ideological bias and never restrict our citizens’ right to free speech,” he said.

Annoa Abekah-Mensah contributed to this report.
