Computerworld May 30, 2023
Lucas Mearian

A second open letter warning of the threats posed by artificial intelligence was signed by many of the technology's most prominent creators, who called controlling it 'a global priority alongside other societal-scale risks such as pandemics and nuclear war.'

Hundreds of tech industry leaders, academics, and other public figures signed an open letter warning that artificial intelligence (AI) evolution could lead to an extinction event and saying that controlling the technology should be a top global priority.

“Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war,” read the statement published by the San Francisco-based Center for AI Safety.

The brief statement in the letter reads almost like a mea...
