Computerworld May 30, 2023
A second open letter warning of the threats posed by artificial intelligence has been signed by many of the technology's most prominent creators, who called mitigating its risks ‘a global priority alongside other societal-scale risks such as pandemics and nuclear war.’
Hundreds of tech industry leaders, academics, and other public figures signed an open letter warning that the evolution of artificial intelligence (AI) could lead to an extinction event, and saying that controlling the technology should be a top global priority.
“Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war,” read the statement published by the San Francisco-based Center for AI Safety.
The brief statement in the letter reads almost like a mea...