Letters
Existential Risk? I Don't Get It!: Prominent computer scientists fear that AI could trigger human extinction. It's time to have a real conversation about the realistic risks.
Last week, the Center for AI Safety (safe.ai) asserted that “Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.”