Nate Soares told BI that superintelligence could wipe us out if humanity rushes to build it. The AI safety expert said efforts to control AI are failing, and society must halt the "mad race." His new ...
Leaders warned in a petition that AI could pose an existential threat to humanity. AI pioneers and thousands of others signed the statement. The public is equally concerned about superintelligence. The surprise release ...
(via Kyle Hill) A new book by longtime AI researchers Eliezer Yudkowsky and Nate Soares argues that the pursuit of superintelligence must stop. Now. It’s a conclusion they didn’t want to reach, but the ...