
“The most likely result of building a superhumanly smart AI, under anything remotely like the current circumstances, is that literally everyone on Earth will die.” – Eliezer Yudkowsky, Time Magazine, March 29, 2023

No, the above quote isn’t pulled from a George Orwell dystopian novel, nor spoken by the supercomputer AM in Harlan Ellison’s “I Have No Mouth, and I Must Scream.”
...









