FOR EDUCATIONAL AND KNOWLEDGE SHARING PURPOSES ONLY. NOT-FOR-PROFIT. SEE COPYRIGHT DISCLAIMER.

Nate Soares explains why building superintelligence means human extinction.

Chapters:
0:00 – The Next AI Advance Could End Badly
0:49 – Why Smart AI Could Kill Us
2:16 – The Black Box Problem
3:36 – When AI Learns the Wrong Goal
5:22 – Lab vs Deployment
6:43 – No Retries
7:43 – Why Secure AI Is Impossible
10:21 – How Superintelligence Could Persuade Us
12:30 – The Alchemy Stage
14:37

Stop the Race Original Videos:

• Why Building Superintelligence Means Human…

• Will AI Kill Us All? Nate Soares on His Co…