Explore a thought-provoking analysis of the potential risks associated with uncontrollable superintelligence (USI) in this 23-minute video. Delve into the "Doomer" argument, examining how USI could pose an existential threat to humanity. Investigate concepts such as split-half consistency, the challenges of international cooperation, bioweapons, terminal race conditions, and the window of conflict. Consider the role of human morality, the possibility of machine wars, and cyberpunk scenarios. Gain a deeper understanding of the complex issues surrounding artificial intelligence and its potential impact on our future.
Steelmanning the Doomer Argument: How Uncontrollable Super Intelligence Could Kill Everyone