“Humans + AGI = Harmony?” Episode #32 TRAILER, For Humanity: An AI Risk Podcast.

FOR EDUCATIONAL AND KNOWLEDGE SHARING PURPOSES ONLY. NOT-FOR-PROFIT. SEE COPYRIGHT DISCLAIMER.

Trailer

Interview

In episode 32 of the For Humanity Podcast, host John Sherman interviews BiocommAI CEO Peter Jensen. Peter is working on a number of AI-risk-related projects. He believes it is possible for Homo sapiens (humans) and machine intelligence (AGI) to coexist in mutualistic symbiosis.

This podcast is not journalism. But it’s not opinion either. It is a long-form public service announcement.

This show simply strings together the existing facts and underscores an outcome that is unthinkable yet probable: the end of all life on Earth.

For Humanity: An AI Safety Podcast is the accessible AI safety podcast for all humans; no tech background required.

Our show focuses solely on the threat of human extinction from AI.

Peabody Award-winning former journalist John Sherman explores the shocking worst-case scenario of artificial intelligence: human extinction.

The makers of AI openly admit it: their work could kill all humans, possibly within as little as two years. This podcast is solely about the threat of human extinction from AGI.

We’ll meet the heroes and villains, explore the issues and ideas, and learn what you can do to help save humanity.

Once upon a time… a bedtime AI horror story:

HOW can AI Kill-us-All? So Simple, Even a Child can Understand (1:25)

Still confused? From 40 years ago… 1984. Try this.

In the 21st century, a weapon will be invented like no other. This weapon will be powerful, versatile, and indestructible. It can’t be reasoned with. It can’t be bargained with. It will feel no pity. No remorse. No pain. No fear…


DISCLOSURE: BiocommAI Inc. is a for-profit enterprise developing SafeAI technical engineering commons solutions for the AI industry.