The End Of The World with Josh Clark

We humans could have a bright future ahead of us that lasts billions of years. But we have to survive the next 200 years first. Join Josh Clark of Stuff You Should Know for a 10-episode deep dive that explores the future of humanity and finds dangers we have never encountered before lurking just ahead. And if we humans are alone in the universe, then if we don't survive, intelligent life dies out with us too.

https://www.iheart.com/podcast/105-the-end-of-the-world-with-30006093/

An average episode of this podcast is 42 minutes long. So far, 12 episode(s) have been released.

Total length of all episodes: 7 hours 49 minutes

Trailer

We humans could have a bright future ahead of us that lasts billions of years. But we have to survive the next 200 years first.

 October 17, 2018  2m

Trailer 2: Bill, Elon and Stephen

Why are smart people warning us about artificial intelligence?

 October 24, 2018  1m

episode 1: Fermi Paradox

Ever wondered where all the aliens are? It’s actually very weird that, as big and old as the universe is, we seem to be the only intelligent life. In this episode, Josh examines the Fermi paradox, and what it says about humanity’s place in the universe.

 November 7, 2018  36m

episode 2: Great Filter

The Great Filter hypothesis says we’re alone in the universe because the process of evolution contains some filter that prevents life from spreading into the universe. Have we passed it or is it in our future? Humanity’s survival may depend on the answer.

 November 7, 2018  43m

episode 3: X Risks

Humanity could have a future billions of years long – or we might not make it past the next century. If we have a trip through the Great Filter ahead of us, then we appear to be entering it now. It looks like existential risks will be our filter.

 November 7, 2018  39m

episode 4: Natural Risks

Humans have faced existential risks since our species was born. Because we are Earthbound, what happens to Earth happens to us. Josh points out that there’s a lot that can happen to Earth, like gamma-ray bursts, supernovae, and a runaway greenhouse effect.

 November 14, 2018  37m

episode 5: Artificial Intelligence

An artificial intelligence capable of improving itself runs the risk of growing intelligent beyond any human capacity and outside of our control. Josh explains why a superintelligent AI that we haven’t planned for would be extremely bad for humankind.

 November 16, 2018  41m

episode 6: Biotechnology

Natural viruses and bacteria can be deadly enough; the 1918 Spanish Flu killed 50 million people in four months. But risky new research, carried out in an unknown number of labs around the world, is creating even more dangerous human-made pathogens.

 November 21, 2018  57m

episode 7: Physics Experiments

Surprisingly, the field of particle physics poses a handful of existential threats, not just for us humans, but for everything alive on Earth – and in some cases, the entire universe. Poking around on the frontier of scientific understanding has its risks.

 November 23, 2018  1h13m

episode 8: Embracing Catastrophe

We humans are our own worst enemies when it comes to dealing with existential risks. We are loaded with cognitive biases, can’t coordinate on a global scale, and see future generations as freeloaders. Seriously, are we going to survive?

 November 28, 2018  46m