Discussion about this post

MatthewK:

>So, no progress in medicine or AI until the remotest threat of misaligned superintelligence or engineered superviruses is gone. Actually, no, we need to go further. All technology should be positively dismantled — think of the small but nonzero risk of runaway climate change causing extinction.

I’ve heard that the reason climate change might cause extinction is by creating global instability that leads people to become more reckless and violent. People being poor and sick is not good for stability, and ditching technology would be far worse for stability than some additional warming.

Parmest Roy:

While studying longtermism, I never fully came to terms with the idea of placing equal value on a life right now and a life in 10,000 CE. One reason is that while we can predict the outcomes of our actions in the present with reasonable confidence, chaos makes it impossible to predict their effects some 8,000 years from now. This suggests that the more distant in the future a life is, the less likely we are to affect it positively, so our efforts are much better spent trying to improve the lives of people in the present. I couldn't find much written on this issue, but the idea of two competing infinities in this post does a marvelous job of getting to the core of the problem.
