All of gabo96's Comments + Replies

Even if the alien civilization isn't benevolent, they would probably have more than enough selfish reasons to prevent a superintelligence from appearing on another planet.

So the question is whether they would be technologically advanced enough to arrive here in 5, 10, or 20 years, or whatever time we have left until AGI.

An advanced civilization that isn't itself a superintelligence would probably have faced its own AI extinction scenario and survived it, so they would stand a much better chance of aligning AI than we do. But prev... (read more)

I'd like to add another question: 

Why aren't we more concerned about s-risk than x-risk? 

Given that virtually everyone would prefer death to suffering indefinitely, I don't understand why more people aren't asking this question.

3 Eli Tyre
There are actually pretty large differences of perspective on this claim.