G0W51

Comments

G0W51

Panexistential risk is a good, intuitive name.

G0W51

True. Also, the Great Filter is more akin to an existential catastrophe than to an existential risk, that is, the risk of such a catastrophe.

G0W51

Is there a term for a generalization of existential risk that includes the extinction of alien intelligences or the drastic reduction of their potential? Existential risk, that is, the extinction of Earth-originating intelligent life or the drastic reduction of its potential, does not sound nearly as harmful if there are alien civilizations that become sufficiently advanced in place of Earth-originating life. However, an existential risk sounds far more harmful if it compromises all intelligent life in the universe, or if there is no other intelligent life in the universe to begin with. Perhaps this would make physics experiments more concerning than other existential risks: even if their chance of causing the extinction of Earth-originating life is much smaller than that of other existential risks, their chance of eliminating all life in the universe may be higher.

G0W51

That sounds about right.

G0W51

It's later, but unless I am mistaken, the intelligence explosion isn't expected to arrive much later than when most people will retire, so I don't think that fully explains it.

G0W51

People could vote for government officials who have FAI research on their agenda, but currently, I think few if any politicians even know what FAI is. Why is that?

G0W51

Why do people spend much, much more time worrying about their retirement plans than about the intelligence explosion if the two are a similar distance in the future? I understand that people spend less time worrying about the intelligence explosion than would be socially optimal, because the vast majority of its benefits lie in the very far future, which people care little about. However, it seems probable that the intelligence explosion will still have a substantial effect on many people in the near-ish future (within the next 100 years). Yet hardly anyone worries about it. Why?

G0W51

I would like to improve my instrumental rationality, and to improve my epistemic rationality as a means to that end. Currently, my main goal is to obtain useful knowledge (mainly in college) in order to obtain resources (mainly money). I'm not entirely sure what I want to do after that, but whatever it is, resources will probably be useful for it.

G0W51

Improving my rationality. Are you looking for something more specific?

G0W51

How much should you use LW, and how? Should you consistently read the articles on Main? What about discussion? What about the comments? Or should a more case-by-case system be used?
