AnnaSalamon comments on Goals for which Less Wrong does (and doesn't) help - Less Wrong

57 Post author: AnnaSalamon 18 November 2010 10:37PM




Comment author: AnnaSalamon 19 November 2010 12:51:44PM 9 points

There are some distinctions to be made here. Cryonics obviously provides a better chance to see the future after dying than rotting six feet under.

Yes, but it is less obvious that the chance is large enough to be worth the money.
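One way to make "worth the money" concrete is a simple expected-value comparison. This is only an illustrative sketch; every number below is a made-up assumption for demonstration, not an estimate from this thread:

```python
# Illustrative expected-value sketch for the cryonics decision.
# All figures are hypothetical assumptions, chosen only to show the arithmetic.
p_revival = 0.05          # assumed probability that cryonics eventually works
value_of_revival = 1e7    # assumed dollar-equivalent value of seeing the future
cost = 80_000             # assumed lifetime cost of a cryonics arrangement

expected_benefit = p_revival * value_of_revival
net_expected_value = expected_benefit - cost

print(expected_benefit)    # 500000.0
print(net_expected_value)  # 420000.0
```

The disagreement in the thread is precisely over inputs like `p_revival`, which is why "obviously a better chance" does not settle whether the expected value exceeds the cost.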

Regarding retirement investment, just ask your parents or grandparents.

My point was that knowledge of which types of investments can be relied upon in coming decades is not accessible that way. For example, many in the US entrusted their savings to real estate, which had been trustworthy for generations. And then it wasn't.

Yet this argument against the necessity of empirical data breaks down at some point. Shaping the Singularity is not on par with having a positive impact on the distant poor. If you claim that predictions and falsifiability are unrelated concepts, that's fine. But believing some predictions - e.g. a technological Singularity spawned by AGI seeds capable of superhuman recursive self-improvement - is not the same as believing others - e.g. that a retirement plan will provide for old age.

Yes, there is a difference of degree between the difficulty of figuring out whether mortgage-backed securities were as trustworthy as people thought (note that this was less obvious beforehand) and the difficulty of thinking non-nonsensically about the impacts of AI on our future. Nonetheless, they both seem sufficiently difficult that their practice is helped by the explicit study of rationality (which is why, e.g., heuristics and biases, and probability theory, are explicitly studied by many in finance).

Comment author: XiXiDu 19 November 2010 04:40:23PM 2 points

What I said was meant to show that there are obvious reasons for which you might want to care about your retirement plan. What's very different is to predict that you shouldn't care about your retirement plan because either we'll be killed by superhuman AGI or join utopia as immortals. Less Wrong seems to be focused on the predictive nature of probability theory and on taking ideas seriously. I don't think that this is a good approach for any but the most intelligent and educated individuals and organisations. The traditional approach of relying on empirical data and the judgement of experts is to be favored for most people, in my opinion.

Comment author: wedrifid 19 November 2010 06:17:04PM 4 points

Less Wrong seems to be focused on the predictive nature of probability theory and on taking ideas seriously. I don't think that this is a good approach for any but the most intelligent and educated individuals and organisations. The traditional approach of relying on empirical data and the judgement of experts is to be favored for most people, in my opinion.

It is interesting to note that the latter is actually a form of the former, and a particularly strong one at that! In fact, the reasoning you are using here relies on the predictive nature of probability theory.

(That said, I agree that for most people biting the bullet when it comes to their abstract cognitions would be disastrous. Explosions, martyrs, and deaths by stoning would abound. That, and people would actually act on terrible dating advice rather than their instincts.)