Annoyance comments on Intelligence enhancement as existential risk mitigation - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (198)
That's not quite what I meant, but I do believe Eliezer has advised anyone not working on 'risks' (or FAI?) to make as much money as they can and contribute to an organization that is.
What I meant was that given a money pump, the straightforward thing to do is to pump it, not fix it in the hope that it will somehow benefit humanity.
It seems to me that most on LW believe people are irrational in ways that should make them money pumps, yet the reaction to this is to make extreme efforts to persuade them of things.
Improving someone's intelligence or rationality is difficult if they aren't already seeking it, but channeling away some of their funds or political capital will lessen the impact their irrationality can have.