gwern comments on Journal of Consciousness Studies issue on the Singularity - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I am aware of that line of reasoning and reject it. Each person has about a 1 in 12,000 chance of having an unruptured brain aneurysm that could be detected by a virtually risk-free magnetic resonance angiography and then treated. Given the utility you likely assign to your own life, it would be rational to undergo such a screening. At least it would make much more sense than signing up for cryonics. Yet you don't do it, do you?
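As a minimal sketch of the expected-value arithmetic behind this claim (in Python; every parameter other than the 1-in-12,000 prevalence is a hypothetical placeholder, and the conclusion flips depending on the values chosen):

    # Back-of-the-envelope expected value of an elective MRA screening.
    # Only p_aneurysm comes from the comment above; everything else is
    # a hypothetical placeholder chosen for illustration.
    p_aneurysm = 1 / 12000       # chance of a detectable unruptured aneurysm
    p_save = 0.5                 # assumed chance detection + treatment averts death
    value_of_life = 10_000_000   # assumed dollar value placed on one's own life
    cost = 400                   # assumed out-of-pocket cost of the scan

    expected_benefit = p_aneurysm * p_save * value_of_life
    print(f"expected benefit: ${expected_benefit:,.2f}")  # ~$416.67
    print(f"worth screening?  {expected_benefit > cost}")  # True under these numbers

Whether screening "wins" here depends entirely on the assumed inputs, which is exactly the calculational fragility the next paragraph complains about.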
There are literally thousands of activities that are rational given their associated utilities. But that line of reasoning, although technically correct, is completely useless, because (1) you can't really calculate shit, (2) the calculation is impossible for any agent that isn't computationally unbounded, and (3) you'll just end up sprinkling enough mathematics and logic over your fantasies to give them a veneer of respectability.
Expected utility maximization in combination with consequentialism is the ultimate recipe for extreme and absurd decisions and actions. People on Less Wrong are fooling themselves by using formalized methods to evaluate informal evidence, merely pushing the use of intuition down to a lower level.
The right thing to do is to use the absurdity heuristic and discount crazy ideas that are merely possible but can't be evaluated due to a lack of data.
My understanding from long-past reading about elective whole-body MRIs was that they are basically the perfect example of iatrogenics and of how knowing about something can harm you (the danger of testing itself). What makes your example different?
(Note that there is no such possible danger from cryonics: you're already 'dead'.)