JoshuaZ comments on Query the LessWrong Hivemind - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
P(Simulation) < 0.01; there is little evidence in favor of it, and it requires both that some other intelligence is doing the simulating and that fault-tolerant hardware exists that can (flawlessly) compute an entire universe. I don't think posthuman descendants are capable of running a universe as a simulation. I do think Bostrom's simulation argument is sound — but since it is a trilemma, denying that such simulations can ever be run is consistent with accepting the argument while still assigning P(Simulation) a low value.
1 - P(Solipsism) > 0.999; my mind doesn't contain minds that are consistently smarter than I am and can out-think me on every level.
P(Dreaming) < 0.001; We don't dream of meticulously filling out tax forms and doing the dishes.
[ These probabilities are not discounted for the expectation of encountering additional evidence or arguments. ]
Given your argument, I'm a bit confused by why you assign such a high upper bound to P(Solipsism).
Ah, you're right. Thanks for the correction.
I edited the post above. I intended P(Solipsism) < 0.001
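To make the correction concrete: "1 - P(Solipsism) > 0.999" and "P(Solipsism) < 0.001" are the same bound, which is why the originally stated upper bound looked too high. A minimal sketch (the variable names are illustrative, not from the comment):

```python
# The two phrasings of the bound are algebraically equivalent:
# 1 - p > 0.999  rearranges to  p < 0.001.
# Check the equivalence across a few probability values,
# including the boundary case p = 0.001.
for p in [0.0, 0.0005, 0.001, 0.01, 0.5, 1.0]:
    assert (1 - p > 0.999) == (p < 0.001)
```

At the boundary p = 0.001 both conditions are false, so the strict inequalities agree everywhere.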
And now that I think a bit more about it, I realize the arguments I gave are probably not "my true objections". They are mostly appeals to (my) intuition.