taw comments on What do you mean by rationalism? - Less Wrong
What if you couldn't distinguish between two different reasoning mechanisms by any finite amount of observation, but they led to completely different conclusions?
The universe in which at some date in the future every paperclip turns into a non-paperclip, and every non-paperclip turns into a paperclip, would look just like the universe where no such thing ever happens.
And there are infinitely many such switching universes - one for each switching date - and only one non-switching universe. So even if each seems unlikely, their sheer number should balance that.
Are you willing to take the risk that all your effort to make more paperclips will lead to fewer paperclips, just because you assumed how the universe works?
Nice try, but correct reasoning implies a complexity penalty: hypotheses predicated on arbitrary parameters (such as a switching date) would be filtered out quickly given informative observations.
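The complexity-penalty rebuttal to the counting argument can be sketched numerically: weight each switching universe by 2^-(bits needed to write its program), encoding the date with a prefix-free code so the infinite sum converges. The base program lengths below (5 and 10 bits) and the use of an Elias-gamma code are made-up illustration choices, not anyone's actual model.

```python
def gamma_length(d: int) -> int:
    # Bits to encode a date d >= 1 with a prefix-free Elias gamma code.
    return 2 * d.bit_length() - 1

BASE_NO_SWITCH = 5    # assumed program length (bits) of "nothing ever switches"
BASE_SWITCH = 10      # assumed extra machinery (bits) for "switch at date d"

prior_no_switch = 2.0 ** -BASE_NO_SWITCH

def prior_switch_at(d: int) -> float:
    # Each switching universe pays for its base program plus its date.
    return 2.0 ** -(BASE_SWITCH + gamma_length(d))

# Summed over the first million switching dates (the full infinite sum
# converges to 2^-BASE_SWITCH), the penalized hypotheses never swamp
# the single simple non-switching one:
total_switch_mass = sum(prior_switch_at(d) for d in range(1, 10**6))
print(prior_no_switch, total_switch_mass)
```

So "infinitely many switching universes" buys only a small, finite total prior mass once each one pays for the bits that specify its date.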
Is every paperclip just as important, or does each additional paperclip matter less?
Is a certain number of paperclips exactly as valuable as a half chance of twice as many paperclips?
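The two questions are linked: if every paperclip counts the same (linear utility), a sure n is worth exactly a 50% shot at 2n; if each additional paperclip matters less (a concave utility - sqrt here is an assumed example, not a claim about any actual paperclip maximizer's values), the sure thing wins.

```python
from math import sqrt

def linear_u(clips: float) -> float:
    # Every paperclip counts the same.
    return clips

def concave_u(clips: float) -> float:
    # Assumed diminishing-returns utility for illustration.
    return sqrt(clips)

n = 100
for u in (linear_u, concave_u):
    certain = u(n)                              # sure n paperclips
    gamble = 0.5 * u(2 * n) + 0.5 * u(0)        # coin flip: 2n or nothing
    print(u.__name__, certain, gamble)
```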
You're saying "complexity penalty", but it is not that complex to describe 3^^^3 paperclips. The number of possible paperclips can grow far, far faster than the complexity of describing it.
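The point about short descriptions of huge numbers can be made concrete with the standard Knuth up-arrow definition (3^^^3 itself is far too large to actually compute, so the sketch stops one level down):

```python
def up_arrow(a: int, n: int, b: int) -> int:
    # a (up-arrow n) b: n=1 is plain exponentiation,
    # each extra arrow iterates the operation below it.
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

# 3^^3 = 3^(3^3) = 3^27 = 7625597484987: thirteen digits from a
# four-character expression.
print(up_arrow(3, 2, 3))
# 3^^^3 = 3^^(3^^3): a power tower of 3s roughly 7.6 trillion levels tall,
# still named by just five characters.
```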