All of L._Zoel's Comments + Replies

Does the fact that I'd do absolutely nothing differently mean that I'm already a nihilist?

How many rationalists would retain their belief in reason, if they could accurately visualize that hypothetical world in which there was no rationality and they themselves have become irrational?

2Idan Arye
No rationality, or no Bayesianism? Rationality is a general term for reasoning about reality. Bayesianism is the specific school of rationality advocated on LessWrong. A "world in which there was no rationality" is not even meaningful, just as a "world in which there was no physics" is meaningless. Even if energy and matter behaved in a way that's completely alien to us, there would still be laws that govern how they work, and you could call those laws "physics". Similarly, even if we lived in some hypothetical world where the rules of reasoning were not derived from Bayes' theorem, there would still be rules that could be thought of as that reality's rationalism.

A world without Bayesianism is easy to visualize, because we have all seen such worlds in fiction. Cartoons take this to the extreme - Wile E. Coyote paints a tunnel and expects Road Runner to crash into it, but Road Runner manages to go through. Then he expects that if Road Runner could go through, he can go through as well - but he crashes into it when he tries. Coyote's problem is that his rationalism could have worked in our world - but he is not living in our world. He is living in a cartoon world with cartoon logic, and needs a different kind of rationalism. Like... the one Bugs Bunny uses.

Bugs Bunny plugs Elmer Fudd's rifle with his finger. In our world, this could not stop the bullet. But Bugs Bunny is not living in our world - he lives in a cartoon world. He correctly predicts that the rifle will explode without harming him, and his belief in that prediction is strong enough to bet his life on it.

Now, one may claim that it is not rationality that gets messed up here - merely physics. But in the examples I picked it is not just the laws of nature that fail to work the way real-world dwellers would expect - it is consistency itself that fails. Compare this with superhero comics, where the limitations of physics are but a suggestion, but at least some effort is made to maintain consistency. When mirror maste

That's not the idea that really scares Less Wrong people.

Here's a more disturbing one: try to picture a world where all the rationality skills you're learning on Less Wrong are actually somehow flawed, and for whatever reason make it less likely that you'll discover the truth, or make you correct less often. What would that look like? Would you be able to tell the difference?

I must say, I have trouble picturing that, but I can't prove it's not true (we are basically tinkering with the way our minds work without a software manual, after all).

5Jotto999
I'm not sure what "no rationality" would mean. Evolutionarily relevant kinds of rationality can still be expected, like a preference for sexually fertile mates, fearing spiders/snakes/heights, and, if we're still talking about something at all similar to Homo sapiens, language and cultural learning and such, which require some amount of rationality to use. I wonder if you might be imagining rationality in the form of essentialism, allowing you to universally turn the attribute off, but in reality there is no such off switch that is compatible with having decision-making agents.

if they could accurately visualize that hypothetical world in which there was no rationality and they themselves have become irrational?

I just attempted to visualize such a world, and my mind ran into a brick wall. I can easily imagine a world in which I am not perfectly rational (and in fact am barely rational at all), and that world looks a lot like this world. But I can't imagine a world in which rationality doesn't exist, except as a world in which no decision-making entities exist. Because in any world in which there exist better and worse options and an entity that can model those options and choose between them with better than random chance, there exists a certain amount of rationality.

1[anonymous]
I don't know. But I would. Irrationality is caused by ignorance, so there will always be tangent worlds (while regarding this current one as prime) in which I give up. There will always be a world where anything that is physically possible occurs (and probably many where even that requirement doesn't hold). To put it another way, there has been a moment in time when I was not rational. Is that reason to give up rationality forever? Time could be just another dimension, its manipulation as far out of our grasp as that of other possible worlds.