dxu comments on The path of the rationalist - Less Wrong
Nobody asked me to take either vow. Doing so isn't in the spirit of this community. The only reason someone might vow is to create a precommitment. You didn't make any decent argument about why this is a case where a precommitment is useful.
There's nothing to be won by using imprecise language when one wants to teach clear thinking. Simplicity is a virtue.
There's nothing wrong with willing reality to be different. It leads to actions that change reality.
Rationality also is about winning. There are cases where the truth isn't the most important thing.
This whole example suffers from what Nassim Taleb calls the ludic fallacy. Balls in urns do have fixed probabilities but in most cases in life we don't have probabilities that are known in the same way.
I believe that's why So8res referred to it as a vow to yourself, not anyone else. Also note that this is a series of posts meant to introduce people to Rationality: AI to Zombies, not "this community" (by which I assume you mean LW).
This seems like a willful misreading of the essay's point. It seems obvious from context that So8res is referring here to motivated cognition, which does indeed have something wrong with it.
I also haven't heard anybody speak about taking those kinds of vows to oneself before.
I consider basics to be important. If we allow vague statements about basic principles of rationality to stand we don't improve our understanding of rationality.
Willing is not the problem with motivated cognition. Having desires for reality to be different is not the problem. You don't need to be a Straw Vulcan, without any desire or will, to be rational.
Furthermore "Shut up and do the impossible" from the sequences is about "trying to will reality into being a certain way".
It's not literal. It's an attempt at poetic language, like The Twelve Virtues of Rationality.
I think the "The Twelve Virtues of Rationality" actually makes an argument that those things are virtues.
Its start is also quite fitting: "The first virtue is curiosity. A burning itch to know is higher than a solemn vow to pursue truth."
It argues against the frame of vows.
Withdrawing into mysticism where anything goes is bad. Obfuscating is bad. It's quite easy to say something that triggers rationalist applause lights. Critical thinking, and actually thinking through the implications of using the frame of a vow, is harder. Getting less wrong about what one happens to think is rational is hard.
Mystic writing that's too vague to be questioned doesn't really have a place here.
Sure, I agree with all of that. I was just trying to get at the root of why "nobody asked [you] to take either vow".
The fact that I haven't taken a literal vow is true, but the meaning of what I was saying goes beyond that point.
The root is that nobody asked me in a metaphorical way to take a vow either. Eliezer asked for curiosity instead of a solemn vow in the talk about rationalist virtues.
There are reasons why that's the case.
Er, yes, someone has. In fact, Eliezer has asked you to do so. From the Twelve Virtues:
This is the exact same thing that the article is saying:
No, it's about actually finding the way to force reality into some state others considered so implausible that they hastily labeled it impossible. Saying, "If the probability isn't 0%, then to me it's as good as 100%!" isn't saying you can defy probability, but merely that you have a lot of information and compute-power. Or it might even just be expressing a lot of emotional confidence for someone else's sake.
(Or that you can solve your problems with giant robots, which is always the awesomer option.)
The sentence "trying to will reality into being a certain way". doesn't say anything about p=0 or defying probability.
This is what is known as "neglecting context". Right after the sentence you originally quoted from the article, we see this:
I'm not quite sure why you're having difficulty understanding this. "Willing reality into being a certain way", in this context, does not mean desiring to change the world, but rather shifting one's probability estimates toward one's desired conclusion. For example, I have a strong preference that UFAI not be created. However, it would be a mistake for me to then assign a 0.00001% probability to the creation of UFAI purely because I don't want it to be created; the true probability is going to be higher than that. I might work harder to stop the creation of UFAI, which is what you mean by "willing reality", but that is clearly not the meaning the article is using.