Yvain comments on The Benefits of Rationality? - Less Wrong

Post author: cousin_it 31 March 2009 11:17AM




Comment author: Yvain 31 March 2009 08:07:42PM 6 points [-]

Why is that bad?

It's not, if you know you're doing it.

Are you sure that this isn't all about signaling being a truth-seeker?

Pretty sure. If I wanted to signal, I'd be a lot more high-falutin' about it. Actually, my comments do sound a bit high-falutin' (I was looking for a better word than "truth-seeker", but couldn't find one), but that wasn't exactly what I wanted to express. The untangling-wires metaphor works a little better. Nominull's "I only seek to be right because I hate being wrong" works too. It's less of a "I vow to follow the pure light of Truth though it lead me to the very pits of Hell" and more of an "Aaargh, my brain feels so muddled right now, how do I clear this up?"

Also, this would be a terrible community to signal truth-seeking in, considering how entrenched the "rationality as win" metaphor is. As I mentioned in the hair example, I think a lot more people here are signaling a burning interest in real-world application than really have one.

So, if you're saying we should seek truth just because it's the truth, and not because it brings practical benefit or pleasure or sends good signals, then what is the use of seeking truth?

Um...this line of argument applies to everything, doesn't it? What is the use of seeking money, if it doesn't bring pleasure or send good signals? What is the use of seeking love, if it doesn't bring pleasure or send good signals? What is the use of seeking 'practical benefits', if they don't bring pleasure or send good signals?

Darned if I know. That's the way my utility function works. And it certainly is mediated by pleasure and good signals, but I prefer not to say it's about pleasure and good signals because I'd rather not be turned into orgasmium just yet.

Comment author: ciphergoth 31 March 2009 09:34:58PM *  3 points [-]

Yeah, "rationalists WIN!" is the most widely misused EYism on all of LessWrong.com.

Comment author: Demosthenes 31 March 2009 10:35:00PM 2 points [-]

Yvain:

Do you really believe that you engage in Truth-Seeking for utilitarian reasons? I get the impression that you don't really believe that.

Would you be willing to enter a computer simulation where you got to investigate higher math puzzles (or metaphysics) with no applications? Spend your days in a fantastic and never-ending Truth-Seeking project (we'll throw great sex, food and housing into the holodeck for you as well)?

I liked this better at the beginning when you were prodding people who say that they see rationalism as a means to an end! You seem to be going back to consequentialism!

I don't believe that rationalists WIN because I don't believe that winning WINS.

Comment author: Nebu 14 April 2009 02:10:50PM 3 points [-]

Would you be willing to enter a computer simulation where you got to investigate higher math puzzles (or metaphysics) with no applications? Spend your days in a fantastic and never-ending Truth-Seeking project (we'll throw great sex, food and housing into the holodeck for you as well)?

Maybe a few videogames (or other forms of entertainment in addition to sex) and this sounds like a very sweet deal.

Comment author: Demosthenes 31 March 2009 10:38:14PM 1 point [-]

And you must enjoy the signal value a little bit! You aren't keeping your Less Wrong postings in your diary under lock and key!

Comment author: Demosthenes 01 April 2009 01:19:12PM 1 point [-]

loqi:

That's possible and probably partially accurate; if there were more posts taking the form "I believe X because..." on Less Wrong, I might be more open to the idea that people are doing that.

Ciphergoth:

Also, this would be a terrible community to signal truth-seeking in, considering how entrenched the "rationality as win" metaphor is. As I mentioned in the hair example, I think a lot more people here are signaling a burning interest in real-world application than really have one.

I just wanted to get Yvain's opinion about how much of the value of posting on Less Wrong was coming from signaling. Yvain suggested that this was not his or her main goal and that LW would be a uniquely poor place to attempt it. I personally doubt both of those points, but I was hoping to get some clarification, since the comments about signaling and the nature of truth-seeking don't seem to be part of a system of beliefs.

Are you worried that signaling truth-seeking isn't legitimate enough?

Comment author: ciphergoth 01 April 2009 08:16:19AM 1 point [-]

Sure, but it's pretty clear that a lot of people are enjoying the WIN! signal too. Let's try not to get too caught up in who is signalling what.

Comment author: loqi 01 April 2009 02:56:27AM 1 point [-]

Even if he did not value the signal, surely you can conjecture a rational strategy of publishing beliefs in order to refine them.

Comment author: igoresque 31 March 2009 08:39:49PM *  2 points [-]

There the danger doesn't seem to be getting something that isn't the truth, the danger is stopping at something that's just true enough for a certain purpose, and no more.

Why is that bad?

It's not, if you know you're doing it.

This is an interesting debate. I believe all the truth we'll ever get will be like the tube map: good for purpose X, and no more. Or at least, bad for purpose Y. Wanting more is surrendering to metaphysics, realism, platonism, absolutism - whatever you wish to call it.

I believe platonism shaped first the Hellenistic world, then Christianity (Paul was of Greek culture, the whole New Testament was written in Greek, and books like John's are soaked in primary platonic philosophy), and it rules until today. It also really sucks, because it makes people not want to be less wrong. They want to be completely, absolutely right, in a way you can never claim with the help of mere rationality. Only delusion can help with that.

The Truth Pilgrim's progress goes like this:

Slightly Rational -> Less Wrong -> Delusional

Comment author: pjeby 14 April 2009 03:53:57PM *  1 point [-]

Wanting more is surrendering to metaphysics, realism, platonism, absolutism - whatever you wish to call it. ....

Because it makes people not want to be less wrong. They want to be completely, absolutely right, in a way you can never claim with the help of mere rationality. Only delusion can help with that.

The Truth Pilgrim's progress goes like this:

Slightly Rational -> Less Wrong -> Delusional

Yep -- and that's probably as close to an "absolute truth" as you can get. Robert Anton Wilson's "Quantum Psychology" (bad title, awesome book, some parts approach GEB in awesomeness) has some very good information along these lines, along with lots of "class exercises" that might be useful for developing an instrumental rationality group.

Comment author: thomblake 02 April 2009 10:50:51PM 1 point [-]

Good point! Though inasmuch as one can see the history of ideas as a conflict between Plato and Aristotle (not an entirely fruitless endeavor) it's worth noting that Aristotle is still alive and kicking.

Comment author: ciphergoth 31 March 2009 10:28:04PM *  1 point [-]

I inherently value humanity's success in understanding as much as we do, but I don't discount the utility much in time; I don't much mind if we learn something later rather than earlier.

As a result, it's not that important to me to try to serve that end directly; I think it's a bigger gain to serve it indirectly, by trying to reduce the probability of extinction in the next hundred years. This also serves several other goals I value.