shokwave comments on Doublethink (Choosing to be Biased) - Less Wrong

33 Post author: Eliezer_Yudkowsky 14 September 2007 08:05PM


Comment author: [deleted] 08 December 2010 03:21:07AM 5 points

I'm through with truth.

I never had a scientific intuition. In college, I once saw a physics demonstration with a cathode ray tube -- moving a magnet bent the beam of light that showed the path of the electrons. I had never seen electrons before and it occurred to me that I had never really believed in the equations in my physics book; I knew they were the right answers to give on tests, but I wouldn't have expected to see them work.

I'm also missing the ability to estimate. Draw a line on a sheet of paper; put a dot where 75% is. Then check if you got it right. I always get that sort of thing wrong. Arithmetic estimation is even harder. Deciding how to bet in a betting game? Next to impossible.

Whatever mechanism it is that matches theory to reality, mine doesn't work very well. Whatever mechanism derives expectations about the world from probability numbers, mine hardly works at all. This is why I actually can double-think. I can see an idea as logical without believing in it.

A literate person cannot look at a sentence without reading it. But a small child, just learning to read, can look at letters on a page without reading, and has to make an extra effort to read them. In the same way, a bad rationalist can see that an idea is true, without believing it. I can read about electromagnetism and still not expect to see the beam in the cathode ray tube bend. I spent ten years or so thinking "Isn't it odd that the best arguments are on the atheist side?" without once wondering whether I should be an atheist.

Should I break down that barrier? I'm not sure. I'd do it if it would allow me to make money, I think. But not if it came at the cost of some kind of screaming Cthulhu horror.

You know what I really wish I had? Team spirit. Absolute group loyalty. Faith. Patriotism. The sense of being in the right. In Hoc Signo Vinces. I have fleeting glimpses of it but it doesn't last. I want it enough that I keep fantasizing about joining the Army because it might work. I always wanted to be a fanatic, and my brain would never do it. But I'm starting to wonder if that's hackable; I'm sure enough sleep deprivation and ritual would do it.

Comment author: shokwave 08 December 2010 05:53:42AM 2 points

Should I break down that barrier? I'm not sure. I'd do it if it would allow me to make money, I think. But not if it came at the cost of some kind of screaming Cthulhu horror.

Not to other-optimise, but yes.

As far as I can tell, the chances of encountering a true idea that is also a Lovecraftian cosmic horror are below the vanishing point for human brains. (There aren't neurons small enough to accurately reflect such tiny chances, etc.)

It will also help you make money. Example: I received a promotion for demonstrating my ability to make more efficient rosters. This ability came from googling "scheduling problem" and looking at some common solutions, recognising that GRASP-type (page 7) solutions were effective and probably human-brain-computable - and then, when I tried rostering, I intuitively implemented a pseudo-GRASP method.
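For readers unfamiliar with it, GRASP (Greedy Randomised Adaptive Search Procedure) alternates a greedy-but-randomised construction phase with a local-search improvement phase. This is a minimal illustrative sketch on a toy load-balancing version of rostering (the job numbers and function names are invented for the example, not taken from the comment):

```python
import random

def grasp_schedule(jobs, n_machines, iters=50, rcl_size=3, seed=0):
    """Minimise makespan via GRASP: randomised greedy construction + local search."""
    rng = random.Random(seed)
    best = None
    for _ in range(iters):
        # Construction phase: assign each job (largest first) to a machine
        # chosen at random from the rcl_size least-loaded ones (the RCL).
        loads = [0.0] * n_machines
        assign = []
        for job in sorted(jobs, reverse=True):
            rcl = sorted(range(n_machines), key=loads.__getitem__)[:rcl_size]
            m = rng.choice(rcl)
            loads[m] += job
            assign.append((job, m))
        # Local search phase: move a job off the busiest machine while it helps.
        improved = True
        while improved:
            improved = False
            worst = max(range(n_machines), key=loads.__getitem__)
            for i, (job, m) in enumerate(assign):
                if m != worst:
                    continue
                target = min(range(n_machines), key=loads.__getitem__)
                if loads[target] + job < loads[worst]:
                    loads[worst] -= job
                    loads[target] += job
                    assign[i] = (job, target)
                    improved = True
                    break
        makespan = max(loads)
        if best is None or makespan < best[0]:
            best = (makespan, list(assign))
    return best

makespan, assignment = grasp_schedule([7, 5, 4, 4, 3, 3, 2, 2], n_machines=3)
print(makespan)  # jobs total 30 over 3 machines, so 10 is the best possible makespan
```

The "human-brain-computable" part is plausible for this structure: "put the next shift with whoever looks least loaded, with a bit of slack in who counts as least loaded, then shuffle the worst spot afterwards" is exactly what the two phases do.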

That "intuitively implemented" bit is really important. You might not realise how much you rely on your intuition to decide for you, but it's a lot. It sounds like taking a lot of theory and jamming it into your intuition is the hard part for you.

Tangentially, how do you feel about the wisdom of age and the value of experience in making decisions?

Comment author: [deleted] 08 December 2010 12:58:48PM 1 point

I think wisdom and experience are pretty good things -- not sure how that relates though.

And "screaming Cthulhu horror" was just a cute phrase -- I don't literally believe in Lovecraft. I just mean "if rationality results in extreme misery, I'll take a pass."

Comment author: shokwave 08 December 2010 03:10:45PM 0 points

I think wisdom and experience are pretty good things -- not sure how that relates though.

Some people I have encountered struggle with my rationality because I often privilege general laws derived from decision theory and statistics over my own personal experience - like playing tit-for-tat when my gut is screaming defection, or joining in mutual fantasising about lottery wins while refusing to buy 'even one' lottery ticket. I have found that certain attitudes towards experience and age-wisdom can limit a person's ability to tag ideas as 'true in the real world' - so that reason and logic can only ever achieve 'true, but not actually applicable in the real world'. It was a possibility I thought I should check.
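Tit-for-tat itself is simple enough to state in a few lines. This sketch (my illustration, not anything from the comment) plays it against an always-defect opponent in an iterated prisoner's dilemma with the standard payoffs (temptation 5, reward 3, punishment 1, sucker 0):

```python
# Standard iterated prisoner's dilemma payoffs: (row player, column player).
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(opponent_history):
    """Cooperate first; thereafter copy the opponent's previous move."""
    return 'C' if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return 'D'

def play(strat_a, strat_b, rounds=10):
    hist_a, hist_b = [], []  # each strategy is shown the *opponent's* history
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strat_a(hist_b), strat_b(hist_a)
        pa, pb = PAYOFF[(a, b)]
        score_a += pa
        score_b += pb
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

print(play(tit_for_tat, always_defect))  # (9, 14): exploited once, then mutual defection
```

This also shows why the gut screams: against a known defector, tit-for-tat really does pay the sucker's cost on round one; the general law earns its keep only across the whole population of opponents it might meet.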

And "screaming Cthulhu horror" was just a cute phrase -- I don't literally believe in Lovecraft.

I assumed it was a reference to concepts like Roko's idea. As for ordinary extreme misery, yes, there is a case for rationality being negative. You would probably need some irrational beliefs (ones you refuse to rationally examine) to keep you off paths where rationality produces misery. You could probably get a half-decent picture of what those paths might be by questioning LessWrong about it, but that only reduces the chance - it's still a consideration.