TheOtherDave comments on Welcome to Less Wrong! (2012) - Less Wrong

Post author: orthonormal 26 December 2011 10:57PM




Comment author: Raiden 26 February 2012 06:51:48AM 1 point [-]

Thanks for your input, it was quite enlightening. I especially appreciate the Common Sense Atheism post. That's a wonderful blog and what originally led me to this site, but I had no idea that article was on there.

Concerning what you said about the Holocaust and such, that had actually occurred to me before, but in a different manner. I reasoned that even if I felt 99% certain that my moral beliefs were accurate, there was that 1% chance that they could be wrong. Hitler may well have felt 99% certain that he was correct. I became too afraid to really do much of anything. I thought, "What if it is in some weird way the utmost evil to not kill millions of people? It seems unlikely, but it seemed unlikely to Hitler that he was in the wrong. What if it is somehow similarly wrong to try to ascertain what is right? What if rationality is somehow immoral?"

Of course I never actually consciously thought that was true, but I fear my subconscious still believes it. That is my greatest debilitation, that lingering uncertainty. I now consciously hold the idea that it is at least better to try to be right than to not try at all, that it would be better to be Hitler than to be 40 years old and living with my mom, but my subconscious still hasn't accepted that.

I believe that is why I have difficulty integrating rationality. Some part of my mind somewhere says, "But what if this is wrong? What if this is evil? You're only 99.99999999% certain. What if religious fundamentalism is the only moral choice?"

Comment author: TheOtherDave 26 February 2012 06:03:32PM 13 points [-]

It's not a rhetorical question, you know. What happens if you try to answer it?

I have a pill in my hand. I'm .99 confident that, if I take it, it will grant me a thousand units of something valuable. (It doesn't matter for our purposes right now what that unit is. We sometimes call it "utilons" around here, just for the sake of convenient reference.) But there's also a .01 chance that it will instead take away ten thousand utilons. What should I do?

It's called reasoning under uncertainty, and humans aren't very good at it naturally. Personally, my instinct is to either say "well, it's almost certain to have a good effect, so I'll take the pill" or "well, it would be really bad if it had a bad effect, so I won't take the pill", and lots of studies show that which of those I say can be influenced by all kinds of things that really have nothing to do with which choice leaves me better off.

One way to approach problems like this is by calculating expected values. Taking the pill gives me a .99 chance of 1000 utilons, and a .01 chance of -10000 utilons; the expected value is therefore .99 * 1000 - .01 * 10000 = 990 - 100 = 890; the result is positive, so I should take the pill. If I instead estimated a .9 chance of upside and a .1 chance of downside, the EV calculation would be .9 * 1000 - .1 * 10000 = 900 - 1000 = -100; negative result, so I shouldn't take the pill.
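The arithmetic above can be sketched as a tiny function. (The function name and parameters here are hypothetical illustrations, not anything from the original comment; the numbers are the comment's own.)

```python
def expected_value(p_good, gain, loss):
    """EV of the pill: probability-weighted gain minus probability-weighted loss.

    p_good: probability of the good outcome
    gain:   utilons gained in the good case
    loss:   utilons lost in the bad case (given as a positive number)
    """
    return p_good * gain - (1 - p_good) * loss

# 99% confident: 0.99 * 1000 - 0.01 * 10000 = 890 -> positive, take the pill
ev_high = expected_value(0.99, 1000, 10000)

# 90% confident: 0.90 * 1000 - 0.10 * 10000 = -100 -> negative, don't take it
ev_low = expected_value(0.90, 1000, 10000)

print(ev_high > 0, ev_low > 0)
```

The point of writing it out is that the sign of the result, not the gut feeling about "almost certain" or "really bad", is what flips the decision.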

There are weaknesses to that approach, but in a lot of cases it has definite advantages over the one that's wired into my brain.

The same principle applies if I estimate a .99 chance that by adopting the ideology in my hand, I will make better choices, and a .01 chance that adopting that ideology will lead me to do evil things instead.

Of course, what that means is that there's a huge difference between being 99% certain and being 99.99999999% certain. It means that there's a huge difference between being mistaken in a way that kills millions of people and being mistaken in a way that kills ten people. It means that it's not enough to say "that's good" or "that's evil"; I actually have to do the math, which takes effort. That's an off-putting proposition; it's far simpler to stick with my instinctive analysis, even if it's less useful.

At some point, the question becomes whether I feel like making that effort.