TheOtherDave comments on Welcome to Less Wrong! (2012) - Less Wrong

25 Post author: orthonormal 26 December 2011 10:57PM


Comment author: anotherblackhat 03 March 2012 04:16:02AM 0 points

I assume you meant "more in the same vein" rather than simply "again".

I perceive this as a difference between myself and the group because of the large number of posts I've read saying that rationalists should believe what is true and not believe what is false. The sentiment "that which can be destroyed by the truth should be" is repeated several times in several different places. My memory is far from perfect, but I don't recall any arguments in favor of lies. You claim most rationalists are in favor of "white lies"? I didn't get that from my reading. But then I've only started in on the site, and it will probably take me weeks to absorb a significant part of it, so if someone can give me a pointer, I'd be grateful.

I am much more inclined to go along with the "rationalists should win" line of thought. I want to believe whatever is useful. For example, I believe that it's impossible to simulate intelligence without being intelligent. I've thought about it, and I have reasons for that belief, but I can't prove it's true, and I don't care. "Knowing" that it's impossible to simulate intelligence without being intelligent lets me look at the Chinese Room Argument and conclude instantly that it's wrong. It's useful to believe that simulated intelligence requires actual intelligence. If you want me to stop believing it, you need only show me the lie in the belief. But if you want me to evangelize the truth, you'd need to show me the harm in the lie as well.

Santa Claus isn't a white lie. Santa Claus is a massive conspiracy, a gigantic cover-up perpetrated by millions of adults. Lies on top of lies, with corporations getting in on the action to sell products, a lie that when discovered leaves children shattered, their confidence in the world shaken. And yet, it increases the amount of joy in the world by a noticeable amount. It brings families together, and it teaches us to be caring and giving. YMMV of course, but many would consider Christmas utilons > Christmas evilons.

Most importantly, Santa persists. People make mistakes, but natural selection removes really bad mistakes from the meme pool. As a rule of thumb, things that people actually do are far more likely to be good for them than bad, or at least not harmful. I believe that's a large part of why, when theory says X and intuition says Y, we look very long and hard before accepting the theory as correct. Our intuitions aren't always correct, but they are usually correct. There are some lies we believe intuitively. In the court of public opinion, I believe they should be presumed good until proven harmful.

Comment author: TheOtherDave 03 March 2012 05:35:34AM 0 points

Well, choosing to believe lies that are widely believed is certainly convenient: it does not put me at risk of conflict with my tribe, does not require me to put in the effort of believing one thing while asserting another to avoid such conflict, and does not require me to put in the effort of carefully evaluating those beliefs.

Whether it's useful -- that is, whether believing a popular lie leaves me better off in the long run than failing to believe it -- I'm not so sure. For example, can you clarify how your belief about the impossibility of simulating intelligence with an unintelligent system, supposing it's false, leaves you better off than if you knew the truth?

Comment author: anotherblackhat 03 March 2012 05:35:40PM 0 points

O.k., suppose it's false. Rather than wasting time disproving the CRA, I simply act on my "false" belief and reject the CRA out of hand. Since the CRA is invalid for many other reasons as well, I'm still right. Win.

Generalizing: say I have an approximation that usually gives me the right answer, but on rare occasions gives a wrong one. If I work through a much more complicated method, I can arrive at the correct answer. I believe the approximation is correct. As long as:
effort involved in complicated method > cost of being wrong
I'm better off not using the complicated method. If I knew the truth, then I could still use the approximation, but I'd now have an extra step in my thinking. Instead of:
1. Approximate.
2. Reject.
it's
1. Approximate.
2. Ignore possibility of being wrong.
3. Reject.
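The trade-off above can be sketched as a simple expected-cost comparison. This is only an illustration of the reasoning, not anything from the original thread, and all the numbers (error rate, error cost, effort) are made up for the example:

```python
def expected_cost(p_wrong: float, cost_of_error: float, effort: float) -> float:
    """Expected cost of a method: its fixed effort plus
    its error rate times the cost of being wrong."""
    return effort + p_wrong * cost_of_error

# Hypothetical numbers: the approximation is cheap (1 unit of effort)
# but wrong 1% of the time; the exact method is never wrong but costs
# 10 units of effort every time it is used.
approx = expected_cost(p_wrong=0.01, cost_of_error=100, effort=1)
exact = expected_cost(p_wrong=0.0, cost_of_error=100, effort=10)

print("approximation:", approx)   # 1 + 0.01 * 100 = 2.0
print("exact method:", exact)     # 10.0
print("use approximation" if approx < exact else "use exact method")
```

With these made-up numbers the approximation wins; the comparison flips only when errors are frequent or costly enough that p_wrong * cost_of_error exceeds the effort saved.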

Comment author: TheOtherDave 03 March 2012 09:06:58PM 0 points

Ah, I see what you mean. Sure, agreed: as long as the false beliefs I arrive at using method A, which I would have avoided using method B, cost me less to hold than the additional costs of B, I do better with method A despite holding more false beliefs. And, sure, if the majority of false-belief-generating methods have this property, then it follows that I do well to adopt false-belief-generating methods as a matter of policy.

I don't think that's true of the world, but I also don't think I can convince you of that if your experience of the world hasn't already done so.

I'm reminded of a girl I dated in college who had a favorite card trick: she would ask someone to pick a card, then say "Is your card the King of Clubs?" She was usually wrong, of course, but she figured that when she was right it would be really impressive.