lfghjkl comments on In praise of gullibility? - Less Wrong

23 Post author: ahbwramc 18 June 2015 04:52AM


Comment author: [deleted] 18 June 2015 08:26:43AM 4 points [-]

I've had several experiences similar to what Scott describes, of being trapped between two debaters who both had a convincingness that exceeded my ability to discern truth.

That is how I always feel.

I see a lot of rational-sounding arguments from red-pillers, manosphere types, conservatives, reactionaries, libertarians, and their ilk. And then I see the counter-arguments from liberals, feminists, leftists, and their ilk, which pretty much boil down to the other side just being uncompassionate assholes who are desperately rationalizing it with arguments. Well, rationalizing is a very universal human tendency, and they do sometimes seem like really selfish people... so I really don't know who to believe.

Or climate change. What little I know about the scientific method says this is NOT how you do science. You don't just build a computer simulation in 1980 or so that predicts the oceans boiling away by 2000, and then, when that fails to happen, tweak it and say that this second time you've surely got it right. Yet pretty much every prestigious scientist supports the "alarmist" side, while on the other side I see only marginal, low-status "cranks" - who are, curiously, politically motivated. So whom do I support?

In such dilemmas, I think the best thing is to figure out what your "corrupted hardware" wants to do and then do the opposite - the opposite of what your instincts, i.e. your evolved biases, suggest.

Well, no luck. On one side I see people who are high-status and intellectual, and who look really nice, empathic, and compassionate. Of course my instincts like that. On the other side I see people who look brave, tough, critical-minded, and creative, and who seem far more historically literate - so basically NRx-ers, libertarians, and similar folks give me that kind of "inventor" vibe, which my instincts also happen to like.

I like both sides - and yet, to decide rationally, I should probably choose whichever one I instinctively dislike.

Comment author: lfghjkl 18 June 2015 10:16:15AM 0 points [-]

In such dilemmas, I think the best thing is to figure out what your "corrupted hardware" wants to do and then do the opposite - the opposite of what your instincts, i.e. your evolved biases, suggest.

Reversed Stupidity Is Not Intelligence

Comment author: [deleted] 18 June 2015 11:19:24AM 4 points [-]

Instinct != stupidity; this is a different thing. Leaning towards an idea comes both from finding it true and from liking it. If you lean equally towards two ideas but like one more, that suggests you subconsciously find the liked one less true. So if you go for the one you dislike, you are probably going for the idea you subconsciously find more true. Leaning towards an idea you dislike suggests you found so much truth in it, subconsciously, that it overcame even the ugh field created by your dislike - and that is a remarkable amount of truth.

Reversed stupidity is a different thing. It is more like: "Since there is no such thing as Adam and Eve's original sin, human nature cannot have any factory bugs and must be infinitely perfectible" (Age of Enlightenment philosophy). That is reversed stupidity.

What I am describing is a different thing: it is reversed affect.

Comment author: lfghjkl 18 June 2015 07:36:08PM *  0 points [-]

If you lean equally towards two ideas but like one more, that suggests you subconsciously find the liked one less true.

It could also simply mean that you think the evidence for that proposition is better. Your argument looks more like post-hoc reasoning towards a preferred conclusion than something that is empirically true.

Reversed stupidity is a different thing.

I'm sorry, but if the ideas you subconsciously like are false more often than chance would predict, then this quote still applies:

If you knew someone who was wrong 99.99% of the time on yes-or-no questions, you could obtain 99.99% accuracy just by reversing their answers. They would need to do all the work of obtaining good evidence entangled with reality, and processing that evidence coherently, just to anticorrelate that reliably. They would have to be superintelligent to be that stupid.
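The arithmetic in that quote is easy to check with a quick simulation (a hypothetical sketch, not anyone's actual method: the "anti-oracle" here is just a coin flip corrupted to be wrong at a fixed rate). Reversing a 99.99%-wrong predictor is nearly perfect; reversing a merely somewhat-biased one buys you almost nothing, which is why reliable anticorrelation would itself require superintelligence:

```python
import random

def reversed_accuracy(error_rate, n=100_000, seed=0):
    """Accuracy obtained by reversing the answers of a predictor that is
    wrong with probability `error_rate` on random yes/no questions."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(n):
        truth = rng.random() < 0.5                      # the true answer
        wrong = rng.random() < error_rate               # does the predictor err?
        answer = (not truth) if wrong else truth        # predictor's answer
        if (not answer) == truth:                       # we reverse it
            correct += 1
    return correct / n

# A 99.99%-wrong "anti-oracle": reversing it is almost perfect.
print(reversed_accuracy(0.9999))
# Someone wrong only 55% of the time: reversing yields only ~55% accuracy.
print(reversed_accuracy(0.55))
```

The second case is the realistic one: ordinary bias gives weak anticorrelation at best, so "do the opposite of what you like" is a very noisy signal, not a shortcut to truth.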

You cannot determine the truth of a proposition from whether or not you like it; you have to look at the evidence itself. There are no shortcuts here.