I was recently re-reading a piece by Yvain/Scott Alexander called Epistemic Learned Helplessness. It's a very insightful post, as is typical for Scott, and I recommend giving it a read if you haven't already. In it he writes:
When I was young I used to read pseudohistory books; Immanuel Velikovsky's Ages in Chaos is a good example of the best this genre has to offer. I read it and it seemed so obviously correct, so perfect, that I could barely bring myself to bother to search out rebuttals.
And then I read the rebuttals, and they were so obviously correct, so devastating, that I couldn't believe I had ever been so dumb as to believe Velikovsky.
And then I read the rebuttals to the rebuttals, and they were so obviously correct that I felt silly for ever doubting.
And so on for several more iterations, until the labyrinth of doubt seemed inescapable.
He goes on to conclude that the skill of taking ideas seriously - often considered one of the most important traits a rationalist can have - is a dangerous one. After all, it's very easy for arguments to sound convincing even when they're not, and if you're too easily swayed by argument you can end up with some very absurd beliefs (like that Venus is a comet, say).
This post really resonated with me. I've had several experiences similar to what Scott describes, of being trapped between two debaters who both had a convincingness that exceeded my ability to discern truth. And my reaction in those situations was similar to his: eventually, after going through the endless chain of rebuttals and counter-rebuttals, changing my mind at each turn, I was forced to throw up my hands and admit that I probably wasn't going to be able to determine the truth of the matter - at least, not without spending a lot more time investigating the different claims than I was willing to. And so in many cases I ended up adopting a sort of semi-principled stance of agnosticism: unless it was a really really important question (in which case I was sort of obligated to do the hard work of investigating the matter to actually figure out the truth), I would just say "I don't know" when asked for my opinion.
[Non-exhaustive list of areas in which I am currently epistemically helpless: geopolitics (in particular the Israel/Palestine situation), anthropics, nutrition science, population ethics]
All of which is to say: I think Scott is basically right here, in many cases we shouldn't have too strong of an opinion on complicated matters. But when I re-read the piece recently I was struck by the fact that his whole argument could be summed up much more succinctly (albeit much less precisely) as:
"Don't be gullible."
Huh. Sounds a lot more obvious that way.
Now, don't get me wrong: this is still good advice. I think people should endeavour to not be gullible if at all possible. But it makes you wonder: why did Scott feel the need to write a post denouncing gullibility? After all, most people kind of already think being gullible is bad - who exactly is he arguing against here?
Well, recall that he wrote the post in response to the notion that people should believe arguments and take ideas seriously. These sound like good, LW-approved ideas, but note that unless you're already exceptionally smart or exceptionally well-informed, believing arguments and taking ideas seriously is tantamount to...well, to being gullible. In fact, you could probably think of gullibility as a kind of extreme and pathological form of lightness; a willingness to be swept away by the winds of evidence, no matter how strong (or weak) they may be.
There seems to be some tension here. On the one hand we have an intuitive belief that gullibility is bad; that the proper response to any new claim should be skepticism. But on the other hand we also have some epistemic norms here at LW that are - well, maybe they don't endorse being gullible, but they don't exactly not endorse it either. I'd say the LW memeplex is at least mildly friendly towards the notion that one should believe conclusions that come from convincing-sounding arguments, even if they seem absurd. A core tenet of LW is that we change our mind too little, not too much, and we're certainly all in favour of lightness as a virtue.
Anyway, I thought about this tension for a while and came to the conclusion that I had probably just lost sight of my purpose. The goal of (epistemic) rationality isn't to not be gullible or not be skeptical - the goal is to form correct beliefs, full stop. Terms like gullibility and skepticism are useful to the extent that people tend to be systematically overly accepting or dismissive of new arguments - individual beliefs themselves are simply either right or wrong. So, for example, if we do studies and find out that people tend to accept new ideas too easily on average, then we can write posts explaining why we should all be less gullible, and give tips on how to accomplish this. And if on the other hand it turns out that people actually accept far too few new ideas on average, then we can start talking about how we're all much too skeptical and how we can combat that. But in the end, in terms of becoming less wrong, there's no sense in which gullibility would be intrinsically better or worse than skepticism - they're both just words we use to describe deviations from the ideal, which is accepting only true ideas and rejecting only false ones.
This answer basically wrapped the matter up to my satisfaction, and resolved the sense of tension I was feeling. But afterwards I was left with an additional interesting thought: might gullibility be, if not a desirable end point, then an easier starting point on the path to rationality?
That is: no one should aspire to be gullible, obviously. That would be aspiring towards imperfection. But if you were setting out on a journey to become more rational, and you were forced to choose between starting off too gullible or too skeptical, could gullibility be an easier initial condition?
I think it might be. It strikes me that if you start off too gullible you begin with an important skill: you already know how to change your mind. In fact, changing your mind is in some ways your default setting if you're gullible. And considering that like half the freakin sequences were devoted to learning how to actually change your mind, starting off with some practice in that department could be a very good thing.
I consider myself to be...well, maybe not more gullible than average in absolute terms - I don't get sucked into pyramid scams or send money to Nigerian princes or anything like that. But I'm probably more gullible than average for my intelligence level. There's an old discussion post I wrote a few years back that serves as a perfect demonstration of this (I won't link to it out of embarrassment, but I'm sure you could find it if you looked). And again, this isn't a good thing - to the extent that I'm overly gullible, I aspire to become less gullible (Tsuyoku Naritai!). I'm not trying to excuse any of my past behaviour. But when I look back on my still-ongoing journey towards rationality, I can see that my ability to abandon old ideas at the (relative) drop of a hat has been tremendously useful so far, and I do attribute that ability in part to years of practice at...well, at believing things that people told me, and sometimes gullibly believing things that people told me. Call it epistemic deferentiality, or something - the tacit belief that other people know better than you (especially if they're speaking confidently) and that you should listen to them. It's certainly not a character trait you're going to want to keep as a rationalist, and I'm still trying to do what I can to get rid of it - but as a starting point? You could do worse I think.
Now, I don't pretend that the above is anything more than a plausibility argument, and maybe not a strong one at that. For one, I'm not sure how well this idea carves reality at its joints - after all, gullibility isn't quite the same thing as lightness, even if they're closely related. For another, if the above were true, you would probably expect LWers to be more gullible than average. But that doesn't seem quite right - while LW is admirably willing to engage with new ideas, no matter how absurd they might seem, the default attitude towards a new idea on this site is still one of intense skepticism. Post something half-baked on LW and you will be torn to shreds. Which is great, of course, and I wouldn't have it any other way - but it doesn't really sound like the behaviour of a website full of gullible people.
(Of course, on the other hand it could be that LWers really are more gullible than average, but they're just smart enough to compensate for it.)
Anyway, I'm not sure what to make of this idea, but it seemed interesting and worth a discussion post at least. I'm curious to hear what people think: does any of the above ring true to you? How helpful do you think gullibility is, if it is at all? Can you be "light" without being gullible? And for the sake of collecting information: do you consider yourself to be more or less gullible than average for someone of your intelligence level?
I haven't lived in Eastern Europe for about 10 years now. When I did, it felt a lot like a "gangsta" culture, like in GTA: San Andreas, especially in the nightlife / club scene: big buff aggressive guys and stripperish girls with infantile Hello Kitty accessories - does that come across as functional? I have lived in the UK, which is probably the closest to US culture around here - I must admit I did not much like the music pubs, with the fat girls being drunk and cussing and even fighting as if they were male sailors, but as my expertise was in manufacturing software, I lived in a really industrial (read: prole) area near Dudley, so that is not really a good sample. It is just prole culture for the most part. Now, living in Vienna, the only serious social dysfunction I see is everybody being fscking old - it has a retirement-home vibe. Demographics screwed up.
But what does it have to do with the problem I raised with the word "access"? The problem I raised is that it is a dehumanizing term that ignores the romantic and loving aspects of relationships, and even ignores how sex is a mutually pleasurable, participatory act; it objectifies women as something passive, handing out sex as a reward - basically it has something akin to a prostitution vibe. This is not how a healthy relationship works. It's not even how a healthy one-night stand works - that is based on mutual desire and mutual escalation. The term feels incredibly transactional at best and objectifying at worst.
But I am not trying to point a moral finger here. The issue is not that this is morally wrong; the issue is the inferential distance: there is not one objectively examinable set of human behaviors - the author and I are thinking and talking about entirely differently behaving humans. How the heck do you reach a rational conclusion from that? There is hardly a shared set of experiences, because there are hardly any shared values, goals, or motives.
Yes, but the motives would be entirely different - and yes, they matter. The human mind is apparently optimized too well for winning arguments rather than for being right. This suggests that listening to arguments is not even a good way to find truth, but even when you do, you at least need some idea of the other person's personality and motives - where they are coming from and where they want to go. You have to be at least in the same tribe, in the sense of shared motives and goals. This is even true in physics - the difference being that academia has a very good institutional setup for sharing goals, motivations, and values. Academia built a tribe in natural science. Go outside academia and you find the same mess - "Vedic science" guys arguing with UFO believers and so on. Across tribes, it doesn't work.
The point is that, from what I've heard, Hungary is a culture where someone whose "interest in women is loving them, being loved by them, and making love, in that order" has a chance of winding up with a woman.