Related to: Two Kinds of Irrationality and How to Avoid One of Them

When I was a teenager, I picked up my mom's copy of Dale Carnegie's How to Win Friends and Influence People. One of the chapters that made the biggest impression on me was titled "You Can't Win an Argument," in which Carnegie writes:

Nine times out of ten, an argument ends with each of the contestants more firmly convinced than ever that he is absolutely right.

You can’t win an argument. You can’t because if you lose it, you lose it; and if you win it, you lose it. Why? Well, suppose you triumph over the other man and shoot his argument full of holes and prove that he is non compos mentis. Then what? You will feel fine. But what about him? You have made him feel inferior. You have hurt his pride. He will resent your triumph. And -

"A man convinced against his will 

"Is of the same opinion still."

In the next chapter, Carnegie quotes Benjamin Franklin describing how he had made it a rule never to contradict anyone. Carnegie approves: he thinks you should never argue with or contradict anyone, because you won't convince them (even if you "hurl at them all the logic of a Plato or an Immanuel Kant"), and you'll just make them mad at you.

It may seem strange to hear this advice cited on a rationalist blog, because the atheo-skeptico-rational-sphere violates it on a routine basis. In fact, I've never tried to follow Carnegie's advice—and yet, I don't think the rationale behind it is completely stupid. Carnegie gets human psychology right, and I fondly remember reading his book as the moment I first really got clued in about human irrationality.

Importantly, people's resistance to being told they're wrong is quite general: it's not restricted to specific topics like religion or politics. The "You Can't Win an Argument" chapter begins with a story about a man who refused to accept that the quotation "There's a divinity that shapes our ends, rough-hew them how we will" came from Hamlet rather than the Bible. Carnegie correctly identifies the reason people can be irrational about such seemingly unimportant questions: pride.

In fact, if Carnegie's book has one overarching theme, it's the incredible power of the human need to think highly of ourselves (individually, not as a species). It opens with stories of a number of gangsters who insisted against all evidence that they were good people (including Al Capone, and a couple of now-forgotten names that were contemporary references at the time the book was written in 1936). By the end of that first chapter, those examples have been spun into what I suppose was intended to be a positive, upbeat message: "Don't criticize, condemn, or complain."

It had the probably unintended effect, though, of helping to give me a deep cynicism about human nature, a cynicism which persists to this day. In particular, I saw in a flash that what Carnegie was saying implied you could get people to support some deeply horrible causes, as long as you presented the cause in a way that told them how wonderful they were. I think I even had an inkling at the time that there was some evolutionary explanation for this. I can't claim to have exactly derived Robert Trivers' theory of self-deception on my own, but I certainly was primed to accept the idea when I got around to reading Steven Pinker in college.

(Around very roughly the same time as I read How to Win Friends and Influence People, I read Homer's epics, which served as the other early building block in my present cynicism. It was Homer who taught me there had once been a culture that held that raping women taken captive in war was a perfectly normal thing to do, even suitable behavior for "heroes.")

But such cynicism is a post for another day. When it comes to rationality, the effect of Carnegie's book was this: even after having read all of the Sequences and all of HPMOR, I still think that the human need to think highly of ourselves is a far more important source of human irrationality than, oh, say, the fundamental attribution error or the planning fallacy. It does seem foolish to be so strongly influenced by one book I read in my early teens, but on the other hand, the evidence I've encountered since then (for example, learning about Trivers' theory of self-deception) seems to me to confirm this view.

So why do I go on arguing with people and telling them they're wrong in spite of all this? Well, even if nine times out of ten arguing doesn't change anyone's mind, sometimes the one time out of ten is worth it. Sometimes. Not always. Actually, with most people, I'm unlikely to argue in person. I'm much more likely to argue in a public internet forum, where even if I don't persuade the person I'm directly talking to, I might persuade some of the lurkers.

Now there are various tactics for trying to change people's minds without directly telling them they're wrong. Bryan Caplan's The Myth of the Rational Voter has a section on how to improve undergraduate economics classes, which includes this observation: "'I'm right, you're wrong,' falls flat, but 'I'm right, the people outside this classroom are wrong, and you don't want to be like them, do you?' is, in my experience, fairly effective." Of course, this doesn't work if the other person has definitely made up their mind.

There's also the Socratic method, which Carnegie sings the praises of. I think many people get the wrong idea about the Socratic method, because the most famous source for it is Plato's dialogues, which are works of fiction and tend to have things go much better for Socrates than they ever would in real life. From reading Xenophon's Memorabilia, my impression is that the historical Socrates was probably something of a smartass who was not very good at winning friends or influencing most of his immediate contemporaries. (They did vote to kill him, after all.)

There may be a version of the Socratic method that's more likely to actually make progress changing people's minds. I recently read Peter Boghossian's A Manual for Creating Atheists, a how-to book for atheists who want to get better at talking to believers about religion. Boghossian's approach is heavily inspired by Socrates, and the examples of conversation he gives, based on actual conversations he's had with believers, are far more believable than Plato's—indeed, I'm left wondering if he used a tape recorder.

What most stands out about those conversations is Boghossian's patience. He politely keeps asking questions as the conversation seemingly goes round in circles, sometimes even shutting up and listening as his interlocutors spend several minutes basically repeating themselves, or going off on a tangent about the leadership structure of their church.

I bet Boghossian's techniques are great if you have the time and patience to master and apply them—but you won't always have that. So while I recommend the book, I don't think it can always substitute for straight-up telling people they're wrong.

Oh, and then there's just plain old-fashioned trying to be polite and direct at the same time. But that doesn't always work either. As Daniel Dennett once said, "I listen to all these complaints about rudeness and intemperateness, and the opinion that I come to is that there is no polite way of asking somebody: have you considered the possibility that your entire life has been devoted to a delusion?"

In spite of all this, there's still a tradeoff you're making when you criticize people directly. I've known that for roughly half my life, and have often made the tradeoff gladly. I tend to assume other rationalists know this too, and make the tradeoff consciously as well.

But sometimes I wonder.

How many people on LessWrong realize that when you tell someone their AI project is dangerously stupid, or that their favorite charity is a waste of money, you risk losing them forever—and not because of anything to do with the subtler human biases, but just because most people hate being told they're wrong?

If you are making a conscious tradeoff there, more power to you! Those things need saying! But if you're not... well, at the very least, you might want to think a little harder about what you're doing.

80 comments

"A man convinced against his will Is of the same opinion still."

I think you need a longer time span to see this is quite often false. What has happened many times is I argue with my friend or my parent and "win" while they're defending their position to the teeth. Months later, they present my argument to me as their own as if the previous discussion never happened. Some people's forgetfulness amazes me, but I suspect I've changed my mind this way without noticing too.

Admitting you're wrong is quite different from changing your mind. Even so, I hope I don't argue to win these days anymore.

Yes yes yes yes yes.
This still amazes me every time it happens.

It can also happen without the "winning." That is, I've had experiences like:
Sam: "A, because X."
Me: "Well... I dunno. I mean, I've experienced X, sure, and I agree that X is evidence of A. But I've also experienced Y and Z, which seem to me to be evidence of B. And if I take (X and Y and Z) all together, (B and NOT A) seems much more plausible than (B and A)."
Sam: "I still think A."
Me: "Well, I agree that (X and Y and Z and A and B) is not absurdly improbable, I just think it's less likely than (X and Y and Z and not-A and B). So you might be right."
(wait a while)
Sam: "Not-A, because B."

And I know for certain that I've been Sam in exchanges like this as well. Worse, I know that in some cases I toggle. This is embarrassing, but it is what it is.

hyporational (3 points, 10y)
How do you know? Have people told you this? I'd be interested to hear about a toggling situation. I guess in some cases people know they've changed their mind but don't remember exactly who they had the crucial discussion with, so they don't realize they're admitting to you they were wrong. Most cases I remember can't be explained this way because I'm probably the only one they've discussed these particular topics with. In some cases you can sort of plant seeds in their minds and watch them grow over time without them noticing.

How do you know? Have people told you this?

Yeah, essentially. I've been living with the same guy for 20 years, and when he reminds me that I've said "A" in the past I can remember having said A, despite believing that I've always believed not-A, and it seems more likely that I'm mis-remembering my own earlier beliefs than that I was lying to him at the time. Similarly, when he reminds me that he's previously reminded me that I've said "A" in the past and I've had trouble believing that, I can remember that conversation, despite believing that I've always believed A.

Of course, it's certainly possible that I'm just being suggestible and editing memories realtime, but it doesn't feel that way. And if I'm that suggestible, which I might very well be, that makes it even more plausible that I've toggled. So I'm pretty confident that he's right.

What has happened many times is I argue with my friend or my parent and "win" while they're defending their position to the teeth. Months later, they present my argument to me as their own as if the previous discussion never happened.

I have known one person for whom this was a deliberate policy. He would never (he said) admit to changing his mind about anything. If he did change his mind as a result of an argument, he would merely cease advocating the view he now thought erroneous, and after some suitable lapse of time, advocate what he now believed, as if he had believed it all along.

Alicorn (8 points, 10y)
Did he have a reason for this policy?

Not that he said, but I guess it was a status thing. Another curious feature of his discourse was that on mailing lists he would never post a direct reply to anything, with a "Re:" in the subject line. He engaged with the conversations, but always framed his postings as if they were entirely new contributions -- as if one were to participate here by only posting top level articles. I assume this was also about status.

JackV (1 point, 10y)
FWIW, I always struggle to embrace it when I change my mind ("Yay, I'm less wrong!"). But I admit I find it hard: "advocating a new point of view" is a lot easier than "admitting I was wrong about a previous point of view". So maybe striving to do #1, whether or not you've done #2, would help me change my mind in response to new information a lot quicker?
arundelo (2 points, 10y)
If you don't mind saying, did you like this guy? Just from this comment, I think he's an asshole, but maybe I'd think differently if I knew him.
Richard_Kennaway (4 points, 10y)
I didn't know him personally, he was just on a couple of mailing lists I was on. I don't think I would have cared to.
KnaveOfAllTrades (9 points, 10y)
+1 this memory thing is a thing.
hyporational (3 points, 10y)
Good to know someone else is experiencing this too. It probably relates to the cognitive dissonance thing Chris was talking about in the earlier post, which is the reason I suspect I might be making the mistake too. I think it deserves a name of its own, if it doesn't already have one.
DanielLC (7 points, 10y)
I don't think it's just forgetfulness. I've had my mind changed by an argument. What has never happened is for my mind to be changed during an argument (barring cases where I can just look it up). Changing your mind takes some serious time and thought, as it should. It's not that I don't want to lose face admitting I'm wrong. It's that I don't feel like bringing it up again.
hyporational (1 point, 10y)
I agree it's not just forgetfulness and what you're saying probably happens more often. The situations I had in mind though cannot be explained your way. I agree, but I think often most of that transformation happens subconsciously. I think we should rather focus on changing people's minds than getting them to admit they're wrong.
owencb (2 points, 10y)
I once went to a talk in which Christopher Zeeman modelled this behaviour using catastrophe theory. I'm not sure you need the mathematics for the thesis, which was (roughly) that arguing for your position pushes people towards it in their underlying beliefs, but also pushes people to be more defensive about their initial beliefs (because it's a conflict situation). When they go away afterwards and calm down, they may find that they have moved towards your position ... without necessarily remembering the argument as having any part in it. He claimed to have applied this theory successfully to push a committee he was on, by making a big fuss months before the final decision on the topic was needed.
christopherj (2 points, 10y)
No, it is quite often true, though obviously not an absolute. I've seen people concede to an argument, only to entirely forget about it and start from scratch from their original position. I know that they weren't just pretending because I have done this myself -- and if it weren't for the internet and some dude pointing back to my history, I'd have never known about it. A deeply held belief is well-entrenched in memory, whereas a change of mind and the rationale for the change can be easily forgotten. I've also done like you said, and incorporated other people's arguments against me as my own, after I've changed my mind. After all, a valid argument is valid no matter who made it, and I know I am very prone to source amnesia. My experience arguing with other people is that you cannot change a person's deeply held beliefs (you can be one of many contributors to an eventual change, but there's no way you alone can do it). If you want any chance of success changing someone's mind, correct them on easily verifiable facts, not on complex topics.
hyporational (1 point, 10y)
It can be both quite often true and quite often false. Thank you especially for "source amnesia"; it's perhaps the most important part of the phenomenon, though it doesn't fully explain it. My experience is it depends on the person, and how much their environment reinforces those beliefs. While this is good advice, I think people incorporate much more from an argument than they explicitly accept or recall. It takes time for good arguments to sink in, and that can happen even if you don't consciously think about them.
TheAncientGeek (0 points, 10y)
Are you sure it's forgetfulness? If someone is after status, it is instrumentally rational to never admit to updating, and it is also instrumentally rational to update to better arguments. They are exercising a rational have-your-cake-and-eat-it strategy.
Ishaan (2 points, 10y)
Why not both? (in which unintentional selective forgetfulness causes optimal status seeking behavior) Evolution makes creatures which are instrumentally rational to some extent, but they don't necessarily need to be instrumentally rational on purpose.
ChrisHallquist (0 points, 10y)
This is definitely A Thing that happens. But I still think it's more common for people to somehow rationalize not changing their mind, even in the long run.
hyporational (0 points, 10y)
I wasn't suggesting this is the only way people change their minds. I think we should concentrate on how to make it happen more often, not necessarily caring about people admitting they're wrong.

Magic: The Gathering analogy time!

Arguing is like getting to look at the top card of your deck and then put it on the bottom if you wish ("scrying for 1"). When you scry, what you want is to see that the top card of your deck is great and doesn't need to be bottomed. But in that case you gained very little from scrying - the actual value of scrying comes when your top card is terrible, and you need to get rid of it.

Which is to say that you don't want to lose an argument, but that losing and actually changing your mind is a major way of extracting value from them.

Leaving a line of retreat is standard LessWrong advice and seems to fit this theme well.

Defecting by accident seems to fit too. It's more about how to tell people they're wrong.

JQuinton (6 points, 10y)
Also related is keeping your identity small.

Another perspective on why it's hard to meaningfully win an argument: epistemic learned helplessness.* Most people, though perhaps not most people on this site, have known someone who could argue circles around them and "win" nearly any argument, to the point where "losing" an argument is so sure either way that it's not even evidence of being wrong. If the fact that I've "lost" an argument (been confronted with reasoning that I am unable to effectively refute on a conversational timescale) forces me to actually change my mind, I could end up believing anything.

Just because my argument for why I like ham sandwiches is full of holes doesn't mean I don't really like ham sandwiches.

*Edit: blog has been locked since I posted this comment. See archived version of this post at https://web.archive.org/web/20130114194332/http://squid314.livejournal.com/350090.html. Bare URL because when I put in the actual hyperlink, LW gets confused and inserts a "<" in the middle, breaking the link.

Error (3 points, 10y)
I've known someone like this. In other contexts, I've been someone like this. It's not pleasant.

Great post. This sort of perspective is something that I'd definitely like to see more of on LessWrong.

ChrisHallquist (9 points, 10y)
Thanks. You may be interested to know that I originally considered titling this post "Being Told You're Wrong Is the Mind-Killer."

Personally, I'm glad you decided not to.

I agree, mind-killer is too much of an applause light is an applause light these days.

Actually, the reason for that title was a point I decided to leave out, but may as well spell out here: "Deciding to talk about politics, even though this may cause you to lose some of your audience" and "Deciding to tell people they're wrong, even though this may cause you to lose some of your audience" are both tradeoffs, and it's odd that LessWrong community norms go so far in one direction on one tradeoff and so far in the other direction on the other tradeoff (at least with regards to certain subjects).

I suspect the reason for this mostly has to do with Eliezer thinking politics are not very important, but also thinking that, say, telling certain people their AI projects are dangerously stupid is very important. But not everyone agrees, and the anti-politics norm is itself a barrier to talking about how important politics are. (Personally, I suspect government action will be important for the future of AI in large part because I expect large organizations in general to be important for the future of AI.)

katydee (5 points, 10y)
Yeah, I saw the parallel there. I more or less think that both talking about politics and explicitly telling people that they're wrong are usually undesirable and that LessWrong should do neither. I also agree with you that government action could be important for the future of AI.
hyporational (3 points, 10y)
Telling people they are wrong is almost explicitly about rationality, but we should definitely think about how to do that. If I'm wrong, I want to know that and there's a clear benefit in people telling me that. I don't see any clear benefit in discussing politics here, so I'm not even sure what the tradeoff is. It's not that politics are not important, but that there's not much we can do about them. I'd be very interested in a post explaining why discussing politics is more important than other things, not why politics is important, for this rather small rationalist community. I'm not sure he has bluntly told that to anyone's face. I think he's saying these things to educate his audience, not to change his opponents' minds. This I might agree with but it doesn't justify talking about other political topics. This particular topic also wouldn't be a mind killer because it's not controversial here and any policies regarding it are still distant hypotheticals.
Vaniver (3 points, 10y)
Well...
hyporational (0 points, 10y)
I see. I'd rather suspect that person wasn't all that important, nor was the audience at that dinner party, but maybe that's just wishful thinking. I also suspect he's learned some social skills over the years.
Vaniver (0 points, 10y)
In the comments, he makes clear he held the "losing an argument is a good thing, it's on you if you fail to take advantage of it" position. He may no longer feel that way.
[anonymous] (0 points, 10y)
I see :)
christopherj (1 point, 10y)
I have had experience as a moderator at a science forum, and I can tell you that almost all of our moderating involved either A) the politics subforum, or B) indirect religious arguments, especially concerning evolution (the religion subforum was banned before my time due to impossibly high need for moderation). The rest was mostly the better trolls and people getting frustrated when someone wouldn't change their mind on an obvious thing. However, I must say I don't see how people can discuss rationality and how people fail at it without someone telling someone else that they're wrong. After all, the major aspect of rationality is distinguishing correct from incorrect. Incidentally, I've been really impressed at the quality of comments and users on this site. Consider what this user has observed about LW before you complain about how politics is not allowed.
AlexSchell (2 points, 10y)
I think you accidentally went up one meta level.
Vaniver (3 points, 10y)
If you read the "I agree" as sarcastic, then it looks like the right meta level. (I'm not sure it's a good thing I thought that was more plausible than the accident hypothesis when I first parsed that sentence.)
hyporational (0 points, 10y)
Not sarcasm, although now that you mentioned it I can definitely see it, just well intentioned humour. See the other comment :)
hyporational (1 point, 10y)
As I was writing the comment, I realized applause light is an applause light too, so I decided to make fun of that.

Like a lot of advice of this sort, this benefits from being flipped around: become able to lose arguments so you can learn from them (which is the real winning). I don't have much concrete advice on doing that, but I know it's possible because some people are more capable than others. Being surrounded by a community that respects mind-changing helps. Simply being aware of the problem also might. As might having something to protect (though probably only when you actually do).

Cyan (5 points, 10y)
One tactic I use to avoid argument is to make conditional claims that are correct as a matter of structure, without committing myself to any particular premises. I also use the following strategy for avoiding a certain kind of wrong statement (which seems to help me concede arguments): I try to avoid making statements about the way the world is; instead I make statements about my thoughts and beliefs about the way the world is. If something changes my mind, my previous statements remain correct as descriptions of what I thought at the time. The effect is that when I surrender to the truth, I say things like, "That prior belief of mine was wrong, and you've convinced me to discard it," but not things like, "That previous statement of mine was wrong."

You can’t win an argument

I'd add the caveat "it can't be known that you've won the argument". I've been in several conversations where I've got people to essentially argue away their entire position - but only as long as I don't point it out to them. As long as they can feel they haven't lost, they can end up with very different positions from their starting positions.

Firm endorsement of Carnegie, and firm endorsement of applying this rule basically everywhere. Even on Less Wrong, I do my best to clash with others' pride as little as possible. I've found the Socratic method to work fairly well. In my experience, it is most useful at finding your misunderstandings and preventing them from causing you trouble, which is well worth it.

I've had bad experiences using the Socratic method on people who are trying to win. I ask a question and they wander away from it to reiterate all of their points. And now I've used up my talking quota for a while.

On people who start out wanting to learn, it can be very effective.

christopherj (1 point, 10y)
If someone is trying to win an argument, the odds of your convincing them are very low regardless of how right you are. The point of arguing would be to make him lose, that is, to convince the audience and not the opponent. This is a big difference between internet discussions and in person discussions -- in real life the audience is often zero or small, and a person's wrongness will be quickly forgotten, but when someone is wrong on the internet, it is more important.
aletheianink (0 points, 10y)
I agree. I think most people just want to talk at you, not with you, when they're determined to win, and very few people would ever follow a conversation the way Socrates' opponents do in Plato's works.
hyporational (2 points, 10y)
Yes, it's generally good to try not to vocally disagree with something you don't understand. I've seen TheOtherDave skillfully apply the Socr... ehm, Miller's law many times here, also on me.
TheOtherDave (9 points, 10y)
I am often described as applying the Socratic method, and I understand why, and I don't strictly speaking disagree... but I find the phrase "Socratic method" problematically ambiguous between Vaniver's usage and Plato's. In Plato's dialogs, what Socrates is mostly doing is trying to lead his interlocutor into a contradiction, thereby demonstrating that the man is in fact ignorant. I do that sometimes, but what I'm usually trying to do is apply Miller's Law.
Vaniver (0 points, 10y)
I agree that this ambiguity exists, and dislike that it exists. Generally, when there's an ambiguity between ancient use and modern use, I go with the modern use, because moderns read what I write much more than ancients do. The phrase seems to have broadened to "lead by questioning," not necessarily to a contradiction, which even then isn't quite right, because I often want to lead them to a clear description of what they think, not someplace I've decided on. I probably ought to just call it "gentle questioning." (For example, I think "man" was a beautifully inclusive word- it originally meant "mind," and so meant basically "all sapient beings," and so things like "one giant leap for mankind" also includes any of our robotic descendants, say- but that was at a time when "male adult" was "wer" and "female adult" was "wif," and now that both of those are out of style "man" mostly means "male adult.")
TheOtherDave (4 points, 10y)
Absolutely agreed on all counts, but I find that the ancient (and currently mostly negative) usage of "Socratic method" is still alive enough in my social circle that it's worth taking into consideration.
hyporational (0 points, 10y)
Yes, well, this certainly isn't what I meant, so it's good that you brought this up. Miller's law describes you better. I suspect the Socratic method has invaded common language to the point that it has lost some of its original meaning.
TheOtherDave (0 points, 10y)
Absolutely. As I say, I find the term ambiguous, but I don't disagree with the claim. That said, I have often run into the problem of people getting extremely anxious when asked questions that don't quite fit into their script of how the conversation is supposed to go. Sometimes that's just because their beliefs are fragile and rigidly held, but sometimes it seems more that they're afraid I'm going to try to make them look/feel stupid. So it's something I try to stay aware of. EDIT: And I just noticed your edit, and was amused.

It had the probably unintended effect, though, of helping to give me a deep cynicism about human nature, a cynicism which persists to this day.

I've had similar experiences:

- People refusing to draw conclusions that cast them in a negative light, and directing sadness / anger / annoyance at me for being critical.

- People accepting conclusions that cast them in a negative light, and subsequently reacting with sadness and self-anger.

However, Take Caution - this can lead to the following problem: "Person X has problem A, but what's the point of telling t...

I still that the human need to think highly of ourselves is a far more important source of human rationality

Missing verb after still, and I also think rationality should be irrationality.

hairyfigment (2 points, 10y)
I also think "as if" should be "if (the other person has definitely made up their mind)."
ChrisHallquist (1 point, 10y)
Thanks to both. Fixed.

To your alternative approaches I would also add Bruce Schneier's advice in Cryptography Engineering, where he talks a little about the human element in dealing with clients. It's similar to the Socratic approach, in that you ask about a possible flaw rather than argue that it exists.

Bad: "That doesn't work. Someone can just replay the messages."

Good: "What defenses does this system have against replay attacks?"

I think this happens because it takes skill to accept being wrong. I know this has essentially been mentioned on LW before (my most recent reading was in MrMind's comment on the 5 Second Level), but I don't think most people have learnt that skill.

What we learn is that if we say "yes, I was wrong", others have then jumped on us, made fun of us or made an example of us - this starts when we're kids, or in school, where if we happen to be around teachers or parents with an inferiority complex, we've quickly learnt that it's better to be absolutely...

Ishaan (7 points, 10y)
Had an interesting experience once - I have a reputation among friends for refusing to concede anything in arguments. I was debating something with a friend, and she said something I hadn't considered, to which I responded: "Oh, really? Okay, I change my mind." Her response? "Wow...that wasn't nearly as satisfying as I thought it would be." Felt good. /bragging Moral of the story: The "winning-losing" social dynamic of admitting you were wrong is subverted if the other person perceives that you were not emotionally invested in your opinion... which is signaled by changing your mind quickly after the mind-changing argument or information is presented, and giving positive affect cues upon mind-change (similar to what you'd give after seeing the solution to a math problem).
hyporational (2 points, 10y)
I think this relates to the more general point that people usually find it more satisfying to get something they've worked for than to get things for free.
hyporational (1 point, 10y)
Admitting you're wrong is not necessary for changing your mind. I think they're two different skills. Upvoted for the first two thirds.
aletheianink (0 points, 10y)
Good point - I interchanged the two too readily.

I have never been able to get the Socratic Method to work on the Internet. In theory the Socratic Method is effective because the student has to reason their own way to the conclusion, and so they end up knowing it more deeply and thoroughly than if they were just told the conclusion by the teacher. But somehow it never works for me.

I think part of the problem is that the Socratic Method relies on the participants agreeing to take on the appropriate roles in the discussion. In particular, the "student" has to agree to play the role of the student...

FWIW, I've stopped using the Socratic Method, because, in my experience, it always elicits a strong and immediate negative reaction. People basically respond to it by saying, "Stop asking me questions to which you obviously already have all the answers; do you think I'm stupid and wouldn't notice?"

ChrisHallquist (1 point, 10y)
This is definitely a pitfall of the Socratic Method done badly. I suspect it really is possible to be more subtle about it, but that's tricky.

I generally agree (a lot) with this principle, especially during direct, in-person discussions. Though I still remain persuaded that there is a place for contradiction -- and even explicit ridicule of ideas in argument.

I'm thinking specifically of my experience with religion. You mentioned the example of lurkers being able to access direct arguments. For some large chunk of the fundamentalist theist lurker crowd out there, polite, Socratic-styled arguments against their religion may not do the trick. This, I think, is because (1) theists are super good a...

I listen to all these complaints about rudeness and intemperateness, and the opinion that I come to is that there is no polite way of asking somebody: have you considered the possibility that your entire life has been devoted to a delusion?

How about "Do you ever wonder if your entire life has been devoted to a delusion? It's a frightening possibility, and most horrifying of all, no matter what actually ends up being true, you have to agree that most people do this."

Admittedly, I suspect a lot of people would completely miss the point and tell ...

TheOtherDave (0 points, 10y)
Why? I'm not asking whether my entire life might have been devoted to a delusion... of course it might have been. I'm asking why I should be worried about it.
DanielLC (0 points, 10y)
I suppose I made an unfounded assumption here. Do you have something to protect? If not, I guess it doesn't really matter, although in that case your life can't really be said to be devoted to anything at all, so clearly it wasn't devoted to a delusion. If so, then you simply cannot afford to be wrong.
TheOtherDave (1 point, 10y)
Let's say I'm fiercely devoted to protecting X, to which my life is devoted. If you were suggesting that I ought to be carefully attending to things related to X and carefully deriving beliefs from those observations, I would agree with you. But no matter how assiduously I do that, a non-zero possibility will always exist that I am deluded about X. So I ask again: why should I worry about that possibility?
DanielLC (0 points, 10y)
Given that you are deluded, if you try to figure out how you might be deluded, you are more likely to end, or at least decrease, the delusion than if you do not. Ending the delusion will help you protect X. The possibility isn't just non-zero. It's significant. From an outside view, you probably are deluded. Should a Christian worry about being deluded? How, from the inside, could you tell yourself apart from them?
TheOtherDave (0 points, 10y)
Yes, I agree. By carefully attending to things related to whatever it is that is like Christianity to me in this example, and carefully deriving my beliefs from those observations. If I saw a Christian who was doing that, I would not encourage them to worry about being deluded; I would encourage them to keep doing what they're doing. And if I saw a Christian who was worried about being deluded but not attending to their environment, I would encourage them to worry less and pay more attention. And the same goes for a non-Christian.
[anonymous] (0 points, 10y)

How many people on LessWrong realize that when you tell someone their AI project is dangerously stupid, or that their favorite charity is a waste of money, you risk losing them forever—and not because of anything to do with the subtler human biases, but just because most people hate being told they're wrong?

Well, the problem is, these two specific examples simply are not true. Many charities are reasonably effective in their stated purpose, even if "effective altruism" believers would hold that they are strictly suboptimal in terms of hum...

[anonymous] (0 points, 10y)

Relevant other post: Defecting by Accident - A Flaw Common to Analytical People

whoops, redundant, sorry!

[This comment is no longer endorsed by its author]