How to understand people better

76 pwno 14 October 2011 07:53PM
I’ve been taking notes on how I empathize, considering I seem to be more successful at it than others. I broke down my thought-patterns, implied beliefs, and techniques, hoping to unveil the mechanism behind the magic. I shared my findings with a few friends and noticed something interesting: They were becoming noticeably better empathizers. 

I realized the route to improving one’s ability to understand what people feel and think is not a foreign one. Empathy is a skill; with some guidance and lots of practice, anyone can make drastic improvements. 

I want to impart the more fruitful methods/mind-sets and exercises I’ve collected over time. 

Working definitions:
Projection: The belief that others feel and think the same as you would under the same circumstances
Model: Belief or “map” that predicts and explains people’s behavior


Stop identifying as a non-empathizer

This is the first step towards empathizing better—or developing any skill for that matter. Negative self-fulfilling prophecies are very real and very avoidable. Brains are plastic; there’s no reason to believe an optimal path-to-improvement doesn’t exist for you. 

Not understanding people's behavior is your confusion, not theirs

When we learn our housemate spent 9 hours cleaning the house, we should blame our flawed map for being confused by their behavior. Maybe they’re deathly afraid of cockroaches and found a few that morning, maybe they’re passive-aggressively telling you to clean more, or maybe they just procrastinate by cleaning. Our model of the housemate has yet to account for these tendencies. 

Rationalist horoscopes: A low-hanging utility generator.

62 AdeleneDawner 22 May 2011 09:37AM

The other day, I had an idea. It occurred to me that daily horoscopes - the traditional kind - might not be as useless as they seem at first glance: They usually give, or at least hint at, suggestions for specific things to do on a given day, which can be a useful cue, allowing the user to put less effort into finding something useful to do with their time. They can also act as a reminder of important concepts, rather like spaced repetition, and have the possibility of serendipitously giving the perfect advice in a situation where the user would otherwise not have thought to apply a particular concept.

This seems like something that many people here would find useful, if the horoscopes weren't so vague, and if they were better calibrated to make useful suggestions. So, after getting some feedback, and with the help of PeerInfinity (who did most of the coding and is currently hosting the program), I put together a tool to provide us with a daily 'horoscope', chosen from a list provided by us and weighted toward advice that has been reported to work. The horoscopes are displayed here, with an RSS feed available here. Lists of the horoscopes in the program's database can be found here, with various sorting options.

One of the features of this program is that the chance of a given horoscope being displayed is affected by how well it has worked in the past. Every day, there is an option to vote on the previous day's horoscope, rating it as 'harmful', 'useless', 'sort of useful', 'useful', or 'awesome'. The 'harmful' and 'useless' options give the horoscope -15 and -1 points respectively, while the other three give it 1, 3, or 10 points. If a horoscope's score becomes negative, it is removed from the pool of active horoscopes; otherwise, its chance of being chosen is based on the average value of the votes it has received compared to the other horoscopes, disregarding recently-used ones.
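
A minimal Python sketch of how such vote-weighted selection could work - this is only my reconstruction of the rules described above, not the actual program's code; the function and variable names, and the default weight given to unrated horoscopes, are assumptions:

```python
import random

# Point values for each vote option, as described above.
VOTE_POINTS = {
    "harmful": -15,
    "useless": -1,
    "sort of useful": 1,
    "useful": 3,
    "awesome": 10,
}

def pick_horoscope(votes_by_horoscope, recently_used):
    """Pick today's horoscope, weighted by average vote score.

    votes_by_horoscope maps each horoscope's text to the list of votes
    it has received; negatively-scored and recently-used horoscopes are
    excluded from the pool. Assumes the remaining pool is non-empty.
    """
    weights = {}
    for text, votes in votes_by_horoscope.items():
        if text in recently_used:
            continue
        points = [VOTE_POINTS[v] for v in votes]
        if sum(points) < 0:
            continue  # dropped from the active pool
        # Chance of being chosen scales with the average vote value;
        # unrated horoscopes get a small default weight (an assumption).
        avg = (sum(points) / len(points)) if points else 1.0
        weights[text] = max(avg, 0.1)
    pool = list(weights)
    return random.choices(pool, weights=[weights[t] for t in pool], k=1)[0]
```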

There is still a need for good horoscopes to be added to the database. Horoscopes should offer a specific suggestion for something to do that will take less than an hour of sustained effort (all-day mindfulness-type exercises or 'be on the lookout for X' are fine) and that can be accomplished on the same day that the horoscope is read. Horoscopes should not make actual predictions, but may make prediction-like statements that are likely to be true on any given day, like "you will talk to a friend today". Horoscopes can be submitted here, or left in the comments. EDIT: Any comment anywhere on the site that contains the phrase "Horoscope version:" or "Horoscope:" should now automatically be emailed to me, so feel free to horoscope-ify new posts in their comments, unless this comes to be considered spam.

Religion's Claim to be Non-Disprovable

124 Eliezer_Yudkowsky 04 August 2007 03:21AM

The earliest account I know of a scientific experiment is, ironically, the story of Elijah and the priests of Baal.

The people of Israel are wavering between Jehovah and Baal, so Elijah announces that he will conduct an experiment to settle it - quite a novel concept in those days!  The priests of Baal will place their bull on an altar, and Elijah will place Jehovah's bull on an altar, but neither will be allowed to start the fire; whichever God is real will call down fire on His sacrifice.  The priests of Baal serve as control group for Elijah - the same wooden fuel, the same bull, and the same priests making invocations, but to a false god.  Then Elijah pours water on his altar - ruining the experimental symmetry, but this was back in the early days - to signify deliberate acceptance of the burden of proof, like needing a 0.05 significance level.  The fire comes down on Elijah's altar, which is the experimental observation. The watching people of Israel shout "The Lord is God!" - peer review.

And then the people haul the 450 priests of Baal down to the river Kishon and slit their throats.  This is stern, but necessary.  You must firmly discard the falsified hypothesis, and do so swiftly, before it can generate excuses to protect itself.  If the priests of Baal are allowed to survive, they will start babbling about how religion is a separate magisterium which can be neither proven nor disproven.


The Neglected Virtue of Scholarship

177 lukeprog 05 January 2011 07:22AM

Eliezer Yudkowsky identifies scholarship as one of the Twelve Virtues of Rationality:

Study many sciences and absorb their power as your own. Each field that you consume makes you larger... It is especially important to eat math and science which impinges upon rationality: Evolutionary psychology, heuristics and biases, social psychology, probability theory, decision theory. But these cannot be the only fields you study...

I think he's right, and I think scholarship doesn't get enough praise - even on Less Wrong, where it is regularly encouraged.

First, consider the evangelical atheist community to which I belong. There is a tendency for lay atheists to write "refutations" of theism without first doing a modicum of research on the current state of the arguments. This can get atheists into trouble when they go toe-to-toe with a theist who did do his homework. I'll share two examples:

  • In a debate with theist Bill Craig, agnostic Bart Ehrman paraphrased David Hume's argument that we can't demonstrate the occurrence of a miracle in the past. Craig responded with a PowerPoint slide showing Bayes' Theorem, and explained that Ehrman was only considering prior probabilities, when of course he needed to consider the relevant conditional probabilities as well. Ehrman failed to respond to this, and looked as though he had never seen Bayes' Theorem before. Had Ehrman practiced the virtue of scholarship on this issue, he might have noticed that much of the scholarly work on Hume's argument in the past two decades has involved Bayes' Theorem. He might also have discovered that the correct response to Craig's use of Bayes' Theorem can be found in pages 298-341 of J.H. Sobel’s Logic and Theism. (The relevant form of the theorem is written out just after this list.)

  • In another debate with Bill Craig, atheist Christopher Hitchens gave this objection: "Who designed the Designer? Don’t you run the risk… of asking 'Well, where does that come from? And where does that come from?' and running into an infinite regress?" But this is an elementary misunderstanding in philosophy of science. Why? Because every successful scientific explanation faces the exact same problem. It’s called the “why regress” because no matter what explanation is given of something, you can always still ask “Why?” Craig pointed this out and handily won that part of the debate. Had Hitchens had a passing understanding of science or explanation, he could have avoided looking foolish, and also spent more time on substantive objections to theism. (One can give a "Who made God?" objection to theism that has some meat, but that's not the one Hitchens gave. Hitchens' objection concerned an infinite regress of explanations, which is just as much a feature of science as it is of theism.)
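
For readers who have not seen the point spelled out: writing M for "the miracle occurred" and T for "this testimony exists", the posterior at issue is given by a standard statement of Bayes' Theorem (nothing here is specific to either debater):

\[
P(M \mid T) = \frac{P(T \mid M)\,P(M)}{P(T \mid M)\,P(M) + P(T \mid \neg M)\,P(\neg M)}
\]

Hume's worry lives in the tiny prior P(M); the rejoinder is that the posterior also depends on the likelihoods P(T | M) and P(T | ¬M), so a low prior by itself does not settle how much the testimony should move us.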

The lesson I take from these and a hundred other examples is to employ the rationality virtue of scholarship. Stand on the shoulders of giants. We don't each need to cut our own path into a subject right from the point of near-total ignorance. That's silly. Just catch the bus on the road of knowledge paved by hundreds of diligent workers before you, and get off somewhere near where the road finally fades into fresh jungle. Study enough to have a view of the current state of the debate so you don't waste your time on paths that have already dead-ended, or on arguments that have already been refuted. Catch up before you speak up.

This is why, in more than 1000 posts on my own blog, I've said almost nothing that is original. Most of my posts instead summarize what other experts have said, in an effort to bring myself and my readers up to the level of the current debate on a subject before we try to make new contributions to it.

The Less Wrong community is a particularly smart and well-read bunch, but of course it doesn't always embrace the virtue of scholarship.

Consider the field of formal epistemology, an entire branch of philosophy devoted to (1) mathematically formalizing concepts related to induction, belief, choice, and action, and (2) arguing about the foundations of probability, statistics, game theory, decision theory, and algorithmic learning theory. These are central discussion topics at Less Wrong, and yet my own experience suggests that most Less Wrong readers have never heard of the entire field, let alone read any works by formal epistemologists, such as In Defense of Objective Bayesianism by Jon Williamson or Bayesian Epistemology by Luc Bovens and Stephan Hartmann.


The Trolley Problem: Dodging moral questions

13 Desrtopa 05 December 2010 04:58AM

The trolley problem is one of the more famous thought experiments in moral philosophy, and studies by psychologists and anthropologists suggest that the response distributions to its major permutations remain roughly the same throughout all human cultures. Most people will permit pulling the lever to redirect the trolley so that it will kill one person rather than five, but will balk at pushing one fat person in front of the trolley to save the five if that is the only available way to stop it.

However, in informal settings, where the dilemma is posed by a peer rather than a teacher or researcher, it has been my observation that there is another major category which accounts for a significant proportion of respondents' answers. Rather than choosing to flip the switch, push the fat man, or remain passive, many people will reject the question outright. They will attack the improbability of the premise, attempt to invent third options, appeal to their emotional state in the provided scenario ("I would be too panicked to do anything"), or do some combination of the above, in order to opt out of answering the question on its own terms.


"Nahh, that wouldn't work"

63 lionhearted 28 November 2010 09:32PM

After having it recommended to me for the fifth time, I finally read through Harry Potter and the Methods of Rationality. It didn't seem like it'd be interesting to me, but I was really mistaken. It's fantastic.

One thing I noticed is that Harry threatens people a lot. My initial reaction was, "Nahh, that wouldn't work."

It wasn't to scrutinize my own experience. It wasn't to do a Google search to see whether there's literature available. It wasn't to ask a few friends what their experiences were like and compare them.

After further thought, I came to a realization - almost every time I've threatened someone (which is rarely), it's worked. Now, I'm kind of tempted to write that off as "well, I had the moral high ground in each of those cases" - but:

1. Harry usually or always has the moral high ground when he threatens people in MOR.

2. I don't have any personal anecdotes or data about threatening people from a non-moral high ground, but history provides a number of examples, and the threats often work.

This gets me to thinking - "Huh, why did I write that off so fast as not accurate?" And I think the answer is because I don't want the world to work like that. I don't want threatening people to be an effective way of communicating.

It's just... not a nice idea.

And then I stop, and think. The world is as it is, not as I think it ought to be.

And going further, this makes me consider all the times I've tried to explain something I understood to someone, but where they didn't like the answer. Saying things like, "People don't care about your product features, they care about what benefit they'll derive in their own life... your engineering here is impressive, but 99% of people don't care that you just did an amazing engineering feat for the first time in history if you can't explain the benefit to them."

Of course, highly technical people hate that, and tend not to adjust.

Or explaining to someone how clothing is a tool that changes people's perceptions of you, and how, by studying the basics of fashion and aesthetics, you can achieve more of your aims in life. Yes, it shouldn't be like that in an ideal world. But we're not in that ideal world - fashion and aesthetics matter, and people react to them.

I used to rebel against that until I wised up, studied a little fashion and aesthetics, and started dressing to produce outcomes. So I ask, what's my goal here? Okay, what kind of first impression furthers that goal? Okay, what kind of clothing helps make that first impression?

Then I wear that clothing.

And yet, when confronted with something I don't like - I dismiss it out of hand, without even considering my own past experiences. I think this is incredibly common. "Nahh, that wouldn't work" - because the person doesn't want to live in a world where it would work.

Defecting by Accident - A Flaw Common to Analytical People

86 lionhearted 01 December 2010 08:25AM

Related to: Rationalists Should Win, Why Our Kind Can't Cooperate, Can Humanism Match Religion's Output?, Humans Are Not Automatically Strategic, Paul Graham's "Why Nerds Are Unpopular"

The "Prisoner's Dilemma" refers to a game theory problem developed in the 1950's. Two prisoners are taken and interrogated separately. If either of them confesses and betrays the other person - "defecting" - they'll receive a reduced sentence, and their partner will get a greater sentence. However, if both defect, then they'll both receive higher sentences than if neither of them confessed.

This brings the prisoner to a strange problem. The best solution individually is to defect. But if both take the individually best solution, then they'll be worst off overall. This has wide-ranging implications for international relations, negotiation, politics, and many other fields.
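
To make that structure concrete, here is a minimal Python sketch of the payoff logic; the particular sentence lengths are illustrative assumptions, not numbers from the post:

```python
# Years in prison for (my_choice, partner_choice); lower is better.
# The specific numbers are made up, but they preserve the structure
# described above: defecting is individually best no matter what the
# partner does, yet mutual defection is worse than mutual silence.
SENTENCE = {
    ("cooperate", "cooperate"): 1,   # both stay silent
    ("cooperate", "defect"):    10,  # I stay silent, partner betrays me
    ("defect",    "cooperate"): 0,   # I betray, partner stays silent
    ("defect",    "defect"):    5,   # both betray
}

def best_response(partner_choice):
    """Return the choice that minimizes my own sentence, given the partner's choice."""
    return min(("cooperate", "defect"), key=lambda me: SENTENCE[(me, partner_choice)])

# Defection dominates individually...
assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"
# ...yet mutual defection (5 years each) is worse than mutual cooperation (1 year each).
assert SENTENCE[("defect", "defect")] > SENTENCE[("cooperate", "cooperate")]
```

The asserts capture the dilemma: whatever the partner does, defecting shaves years off your own sentence, yet if both prisoners follow that logic they each end up worse off than if both had stayed silent.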

Members of LessWrong are incredibly smart people who tend to like game theory, and debate and explore and try to understand problems like this. But, does knowing game theory actually make you more effective in real life?

I think the answer is yes, with a caveat - you need the basic social skills to implement your game theory solution. The worst-case scenario in an interrogation would be to "defect by accident" - meaning that you'd just blurt out something stupid because you didn't think it through before speaking. This might result in you and your partner both receiving higher sentences... a very bad situation. Game theory doesn't come into play until basic skill conditions are met - you have to be able to actually execute whatever plan you come up with.

The Purpose of This Post: I think many smart people "defect" by accident. I don't mean in serious situations like a police investigation. I mean in casual, everyday situations, where they tweak and upset people around them by accident, due to a lack of reflection on desired outcomes.

Rationalists should win. Defecting by accident frequently results in losing. Let's examine this phenomenon, and ideally work to improve it.

Contents Of This Post

  • I'll define "defecting by accident."
  • I'll explain a common outcome of defecting by accident.
  • I'll give some recent, mild examples of accidental defections.
  • I'll give examples of how to turn accidental defections into cooperation.
  • I'll give some examples of how this can make you more successful at your goals.
  • I'll list some books I recommend if you decide to learn more on the topic.

Reason as memetic immune disorder

215 PhilGoetz 19 September 2009 09:05PM

A prophet is without dishonor in his hometown

I'm reading the book "The Year of Living Biblically," by A. J. Jacobs.  He tried to follow all of the commandments in the Bible (Old and New Testaments) for one year.  He quickly found that

  • a lot of the rules in the Bible are impossible, illegal, or embarrassing to follow nowadays, like wearing tassels, tying your money to yourself, stoning adulterers, not eating fruit from a tree less than 5 years old, and not touching anything that a menstruating woman has touched; and
  • this didn't seem to bother more than a handful of the one-third to one-half of Americans who claim the Bible is the word of God.

You may have noticed that people who convert to religion after the age of 20 or so are generally more zealous than people who grew up with the same religion.  People who grow up with a religion learn how to cope with its more inconvenient parts by partitioning them off, rationalizing them away, or forgetting about them.  Religious communities actually protect their members from religion in one sense - they develop an unspoken consensus on which parts of their religion members can legitimately ignore.  New converts sometimes try to actually do what their religion tells them to do.

I remember many times growing up when missionaries described the crazy things their new converts in remote areas did on reading the Bible for the first time - they refused to be taught by female missionaries; they insisted on following Old Testament commandments; they decided that everyone in the village had to confess all of their sins against everyone else in the village; they prayed to God and assumed He would do what they asked; they believed the Christian God would cure their diseases.  We would always laugh a little at the naivete of these new converts; I could barely hear the tiny voice in my head saying but they're just believing that the Bible means what it says...

How do we explain the blindness of people to a religion they grew up with?


You cannot be mistaken about (not) wanting to wirehead

34 Kaj_Sotala 26 January 2010 12:06PM

In the comments of Welcome to Heaven, Wei Dai brings up the argument that even though we may not want to be wireheaded now, our wireheaded selves would probably prefer to be wireheaded. Therefore we might be mistaken about what we really want. (Correction: what Wei actually said was that an FAI might tell us that we would prefer to be wireheaded if we knew what it felt like, not that our wireheaded selves would prefer to be wireheaded.)

This is an argument I've heard frequently, one which I've even used myself. But I don't think it holds up. More generally, I don't think any argument that says people are wrong about what they want holds up.

To take the example of wireheading: it is not an inherent property of minds that they'll become desperately addicted to anything that feels sufficiently good. Even from our own experience, we know that there are plenty of things that feel really good, but we don't immediately crave more afterwards. Sex might be great, but you can still afterwards get fatigued enough that you want to rest; eating good food might be enjoyable, but at some point you get full. The classic counter-example is that of the rats who could pull a lever stimulating a part of their brain, and ended up compulsively pulling it, to the exclusion of all else. People took this to mean the rats were caught in a loop of stimulating their "pleasure center", but it later turned out that wasn't the case. Instead, the rats were stimulating their "wanting to seek things out" center.

The systems for experiencing pleasure and for wanting to seek out pleasure are separate ones. One can find something pleasurable, but still not develop a desire to seek it out. I'm sure all of you have had times when you haven't felt the urge to participate in a particular activity, even though you knew you'd enjoy the activity in question if you just got around to doing it. Conversely, one can also have a desire to seek out something, but still not find it pleasurable when it's achieved.

Therefore, it is not an inherent property of wireheading that we'd automatically end up wanting it. Sure, you could wirehead someone in such a way that the person stopped wanting anything else, but you could also wirehead them in such a way that they were indifferent to whether or not it continued. You could even wirehead them in such a way that they enjoyed every minute of it, but at the same time wanted it to stop.
