30-minute panel

The first question was why isn't everyone trying to change the world, with the underlying assumption that everyone should be. However, it isn't obviously the case that the world would be better if everyone were trying to change it. For one thing, trying to change the world mostly means trying to change other people. If everyone were trying to do it, this would be a huge drain on everyone's attention. In addition, some people are sufficiently mean and/or stupid that their efforts to change the world make things worse.

At the same time, some efforts to change the world are good, or at least plausible. Is there any way to improve the filter so that we get more ambition from benign people without just saying everyone should try to change the world, even if they're Osama bin Laden?

The discussion of why there's too much duplicated effort in science didn't bring up the problem of funding, which is probably another version of the problem of people not doing enough independent thinking.

There was some discussion of people getting too hooked on competition, which is a way of getting a lot of people pointed at the same goal. 

Link thanks to Clarity


(From Singularity Summit 2009; uploaded to YouTube in February 2012.)

The first question was why isn't everyone trying to change the world, with the underlying assumption that everyone should be.

Not everyone cares that much about "the world", and that's likely a good thing.

I would generalize the question, because the generalization applies broadly - whatever we profess to value, whatever we believe we value, why aren't we expending more time and energy actually creating that value?

One answer that pops to mind runs against encouraging people to "save the world" - caring about things beyond your control largely just prevents you from caring about, and thereby acting on, things within your control.

"I would generalize the question, because the generalization applies broadly - whatever we profess to value, whatever we believe we value, why aren't we expending more time and energy actually creating that value?"

Okay I'll bite.

I'm not going to the gym because of the tragedy of the commons? I really don't think that's it.

Well, what I was getting at is that you value whatever you're doing instead of going to the gym more than you value saving the world. People value their own expected utility more than they value societal expected utility, even though most individuals in society would be better off if everyone somehow valued societal expected utility. Seems reasonably likely to me, anyway.

Perhaps I misunderstood you; now it seems like you were saying "why do we hyperbolically discount?" or something to that effect.

Just wanted to mention that watching this panel was one of the things that convinced me to give AI safety research a try :) Thanks for re-posting, it's a good memory.

To at least try to address your question: one effect could be that there are coordination problems, where many people would be trying to "change the world" in roughly the same direction if they knew that other people would cooperate and work with them. This would result in less of the attention drain you suggest. This seems more like what I've experienced.

I'm more worried about people being stupid than mean, but that could be an effect of the bubble of non-mean people I'm part of.

Is there any way to improve the filter so that we get more ambition from benign people without just saying everyone should try to change the world, even if they're Osama bin Laden?

I think it's hard to think about people like Osama bin Laden because there's a lot of spin around him. It's worth reading Gwern's post about terrorism: http://www.gwern.net/Terrorism%20is%20not%20Effective . Osama bin Laden likely had tens of millions of dollars, but still invested only about $500k in 9/11, and didn't spend another $500k the next year to stage a similar attack.

Then it's worth thinking about psychopaths. They don't seem to be motivated by a long-term vision of having something to protect. Eliezer made that Harry's power that the Dark Lord has not. Finding something to care about seems to be part of the CFAR curriculum; it's not something a psychopath can simply do. Having social norms of high physical contact likely produces an environment in which psychopaths don't feel comfortable and would rather stay out.

Having social norms of high physical contact likely produces an environment in which psychopaths don't feel well

I'd like to have some source on this.

From what I've heard about psychopaths, they are uncomfortable with describing their feelings, because their feelings are different from those of a neurotypical person, and they don't want to be exposed. (Ironically, various group therapy sessions with neurotypical people in the group solve this problem for psychopaths, because they get enough data to better fake having the usual feelings.) But I haven't heard anything about physical contact.

I'd like to have some source on this.

I couldn't find good sources at the moment, so I opened a Stack Overflow question.

Ironically, various group therapy sessions with neurotypical people in the group solve this problem for psychopaths, because they get enough data so they can better fake having the usual feelings.

That's true for standard group therapy, where people just pay attention to the words that are spoken and how they are spoken. As a result, standard group therapy done in prisons seems to make outcomes worse for psychopaths rather than better.

On the other hand, it's very hard to fake emotions when hugging a perceptive person, because they can use their hands to feel what's going on inside your body. A person is effectively exposed in a good hug.

I haven't watched the video. Does being an effective altruist count as trying to change the world (say, into a less malarial version of itself)?

The answer isn't obvious, so I think you'd do better to listen to the panel than to rely on my interpretation.