
Lumifer comments on High Challenge - Less Wrong

22 Post author: Eliezer_Yudkowsky 19 December 2008 12:51AM



Comment author: Lumifer 03 November 2015 05:38:29PM 2 points

How can we build a society/world where there are strong optimization forces to enable people to choose System 2 preferences?

I think the real world qualifies quite well. People who listen to their System 2 achieve much more than people who are slaves to their System 1.

If you want stronger "optimization forces", take away the safety net. Hunger and pain are excellent incentives. Not many people would allow themselves to get addicted to WoW if it meant they'd become homeless in short order.

Comment author: MarsColony_in10years 03 November 2015 09:07:09PM 2 points

That provided me with some perspective. I'd only been thinking of cases where we impose limitations, such as those we use with alcohol and addictive drugs. But, as you point out, there are also regulations that push us toward immediate gratification rather than away from it. If, after much deliberation, we collectively decide that 99% of potential values are long term, then perhaps we'd wind up abolishing most or all such regulations, assuming that most System 2 values would benefit.

However, at least some System 2 values are likely orthogonal to these sorts of motivators. For instance, perhaps NaNoWriMo participation would go down in a world with fewer social and economic safety nets, since many people would be struggling up Maslow's hierarchy of needs instead of writing. I'm not sure how large a fraction of System 2 values would be aided by negative reinforcement. A large number of people would abandon their long-term goals in order to remove the negative stimuli ASAP. If the shortest path to removing the stimuli gets them 90% of the way toward a goal, then I'd expect most people to achieve the remaining 10%. However, for goals that are orthogonal to pain and hunger, we might actually expect a lower rate of achievement.

If descriptive ethics research shows that System 2 preferences dominate, and if the majority of that weighted value is held back by safety nets, then it'll be time to start cutting through red tape. If System 2 preferences dominate, and the majority of moral weight is supported by safety nets, then perhaps we need more cushions or even Basic Income. If our considered preference is actually to "live in the moment" (System 1 preferences dominate) then perhaps we should optimize for wireheading, or whatever that utopia would look like.

More likely, this is an overly simplified model, and there are other concerns that I'm not taking into account but which may dominate the calculation. I completely missed the libertarian perspective, after all.

Comment author: [deleted] 11 November 2015 12:27:02AM 0 points

If you want stronger "optimization forces", take away the safety net. Hunger and pain are excellent incentives.

Actual experiments in doing this have proven it to be extremely counterproductive. The more human effort has to be poured into avoiding hunger, homelessness, and basic pain, the less ends up available for pursuing "self-actualizing" goals, conforming to socially-approved-of lifestyles, or even increasing economic productivity.

If you have an intuition which tells you that punishing people makes them act smarter, it is wrong. Punishing people makes them spend mental effort on avoiding getting caught transgressing your norms when they could have spent that effort on something that was actually important.

Comment author: Lumifer 11 November 2015 04:45:26PM 2 points

the less ends up available for serving "self-actualizing" goals, conforming to socially-approved-of lifestyles,

LOL. For a lot of people "self-actualization" ends up as sitting on a couch in front of an idiot box, eating chips. Nowadays it might be in front of their FB feed, but that's essentially the same. And I'm not sure what "socially-approved-of lifestyles" are -- that seems to depend a lot on the society in question.

If you have an intuition which tells you that punishing people makes them act smarter, it is wrong.

No. My intuition is that the threat of pain/hunger/etc. makes people act. Incentives matter.

Comment author: [deleted] 13 November 2015 03:27:37AM -2 points

LOL. For a lot of people "self-actualization" ends up as sitting on a couch in front of an idiot box, eating chips. Nowadays it might be in front of their FB feed, but that's essentially the same. And I'm not sure what "socially-approved-of lifestyles" are -- that seems to depend a lot on the society in question.

Look, the mere fact that you condescend to and disapprove of the actions of others doesn't mean you've proposed any kind of alternative (no, survivalism does not count; that problem was already solved), let alone demonstrated a metric by which your non-proposed alternative is superior (not even the "I like it" metric).

No. My intuition is that the threat of pain/hunger/etc. makes people act. Incentives matter.

Now explain why those actions or incentives matter, that is, what makes them superior to alternatives. No, sneering does not count.

Comment author: ChristianKl 11 November 2015 06:28:05PM 1 point

I think the real world qualifies quite well. People who listen to their System 2 achieve much more than people who are slaves to their System 1.

I think people who mainly listen to System 2 frequently suffer from akrasia. Productive people usually feel motivated to do what they're doing, and that's System 1.