
Roland2 comments on High Challenge - Less Wrong

Post author: Eliezer_Yudkowsky 19 December 2008 12:51AM



Comment author: Roland2 19 December 2008 05:12:43AM 6 points

@ D. Alex: Some important reasons why the game is so pleasurable seem to be:

a) the ultimate goals are pretty clear (so unlike real life...)

b) the "measures of progress" are likewise clear

c) the rewards are clear

This looks like real life without the hard parts. Sure, that makes it more fun, but in the end will you feel rewarded? If you look back, now or in a few years, at the time spent playing and consider what you could have achieved if you had invested that time in real challenges, how will you feel? From my own experience I can tell you that I regret every minute I wasted playing stupid games. Nowadays I still play chess occasionally to relax, but I'm successfully getting rid of that habit. I avoid overly immersive/addictive games like the plague.

Comment author: MarsColony_in10years 03 November 2015 05:19:55PM 0 points

Sounds like WoW is optimized for System 1 pleasures, and you explicitly reject this. I think that brings up an important point: how can we build a society/world with strong optimization forces that enable people to choose their System 2 preferences? Once such a world had iterated on itself for a couple of generations, what might it look like?

I don't think this would be a world with no WoW-like activities, because a world without any candy or simple pleasures strikes me as deeply lacking. My System 2 seems to place at least a little value on System 1 being happy. So I'd guess the world would just have many fewer such activities, and be structured in such a way as to make it easy to avoid choices we'd regret the next day.

If this turns out to be a physically impossible problem to overcome for some reason, then I could imagine a world with no System 1 pleasures, but such a world would be deeply lacking, even if that loss were more than made up for by gains in our System 2 values.

As a side note, it would be an interesting question how much of the theoretical per-capita maximum value falls into each category. An easier question is how much of our currently actualized value comes from immediate gratification. I'd expect that to be heavily biased toward System 1, since we suffer from akrasia, but it might still be informative.

Comment author: Lumifer 03 November 2015 05:38:29PM 2 points

How can we build a society/world where there are strong optimization forces to enable people to choose System 2 preferences?

I think the real world qualifies quite well. People who listen to their System 2 achieve much more than people who are slaves to their System 1.

If you want stronger "optimization forces", take away the safety net. Hunger and pain are excellent incentives. Not many people would allow themselves to get addicted to WoW if it means they'll become homeless in a short while.

Comment author: MarsColony_in10years 03 November 2015 09:07:09PM 2 points

That provided me with some perspective. I'd only been thinking of cases where we impose limitations, such as those we use with alcohol and addictive drugs. But, as you point out, there are also regulations that push us toward immediate gratification rather than away from it. If, after much deliberation, we collectively decide that 99% of potential value is long-term, then perhaps we'd wind up abolishing most or all such regulations, assuming that most System 2 values would benefit.

However, at least some System 2 values are likely orthogonal to these sorts of motivators. For instance, perhaps NaNoWriMo participation would go down in a world with fewer social and economic safety nets, since many people would be struggling up Maslow's hierarchy of needs instead of writing. I'm not sure how large a fraction of System 2 values would be aided by negative reinforcement. A large number of people would abandon their long-term goals in order to remove the negative stimuli ASAP. If the shortest path to removing the stimuli gets them 90% of the way toward a goal, then I'd expect most people to achieve the remaining 10%. But for goals that are orthogonal to pain and hunger, we might actually expect a lower rate of achievement.

If descriptive-ethics research shows that System 2 preferences dominate, and the majority of that weighted value is held back by safety nets, then it'll be time to start cutting red tape. If System 2 preferences dominate and the majority of moral weight is supported by safety nets, then perhaps we need more cushions, or even a basic income. If our considered preference is actually to "live in the moment" (System 1 preferences dominate), then perhaps we should optimize for wireheading, or whatever that utopia would look like.

More likely, this is an overly simplified model, and there are other concerns that I'm not taking into account but which may dominate the calculation. I completely missed the libertarian perspective, after all.

Comment author: [deleted] 11 November 2015 12:27:02AM 0 points

If you want stronger "optimization forces", take away the safety net. Hunger and pain are excellent incentives.

Actual experiments in doing this have proven it to be extremely counterproductive. The more human effort that needs to be poured into avoiding hunger, homelessness, and basic pain, the less remains available for pursuing "self-actualizing" goals, conforming to socially-approved-of lifestyles, or even increasing economic productivity.

If you have an intuition which tells you that punishing people makes them act smarter, it is wrong. Punishing people makes them spend mental effort on avoiding getting caught transgressing your norms when they could have spent that effort on something that was actually important.

Comment author: Lumifer 11 November 2015 04:45:26PM 2 points

the less ends up available for serving "self-actualizing" goals, conforming to socially-approved-of lifestyles,

LOL. For a lot of people "self-actualization" ends up as sitting on a couch in front of an idiot box, eating chips. Nowadays it might be in front of their FB feed, but that's essentially the same. And I'm not sure what "socially-approved-of lifestyles" are -- that seems to depend a lot on the society in question.

If you have an intuition which tells you that punishing people makes them act smarter, it is wrong.

No. My intuition is that the threat of pain/hunger/etc. makes people act. Incentives matter.

Comment author: [deleted] 13 November 2015 03:27:37AM -2 points

LOL. For a lot of people "self-actualization" ends up as sitting on a couch in front of an idiot box, eating chips. Nowadays it might be in front of their FB feed, but that's essentially the same. And I'm not sure what "socially-approved-of lifestyles" are -- that seems to depend a lot on the society in question.

Look, the mere fact that you condescend to and disapprove of the actions of others doesn't mean you've proposed any kind of alternative (no, survivalism does not count; that problem was already solved), let alone demonstrated a metric by which your non-proposed alternative is superior (not even the "I like it" metric).

No. My intuition is that the threat of pain/hunger/etc. makes people act. Incentives matter.

Now explain why those actions or incentives matter, that is, what makes them superior to alternatives. No, sneering does not count.

Comment author: ChristianKl 11 November 2015 06:28:05PM 1 point

I think the real world qualifies quite well. People who listen to their System 2 achieve much more than people who are slaves to their System 1.

I think people who mainly listen to System 2 frequently suffer from akrasia. Productive people usually feel motivated to do what they are doing, and that's System 1.

Comment author: VoiceOfRa 10 November 2015 12:23:21AM 4 points

The biggest problem isn't System 1 dominating System 2. It's System 2 being filled with BS and falsehoods.

Comment author: MarsColony_in10years 10 November 2015 03:09:30PM 2 points

Excellent point. Most people aren't trying and failing to achieve their dreams. We aren't even trying. We don't have well-articulated dreams, so trying isn't even a reasonable course of action until we have a clear objective. I'd guess that most adults still don't know what they want to be when they grow up, and still haven't figured it out by the time they retire.

Comment author: Lumifer 10 November 2015 06:00:36PM 0 points

Most people aren't trying and failing to achieve their dreams. We aren't even trying. We don't have well-articulated dreams

Evidence or typical mind fallacy..? X-)

Comment author: MarsColony_in10years 10 November 2015 06:59:21PM 1 point

Guilty. I've spent most of my life trying to articulate and rigorously define what our goals should be. It takes an extra little bit of cognitive effort to model others as lacking that sense of purpose, rather than merely having lots of different well-defined goals.

(EDIT, to avoid talking past each other: Not that people don't have any well-defined sub-goals, mind you. Just not well-defined terminal values, or well-defined knowledge of their own utility function. No well-defined answers to Life, the Universe, and Everything.)

Comment author: Lumifer 10 November 2015 07:17:44PM 1 point

"Well-defined terminal values" are a very different thing from "well-articulated dreams".

P.S. People with "well-defined answers to Life, the Universe, and Everything" are usually pretty scary.

Comment author: VoiceOfRa 11 November 2015 12:08:11AM 2 points

Or have dreams that would be horrific if actually implemented because they haven't thought through the implications.