
Memetic Hazards in Videogames

Post author: jimrandomh, 10 September 2010 02:22AM (70 points)

Hello, player character, and welcome to the Mazes of Menace! Your goal is to get to the center and defeat the Big Bad. You know this is your goal because you received a message from a very authoritative source that said so. Alas, the maze is filled with guards and traps that make every step dangerous. You have reached an intersection, and there are two doors before you. Door A leads towards the center; it probably takes you to your destination. Door B leads away from the center; it could loop back, but it's probably a dead end. Which door do you choose?

The correct answer, and the answer which every habitual video game player will instinctively choose, is door B: the probable dead end. Because your goal is not to reach the end quickly, but to search as much of the maze's area as you can, and by RPG genre convention, dead ends come with treasure. Similarly, if you're on a quest to save the world, you do side-quests to put it off as long as possible, because you're optimizing for fraction-of-content-seen, rather than probability-world-is-saved, which is 1.0 from the very beginning.

If you optimize for one thing, while thinking that you're optimizing something else, then you may generate incorrect subgoals and heuristics. If seen clearly, the doors represent a trade-off between time spent and area explored. But what happens if that trade-off is never acknowledged, and you can't see the situation for what it really is? Then you're loading garbage into your goal system. I'm writing this because someone reported what looks like a video game heuristic leaking into the real world. While this hasn't been studied, it could plausibly be a common problem. Here are some of the common memetic hazards I've found in video games.

For most games, there's a guide that explains exactly how to complete your objective perfectly, but to read it would be cheating. Your goal is not to master the game, but to experience the process of mastering the game as laid out by the game's designers, without outside interference. In the real world, if there's a guide for a skill you want to learn, you read it.

Permanent choices can be chosen arbitrarily on a whim, or based solely on what you think best matches your style, and you don't need to research which is better. This is because in games, the classes, skills, races and alignments are meant to be balanced, so they're all close to equally good. Applying this reasoning to the real world would mean choosing a career without bothering to find out what sort of salary and lifestyle it supports; but things in the real world are almost never balanced in this sense. (Many people, in fact, do not do this research, which is why colleges turn out so many English majors.)

Tasks are arranged in order of difficulty, from easiest to hardest. If you try something and it's too hard, then you must have taken a wrong turn into an area you're not supposed to be in. When playing a game, level ten is harder than level nine, and a shortcut from level one to level ten is a bad idea. Reality is the opposite: most of the difficulty comes up front, and it gets easier as you learn. When writing a book, chapter ten is easier to write than chapter nine. Games teach us to expect an easy start and a tough finale; this makes the tough starts reality offers more discouraging.

You shouldn't save gold pieces, because they lose their value quickly to inflation as you level. Treating real-world currency that way would be irresponsible. You should collect junk, since even useless items can be sold to vendors for in-game money. In the real world, getting rid of junk costs money in effort and disposal fees instead.

These oddities are dangerous only when they are both confusing and unknown, and to illustrate the contrast, here is one more example. There are hordes of creatures that look just like humans, except that they attack on sight and have no moral significance. Objects which are not nailed down are unowned and may be claimed without legal repercussions, and homes which are not locked may be explored. But no one would ever confuse killing an NPC for real murder, nor clicking an item for larceny, nor exploring a level for burglary; these actions are so dissimilar that there is no possible confusion.

But remember that search is not like exploration, manuals are not cheats, careers are not balanced, difficulty is front-loaded, and dollars do not inflate like gold pieces. Because these distinctions are tricky, and failing to make them can have consequences.

Comments (155)

Comment author: Eliezer_Yudkowsky 10 September 2010 03:48:04AM 121 points

Have I ever remarked on how completely ridiculous it is to ask high school students to decide what they want to do with the rest of their lives and give them nearly no support in doing so?

Support like, say, spending a day apiece watching twenty different jobs and then another week at their top three choices, with salary charts and projections and probabilities of graduating that subject given their test scores? The more so considering this is a central allocation question for the entire economy?

Comment author: Will_Newsome 10 September 2010 10:12:48PM 21 points

At my high school the gifted program required a certain number of hours of internship at a company in the area, and indeed even those outside the gifted program were encouraged to meet with their counselors for advice on finding internships. 'Course, that program, along with AP classes, the arts, and half of science, was cut starting this school year. I think it's 'cuz Arizona realized that since they were already by far the worst state in the nation when it came to education they might as well heed the law of comparative advantage and allocate more resources to the harassment of cryonics institutions.

Comment author: dclayh 10 September 2010 10:57:40PM 11 points

Indeed, some of us spend 9 more years in school to postpone this decision. (In case you were wondering, it doesn't help.)

Comment author: JoshuaZ 10 September 2010 01:10:58PM 9 points

Have I ever remarked on how completely ridiculous it is to ask high school students to decide what they want to do with the rest of their lives and give them nearly no support in doing so?

Do we actually do that all that much? The vast majority of high school students when I was in high school had no idea what they wanted to do, and that was considered ok. Heck, a large fraction of people even when they were well into their undergraduate educations didn't know what they wanted to do and that was also considered ok. And as far as I can tell the general trend in high school education has been less emphasis on specific-job oriented classes as time has gone on.

Comment author: Relsqui 14 September 2010 10:32:32AM 1 point

The trend in reporting about education certainly seems to be that kids are being asked to specialize earlier and earlier--taking AP classes to prepare for majors, etc. Whether that corresponds to the actual advisement trends I couldn't tell you. I only went through it once.

Comment author: Morendil 10 September 2010 08:08:05AM 9 points

how completely ridiculous it is to ask high school students to decide what they want to do with the rest of their lives

One common answer to that is to become a dropout, try a career or two to find out where your talents really lie, and then go for that. You can usually go back to school for an education when you've figured out which one you need.

It doesn't even seem as if it would be very hard to build that right into the system. Doing it the artisanal way takes longer, generates more stress, loses more income.

Tentatively, thinking of my own experience, I'd point to the competitiveness of the system as the driving force. I had some smarts but school didn't suit me much. There were a bunch of things I was interested in - computers, AI, writing sci-fi, evolutionary biology - and I had no clear idea what I should do when I turned 18.

My parents' reasoning was "Most of your interests are scientific, so, the best way to keep your options open is to enrol in the top engineering schools, then you can have your pick of careers later". One problem with that is that these schools aren't a place for learning while you keep your options open. They are, basically, a sorting process, getting students to compete and ranking them so that they can eject the bottom tier, direct the middle tiers to various jobs and the top tier to yet another sorting process.

The material is taught more in video-game order than in the order which would optimize for deep comprehension - that's what turned me away from math. And only that material is taught which makes for an efficient sorting process.

Not that any of that is a new observation - "schools aren't about education".

Comment author: sixes_and_sevens 10 September 2010 11:26:29AM 6 points

From this point forward, I'm describing the past ten years of my life as "having taken the artisanal route".

Comment author: Aurini 10 September 2010 05:05:27PM 1 point

I just call myself an 'autodidact'.

Comment author: Relsqui 14 September 2010 10:47:01AM 2 points

One common answer to that is to become a dropout ... [and] go back to school for an education when you've figured which one you need.

Oh, hi. Didn't see you there describing my life. :)

Dropped out towards the end of high school, spent a lot of time unemployed or doing odd jobs, lived off other people, got sick of living off other people, and eventually woke up one morning and developed an idea about what I could do with my life that would fit my goals and suit what I'd learned about who I was (a picture which had changed a fair bit since high school). Long story short, I started college a few weeks ago. I'm trepidatious, because I haven't gotten along well with formal academics historically, but I've also never been there for me before. It's kind of a scary experiment, because I'm playing with real money (most of which isn't mine), but that's also an added incentive not to fail.

(The education I turned out to need to do what I want--if I've planned this out well--turns out to be in communications/language/linguistics. If I'd gone to college right after high school, I would probably have ended up in English or computer science.)

To her credit, the college counselor at my high school (in a mandatory appointment before I dropped out) recommended that I take some time off, travel, and work before deciding if I wanted to go to college. I guess it was pretty clear from my record that putting me right back into a classroom the following fall wasn't going to be very productive.

Comment author: Sniffnoy 10 September 2010 07:22:38PM 0 points

The material is taught more in video-game order than in the order which would optimize for deep comprehension - that's what turned me away from math.

Can you explain what you mean by this?

Comment author: Morendil 10 September 2010 07:43:20PM 6 points

By "video-game" order I mean an order which makes the material increasingly challenging, as opposed to making it increasingly easy because each new piece is built on more solid foundations.

For instance (as I dimly remember it), calculus was introduced as a collection of rules, of "things to memorize", rather than worked out from axiomatic principles. It was only later (and as an elective class) that I was introduced to non-standard analysis which provides a rigorous treatment of infinitesimals.

This may be a limitation of mine, but I can only approach math the way I approach coding - I have to know how each layer of abstraction is built atop the underlying one, I'm unable to accept things "on faith" and build upwards from something I don't understand deeply. I can't work with expositions that go "now here we need a crucial result that we cannot prove for now, you'll see the proof next year, but we're going to use this all through this year".

Comment author: DanielLC 12 September 2010 03:23:25AM 5 points

Calculus is built on limits, not infinitesimals. At least, that's how it's normally defined. They both work, and neither was understood when calculus was discovered.
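For readers curious about the contrast, here are the two definitions side by side - a standard textbook formulation, not anything specific to the courses discussed above:

```latex
% Limit-based definition of the derivative (standard analysis):
f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}

% Infinitesimal definition (non-standard analysis), where \varepsilon is a
% nonzero infinitesimal and \operatorname{st} takes the standard part:
f'(x) = \operatorname{st}\!\left( \frac{f(x+\varepsilon) - f(x)}{\varepsilon} \right)
```

Both definitions agree on standard functions, and the historical point stands: rigorous limits (Cauchy and Weierstrass) and rigorous infinitesimals (Robinson's non-standard analysis) both came long after Newton and Leibniz.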

I think most people are fine using the tools without understanding the rules, and find that easier than learning the rules. Schools are built to teach the way that the majority learns best, as it's better than teaching the way that the minority learns best.

Comment author: NihilCredo 10 September 2010 10:57:48AM 4 points

My high school used to organise a Saturday every year when they would invite their old alumni to come and tell any interested students about their academic and/or job experience. Lots of people would come, since it was a good chance to catch up and have a free quality lunch with their old friends and teachers (many of whom were friends too - it was a small, quality school).

The logistics and self-selection effect meant there was an overrepresentation of younger people who still lived in the area (usually working engineering, office, or teaching jobs), but it was still an extremely useful experience.

Comment author: patrissimo 12 September 2010 04:30:03AM 10 points

One possible solution is to have education financed by equity rather than loans: the third party who pays for your education does so in return for some share of future income. Besides the obvious effect of funding profitable education, this has the totally awesome side-effect of giving an organization a great incentive to figure out exactly how much each person's income will be increased by each job - which includes predicting salary, probability of graduating, future macro trends, etc.

The third party wouldn't have much incentive to predict what jobs will be most fun (only whether you will hate it so much you quit), but at least a big chunk of the problem would be solved. Personally I think the solution would involve "higher education is rarely worth it", and direct people towards vocational training or just getting a damn job. But I could be wrong - the great thing about a mechanism is that I don't have to be right about the results to know that it would make things more efficient :).
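To make the incentive concrete, here is a rough back-of-the-envelope sketch of the calculation such a funder would need to get right. All the numbers and the function name are hypothetical, purely for illustration:

```python
# Hypothetical illustration: what share of a graduate's income *uplift*
# would an equity-style education financier need to break even?

def breakeven_share(cost, uplift_per_year, years, discount_rate):
    """Share of the annual income uplift needed to recoup `cost`,
    discounting each year's payment back to the present."""
    pv_uplift = sum(uplift_per_year / (1 + discount_rate) ** t
                    for t in range(1, years + 1))
    return cost / pv_uplift

# E.g. a $40k education that raises income by $10k/yr for 20 years,
# discounted at 5%/yr, needs roughly a third of the uplift:
share = breakeven_share(40_000, 10_000, 20, 0.05)
print(f"{share:.1%}")
```

The point is that every input here - the income uplift, the graduation probability folded into it, the discount rate - is exactly the kind of quantity the funder now has a direct financial stake in estimating accurately.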

Comment author: RobinHanson 10 September 2010 10:05:59PM 3 points

I suspect one reason for this is that many people hope to steer their kids into good career deals they understand especially well. Official competent training in who should pursue which careers threatens to eliminate this advantage.

Comment author: sark 12 September 2010 05:47:24AM 1 point

Why won't parents trust the recommendations of official competent training, if it had a good track record?

Comment author: Baughn 24 September 2010 12:17:43PM 2 points

Being a public good, it's quite likely to be biased towards what's good for society more than what's good for the individual kid. More so than the parents' advice would be, at any rate.

On the other hand, I'd expect to see them happy to have other people's kids steered in this manner.

Comment author: Kingreaper 14 December 2010 06:57:35PM 1 point

If you know that a career is underfilled and overpaid, you can get your kid a job there.

If there's official competent training, then more people will be directed at the job, and the pay:effort disparity will disappear.

Comment author: sark 14 December 2010 07:17:47PM 0 points

So parents can potentially do better. In the cases where the jobs they understand well are not underfilled or overpaid, shouldn't they trust official recommendations? Possibly parents could know enough relatives/friends to hear of at least one underfilled/overpaid job to make such cases rare.

Comment author: JulianMorrison 10 September 2010 02:16:39PM 8 points

What we really need is a "brain plasticity in adulthood" pill. Because really the only reason we force these impossible choices on teens is that we're racing against their diminishing ability to learn.

Comment author: sixes_and_sevens 10 September 2010 02:51:16PM 15 points

This argument may hold for things like languages or thinking habits, or other skills that take root early, but having tackled an undergrad maths syllabus at both ages 18 and 28, I've found an adult work ethic beats the pants off youthful 'plasticity' any day of the week. Any skillset mandatory to a specialised vocation will probably mostly be learned well into adulthood anyway.

Comment author: Spurlock 10 September 2010 02:57:56PM 7 points

Why can't we have both? A plasticity pill wouldn't inherently destroy your work ethic. Having this useful ability, without the crippling shortcomings of youth (mostly various forms of inexperience, not to mention developmental/hormonal distractions) would be one hell of a combination.

Comment author: sixes_and_sevens 10 September 2010 03:34:27PM 8 points

Well, at the moment we can't have both because brain plasticity pills don't currently exist. If someone asked me tomorrow to optimise the education system, "educate people at the point in their lives when that education would be most useful to them" would come considerably higher up the list than "invent brain plasticity pill".

Comment author: JulianMorrison 10 September 2010 03:49:12PM 0 points

The win from skilled use of childhood plasticity maxes out at around 15 well-filled years of highly plastic learning. The win from a pill maxes out at a lifetime thereof. So if a pill were close to technologically plausible, it would be a much better use of effort.

Comment author: sixes_and_sevens 10 September 2010 04:22:04PM 3 points

Assuming it's possible to get the 'plasticity' gains without a significant trade-off. Childhood brains are so flexible because they're still developing; accordingly, they don't have a fully developed set of cognitive skills.

By way of analogy, concrete is very flexible in its infancy and very rigid in its adulthood. The usefulness it possesses when rigid is based on how well its flexibility is utilised early on. If you come up with a method to fine-tune the superstructure of a building on the fly later on in its lifetime, cool beans. If all you come up with is a way to revert the whole thing to unset concrete, I'd rather focus on getting the building right first time.

Comment author: JulianMorrison 10 September 2010 04:43:24PM 3 points

Childhood brains are so flexible because they're still developing

Hmm. I don't trust that. It sounds too much like a just-so story.

What I know is that most species have a learning-filled childhood followed by an adulthood with little to learn.

I also know that evolution hates waste - it will turn a feature off if it isn't used. So if anything the relatively high human ability to learn in adulthood looks to me like neoteny.

Concrete is a poor analogy - rigidity is not an advantage to adult humans!

Comment author: b1shop 11 September 2010 12:05:32AM 1 point

I think rigidity fits well into the Aristotelian framework.

Too rigid and you hold fast to wrong ideas. Too plastic and you waste mental effort challenging truths that should have been established.

Yes, we don't want to be too rigid in our beliefs, but there's a high opportunity cost to thought. I've run into too many hippies who are "open-minded" about whether or not 1=1. We have to internalize some beliefs as true to focus on other things.

I worry some in this community are so used to getting others to reconsider false beliefs that they forget there's sometimes a good reason to have rigid beliefs. Reversed stupidity is not intelligence.

Comment author: army1987 22 July 2012 11:16:18AM 2 points

By the way, if I recall correctly, in the proverb "a rolling stone gathers no moss", moss was originally intended to be a good thing, but most people now take it to be a bad thing.

Comment author: sixes_and_sevens 10 September 2010 06:57:08PM -2 points

In what way does it sound like a just-so story?

Re: rigidity and humans, I suspect you would find it very difficult if you continued to adjust your speech patterns to accommodate every irregular use of the English language you'd heard since the day you were born. Your ability to rapidly learn language stopped for a reason. In that sense, rigidity is pretty advantageous.

Comment author: komponisto 11 September 2010 04:24:23AM 11 points

I suspect you would find it very difficult if you continued to adjust your speech patterns to accommodate every irregular use of the English language you'd heard since the day you were born. Your ability to rapidly learn language stopped for a reason.

I'm tempted to call this a just-not-so story.

Not only do I disagree with the general point (about "rigidity" being advantageous), but my sense is that language is probably one of the worst examples you could have used to support this position.

It strikes me as wrong on at least 4 different levels, which I shall list in increasing order of importance:

(1) I don't think it would be particularly difficult at all. (I.e. I see no advantage in the loss of linguistic ability.)

(2) People probably do continue to adjust their speech patterns throughout their lives.

(3) Children do not "accommodate every irregular use [they have] heard since the day [they] were born". Instead, their language use develops according to systematic rules.

(4) There is a strong prior against the loss of an ability being an adaptation -- by default, a better explanation is that there was insufficient selection pressure for the ability to be maintained (since abilities are usually costly).

So, unless you're basing this on large amounts of data that I don't know about, I feel obliged to wag my finger here.

Comment author: TheOtherDave 14 December 2010 06:03:49PM 2 points

Come to think of it, beating the pants off youthful plasticity accounted for why I didn't do a lot of studying in college.

More seriously: yeah, IME the idea that 18-year-olds are more able to learn than 30-year-olds is mostly a socially constructed self-fulfilling prophecy.

Comment author: Kingreaper 14 December 2010 06:50:58PM 4 points

I often think that more of pre-adult education should be about teaching people how to put effort into things, and a good work ethic, rather than just facts.

Comment author: James_K 10 September 2010 09:09:44AM 15 points

It's interesting to see what happens when videogames behave more like real life. For instance, in Oblivion (and Fallout 3), you can't just take things unless you're in the middle of nowhere. If someone sees you, they cry out "stop, thief!". Equally, attacking people who didn't attack you first in civilised areas will draw the guard or vigilantes down on your head, and most of the stuff you find lying around is worthless trash that isn't worth the effort to haul away and sell.

I remember how jarring it was when I first tried to take something in Oblivion, only for a bystander to call for the guard. And then I realised that this is how NPCs should react to casual theft.

Comment author: SilasBarta 10 September 2010 07:56:09PM 5 points

Is that how people normally react in real life? I would think people tend to be apathetic bystanders, or might think you were picking up something of your own.

If someone creates a real life simulator where you can repeatedly practice your crimes and learn what the actual responses would be ... God help us all.

(I mean God in the secular sense.)

Comment author: Alexei 10 September 2010 09:56:55PM 4 points

An interesting article on stealing bicycles here.

Comment author: James_K 11 September 2010 01:14:49AM 3 points

In Oblivion, the settlements you are in are village-sized. They would be close-knit communities in which you are a stranger. Also, we're not talking about picking things up off the street; picking flowers or herbs, for instance, was OK, because outdoor plants generally weren't flagged as owned. Things you might want to pick up were generally indoors and often within sight of the person who owned them.

Comment author: thomblake 10 September 2010 03:02:47PM 5 points

Yes, this is following the tradition of the Ultima series, wherein Lord British originally introduced those sorts of mechanics specifically out of concern for the effects video games might have on the character and habits of the players.

Comment author: CronoDAS 10 September 2010 10:37:20PM 2 points

The Kleptomaniac Hero is very common in video games. If the game lets you take it, you probably should - and you can take a lot of stuff from random people's houses and such, while the people who actually own it stand there doing nothing.

Comment author: derefr 11 September 2010 08:42:53AM 4 points

This is an example of Conservation of Detail, which is just another way to say that the contrapositive of your statement is true: if you don't need to take something in a game, then the designer won't have bothered to make it take-able (or even to include it.)

I always assume that there's all sorts of stuff lying around in an RPG house that you can't see, because your viewpoint character doesn't bother to take notice of it. It might just be because it's irrelevant, but it might also be for ethical reasons: your viewpoint character only "reports" things to you that his system of belief allows him to act upon.

Comment author: SilasBarta 10 September 2010 10:58:30PM 1 point

I want to see a game where people react normally to this kind of thing ... maybe even have the police increase their watches for thieves as more burglaries happen.

Comment author: CronoDAS 11 September 2010 12:44:46AM 3 points

In the original Baldur's Gate, you'd get in trouble if any NPC saw you stealing something. And, perhaps unfortunately, "any NPC" included cats and other animals.

Comment author: Konkvistador 10 September 2010 07:52:22PM 2 points

This seems to depend of the kind of games one plays.

NPCs noticing theft was the default for most of my gaming experience (the first RPG I played was Gothic 2), so I am disappointed by anything less.

Comment author: JoshuaZ 10 September 2010 01:23:25PM 10 points

One thing this essay does not address is whether humans actually are likely to learn heuristics from playing videogames or whether a large enough fraction of the population plays videogames for this to be a real concern.

Let's briefly address that: there's a fair bit of evidence that much of "play" behavior across a wide variety of species exists specifically to learn behaviors and rules for actual life events. For example, wolf cubs engage in mock fights which prepare them for more serious encounters. Some species of corvids (crows, ravens, jays, etc.) will actively play with the large predators in their area, pecking at their tails or dropping objects on their faces, in an apparent attempt to learn the predators' general behavior - important because these corvids get much of their food from scavenging. It is likely that humans engage in play behavior in part for similar reasons. If so, there's a real danger of people learning bad heuristics from videogames.

What percentage of the population plays videogames? A quick Google search turns up various numbers which disagree, but they seem to vary from around a third to slightly over half. See for example here. Given that, this seems like a common enough issue to be worth discussing.

Comment author: RolfAndreassen 10 September 2010 09:46:16PM 3 points

Is it obvious that a videogame is enough like the play a human child would do in the ancestral environment that it will activate the learning-by-play circuits? Our enjoyment does not imply that it is play in the sense our learning circuits recognise.

Comment author: datadataeverywhere 10 September 2010 10:14:45PM 6 points

Play is about learning. Even games that we don't think of as teaching us anything are fundamentally tied into our learning circuits. Even games as mindless as solitaire (Klondike) activate our learning circuitry regardless of whether or not we actually develop any skills by playing them---like an artificial neural network continuing to train past the point of usefulness, fluctuating around its plateau with every new example it sees.

One of the most difficult aspects of video game design is scheduling difficulty increases; ideally, a game gets harder at the same pace that the gamer gets better, because getting better feels good. Engaging those learning circuits is one of the primary reasons games are fun in the first place.

The real question to ask is whether learning bad habits in video games translates even a little bit to real life. This is an age-old debate, most frequently brought up when somebody claims that playing violent first-person shooters turns innocent children into mass-murdering psychopaths.

Comment author: patrissimo 12 September 2010 04:51:13AM 4 points

But we learn by working too - especially if work is somewhat playful. Yes, videogames teach us, but they teach us while producing absolutely nothing, whereas work may be less fun but actually produces value (makes the world better, you get paid rather than paying, etc.). And what you learn at work (or any productive enterprise - maybe it's an art project for Burning Man, or a self-improvement project) is much more likely to be what you need to know for future work. Whereas what you learn in a game may happen to be useful later, but also may not.

Playey Work >> Worky Play

Comment author: datadataeverywhere 12 September 2010 05:24:16AM 8 points

I agree with everything you said. We should be especially cautious about playing so-called casual games, where the modal player very quickly reaches a plateau beyond which he or she will learn nothing, much less learn anything useful.

The difference of course is that the learning process in Real Life is slooooow. In game X, after 30 hours of play, the modal player may be one or two orders of magnitude better at a given skill (one that is at least somewhat unique to the game) than someone who has been playing for two hours. Some games (e.g., some first-person shooters) require no unique skills; I suspect the skill curve looks similar, but most players are hundreds of hours along it rather than just a few, so the curve appears flatter and the differences proportionally smaller.

Contrast that to life: in mine, the skills I am trying to cultivate are the same ones that I've been trying to cultivate for years, and my improvement is sometimes so glacial that I feel doubt as to whether I'm getting better at all. I could just be thousands of hours along similarly shaped curves, but I have certainly reached the point where I no longer see incremental improvement: all I see anymore are occasional insights that discretely improve my abilities.

So, Playey Work is much better for us and for the world at large than Worky Play. The difficulty comes in valuing that value: truly internalizing the need to produce and be valued, so that it overwhelms the additional fun that video games offer and will always be able to offer.

I earnestly want to know how to do that. If there's an optimization that could be applied to humans, I think that eliminating cognitive biases would do less to improve the human condition than increasing the weight we each internally attach to producing value for others. For just me, if I could do that, through meditation or cognitive behavior therapy or hypnosis or whatever, I would finish my dissertation right quick and just as quickly move on to doing the things that I know I want to do, but right now want less than to spend a dozen hours playing another video game.

Comment author: magfrump 10 September 2010 10:40:28AM 8 points [-]

As I was reading this, I realized that many of the points here apply heartily to single-player games, but the reverse is often true of MMOs.

A while back I spent a few years playing World of Warcraft, and ended up doing mid to high level raids.

When leveling, or completing a raid, you do know your purpose, and it is handed down from on high. This is unrealistic, but possibly one of the most relaxing aspects of escapism.

You DO NOT delay or take forever! While leveling or raiding, it is important to do things efficiently to meet your goals quickly. You want to hit max level ASAP, not see the whole low-level world; you want to see the whole high-level world.

When leveling or raiding, there is usually a specific build that is more powerful than the others. You have choices between various builds, but never more than 3 per character class, and usually the 3 are vastly different and you must choose one of them. For example, every rogue ever would take a talent that gives them +5 attack speed, but taking a bonus to speed while in stealth would get you kicked out of a hardcore guild.

In raiding, the difficulty isn't (strictly) progressive. Some fights are easier, some are harder. Some are more gear-dependent (i.e. harder until later in the game) but some of the fights in the highest-level raids were easier than some in the lowest level raids. It's also true that the difficulty mostly comes in learning how to do things the first time; after that it's easy to breeze through the content and most of the difficulty is in managing the guild.

Guides and patch notes are REQUIRED reading in most multiplayer games, because you have an obligation to support your team and defeat your opponents.

In terms of modeling the real world, MMOs are much better. On the other hand, they can be less fun. I've also experienced the multiplayer attitude leaking into other games (specifically D&D) and ruining the setup. I guess the lesson here may be that people are bad at changing heuristics when they change settings, in total agreement with the OP.

Comment author: sixes_and_sevens 10 September 2010 12:58:30PM 10 points [-]

I don't play a lot of video games, but I'm quite fond of strategy, and have recently become besotted with Starcraft 2. Something that struck me while looking through the online strategy community was how ruthlessly empirical that community was.

It shouldn't be too surprising. Players are provided with an environment governed by the immutable laws of the game engine, and the only objective is to win. You can accomplish this however you like: efficient micromanagement, complementary unit selection, economic superiority, stealth tactics, mind games, aggressive map dominance, off-the-wall strategy your opponent can't plan for...however you manage it, provided you're the one left standing at the end, you win.

As a result, players continually test the behaviour of the environment, see what works and throw away what doesn't. This often involves setting up contrived scenarios to explicitly test certain theories. The result is a massive body of knowledge on how to effectively win the game.

I would say that it's kind of heartening to find that when given proper incentive, even people with (presumably) no formal scientific training can apply systematic methods for testing the behaviour of their environment, but I don't know what kind of crossover exists between the SC/scientific community.

Comment author: NihilCredo 10 September 2010 07:26:09PM 6 points [-]

Here's a full research paper on the subject.

It very thoroughly data-mines a strategy WoW forum and observes how their discussion about the hidden variables of the game world mirrors quite clearly the scientific study of the natural world.

A: "I suggest that the Octopus Lord is vulnerable to fire." B: "Good idea! I have a fire sword and an identical ice sword, I'm off to try." C: "Wait! Maybe he's just resistant to ice! We need to design a better test."

It's actually a lot more complex than that (there's a HUGE spreadsheet quoted in the paper), but you get the idea.
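B's proposed fire-vs-ice comparison, and C's objection, amount to experimental design with and without a control condition. As a toy illustration (every mechanic, name, and number below is invented, not taken from the paper or from any real game), the test could be simulated like this:

```python
import random

# Hypothetical hidden boss resistances: the "natural laws" players reverse-engineer.
RESISTANCE = {"fire": 0.0, "ice": 0.5, "physical": 0.25}

def damage_roll(element, rng):
    """One swing: a base roll reduced by the boss's hidden resistance."""
    base = rng.uniform(80, 120)
    return base * (1 - RESISTANCE[element])

def mean_damage(element, trials=10_000, seed=0):
    """Average damage over many swings with a fixed seed, for a fair comparison."""
    rng = random.Random(seed)
    return sum(damage_roll(element, rng) for _ in range(trials)) / trials

# The better test: compare fire and ice against a neutral control weapon,
# so "vulnerable to fire" can be distinguished from "resistant to ice".
fire, ice, control = (mean_damage(e) for e in ("fire", "ice", "physical"))
print(f"fire={fire:.1f} ice={ice:.1f} control={control:.1f}")
```

With these made-up resistances, comparing fire and ice alone would only show a difference; the neutral control is what tells "fire is strong" apart from "ice is weak", which is exactly C's point.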

Reverse engineering isn't quite the same as science (because you know from the start that all natural laws must be traceable back to a short piece of human-written code), but they are definitely kin.

Comment author: Zvi 12 September 2010 12:00:03AM 2 points [-]

Why is the goal handed down from on high? I don't think even this break from reality is true in an MMO.

If we mean that the game is telling you what to do, what you have are various NPC questgivers (employers) who are hiring heroes (players) for various jobs and offering various rewards. Then each group of players (guild, group, etc) decides together which of these jobs they want to accept. Alternatively, there are places you can go with things to be accomplished. This isn't that different from freelance work.

Even when there is a central overriding goal, you are still free to ignore it and set your own goals.

If we mean that the guild is handing down the goal from on high, well, that's highly realistic: Your boss is telling his workers what to do. You don't like it, choose new leadership or quit.

Comment author: magfrump 12 September 2010 08:05:13AM 2 points [-]

I mean that your goals are extremely concrete, and their value is extremely concrete. "Kill this many boars and you will gain this many experience points." "Earn this many experience points and you will gain a level."

My conception of the real world is that goals tend to be vague ("figure out and fulfill my own utility function") and subgoals tend to be unpredictable (will keeping a diary help? A food diary? research on the internet? Spending time with friends? What balance between "figure out" and "fulfill"?)

It is true that the system is MORE liquid than in most single player RPGs, where it is not uncommon to encounter a narrator saying something like "monsters are everywhere! Our hero sets out to defeat them all!" Which is on a bit of a different level.

Comment author: hegemonicon 10 September 2010 12:54:04PM 21 points [-]

Here's one that's particularly sinister, and shows up in nearly every RPG and MMORPG:

Progress is tied to presence, not performance.

In these games, as long as you're there, in front of the screen, you're making progress. Skill is almost never involved in success - if you can't beat that boss you just need to go kill a few thousand boars and level up a little bit, and if you want that sweet equipment you just have to put in the hours to grind for gold, or faction, or honor, or whatever.

In the real world, getting better at something generally takes actual work, and only occurs under specific conditions of deliberate practice and proper feedback. But it's so easy to fall into the trap of "hey, I'm doing something tangentially related to goal x or skill z, I must be making progress at it".

Comment author: CronoDAS 10 September 2010 10:10:15PM 11 points [-]

Well, that's not entirely unrealistic. As Woody Allen said, half of life is just showing up. (Ask Eliezer what he thinks about school...)

Comment author: listic 10 September 2010 01:20:19PM *  1 point [-]

There should be a reason skill is almost never involved in success.

In my understanding, this reason is network latency. I think you need low latency to make an action game where achievement is dependent on skill.

In World of Warcraft, you can have slow players on slow network connections separated by large distance from the server make progress and have fun. In 3D shooters, you can't.

Comment author: NihilCredo 10 September 2010 04:40:56PM 7 points [-]

Blizzard's own Starcraft is competitive and very fast-paced, and yet it has continent-wide servers all the same.

A better reason for the perverse nature of MMOs is that the promise of guaranteed progress, especially combined with social obligations, is much more effective at keeping people paying their monthly fees than the hope of personal improvement.

Comment author: hegemonicon 10 September 2010 04:57:19PM 4 points [-]

You see it slowly being integrated with other, more skill-based genres in the form of Achievements: little badges you can display, and a progress bar/counter that marks how many you've gotten. Many of these are skill-based, but just as many are presence-based (e.g., complete 1000 multiplayer matches).

Their widespread adoption into nearly every sort of game leads me to believe they're VERY effective for keeping people around.

Comment author: DanArmak 10 September 2010 08:22:19PM 1 point [-]

There should be a reason skill is almost never involved in success.

In my understanding, this reason is network latency. I think you need low latency to make an action game where achievement is dependent on skill.

Skill isn't particularly related to success in most single-player games that allow leveling/improving equipment. The developers want to please their paying customers, so they will do their best to prevent a situation where someone isn't skilled enough to complete the game. Since there are usually only a few game endings, everyone gets to see the same result, and so their playing skills don't ultimately matter. Adjustable game difficulty serves the same end.

Sure, some games are hard enough that not everyone can beat them, but these are the exceptions and they can even become famous for that quality. (Anecdotally, I remember reading claims that Japanese games are much more likely to be unbeatably difficult than are Western ones.)

Comment author: DSimon 10 September 2010 08:36:21PM 1 point [-]

I remember reading claims that Japanese games are much more likely to be unbeatably difficult than are Western ones

Hm, was that judged over the number of games made or the number of game copies sold? Or to put it another way, did it show that Japanese developers like making hard games or that Japanese gamers like playing hard games?

Comment author: DanArmak 10 September 2010 09:00:57PM 0 points [-]

As I said, it's completely anecdotal - I don't remember the source, but it was someone commenting from his own (extensive) experience, not a controlled study. That said, I expect the comparison was between percentages of well-selling games.

Comment author: Relsqui 14 September 2010 10:25:19AM 0 points [-]

I think you need low latency to make an action game where achievement is dependent on skill.

It doesn't have to be an action game to be dependent on skill. Consider Puzzle Pirates. Almost everything you can accomplish is skill-based, and most of it's even single-player (but cooperative by way of many people puzzling towards the same goal). Avoids most issues with latency (as do, I imagine, the relatively simple graphics), and ties advancement to skill.

Comment author: STL 11 September 2010 12:25:43PM 7 points [-]

Hello, player character, and welcome to the Mazes of Menace!

I'm surprised that you didn't mention NetHack, and that nobody else has either, given that it contains the Mazes of Menace and provides counterexamples to many of your points.

Because your goal is not to reach the end quickly

In NetHack, the goal of beginners is to ascend, i.e. win, and it is very difficult. (I have not yet ascended; the furthest I've gotten is level 27, with 422434 points.) The goal of intermediate players is to ascend quickly. And the goal of advanced players is to ascend under the most ridiculously severe restrictions possible, called conducts. For example, not engaging in genocide is a conduct, and it makes the game harder (because now you can't genocide those nasty electric eels or master mind flayers).

Beginners may go slowly in order to methodically clean out the higher dungeon levels before heading into the lower ones (higher = nearer to your starting point at the surface = easier), but that's because they're trying to not die, and in NetHack character death is permanent.

For most games, there's a guide that explains exactly how to complete your objective perfectly, but to read it would be cheating.

In NetHack, reading spoilers is almost mandatory - in fact, unspoiled ascension is the rarest possible kind. NetHack's universe contains many systems that can be taken advantage of, but only if you know how - using pets to steal from shops being one example. (Shopkeepers hate direct theft and are ridiculously powerful, but pets can pick up and drop items, and shopkeepers don't mind that. By feeding your pets, you can induce them to drop items near you. This leads to the ability to convert tripe rations - which pets eat - into shop theft, giving you access to lots of items for free.)

This is because in games, the classes, skills, races and alignments are meant to be balanced, so they're all close to equally good.

NetHack's roles, races, and alignments have unbalanced advantages and disadvantages. For example, the Valkyrie is much, much easier to play than the Tourist. Only gender is balanced, as it has very few effects on the game.

Reality is the opposite; most of the difficulty comes up front, and it gets easier as you learn.

NetHack appears to follow reality here. Learning how to avoid death is front-loaded - see Yet Another Stupid Death, e.g. "genociding oneself", and Lessons Learned The Hard Way, e.g. "Don't stand on ice when there are foes with fire attack around."

Comment author: [deleted] 11 September 2010 06:42:05PM 4 points [-]

I started writing a reply about how different aspects of NetHack are repeated in other games, which lets those games avoid these hazards too...

...But then I realized that there was a common factor in everything I was writing, which is that as games become harder / more complicated to play, playing them becomes more and more similar to how we act in reality. NetHack is a very complicated game, and it is also very hard (for one thing, due to permanent death). So we use our full "reality skills" when playing it.

Comment author: Spurlock 10 September 2010 03:28:38AM 7 points [-]

I would like to commend you for taking the time to include the penultimate paragraph. I think it extremely worth pointing out that not everything that happens in games is likely to manifest in seemingly-analogous real world decisions.

The good news about most of these biases is that they are quite testable. I would love to see some research about the decision making processes of video game enthusiasts (particularly those who started at an early age) and a control group.

Comment author: Eliezer_Yudkowsky 10 September 2010 03:49:15AM 48 points [-]

When I get insignificant amounts of change, like a nickel, I leave it on the nearest outdoor object, thus teaching people that they can collect small amounts of money by searching random objects.

Comment author: Zvi 11 September 2010 11:36:57PM 8 points [-]

Whenever anyone comes up to me but doesn't say anything, I repeat the same phrase over and over again.

Comment author: CronoDAS 12 September 2010 04:19:35AM *  4 points [-]

How many NPCs does it take to change a lightbulb? (The answer)

Comment author: ata 12 September 2010 05:26:58AM *  1 point [-]

"Excuse me, sir..."
"Hi! I like shorts! They're comfy and easy to wear!"
"Uh, okay, anyway, would you happen to know where..."
"Hi! I like shorts! They're comfy and easy to wear!"
"..."
"Hi! I like shorts! They're comfy and easy to wear!"

Comment author: Baughn 10 September 2010 10:48:03AM *  6 points [-]

So, wait. This means the entire premise of this article is wrong?

Our intuitions aren't being driven off track, we're simply turning reality into a videogame.

Comment author: DSimon 10 September 2010 07:38:12PM 3 points [-]

Life imitates Zelda.

Comment author: MichaelVassar 10 September 2010 04:26:50PM 3 points [-]

All this before we even mention Burning Man?

Comment author: Drahflow 10 September 2010 05:23:50PM 5 points [-]

Video game authors probably put a lot of effort into optimizing video games for human pleasure.

Workplace design, User Interfaces etc., they could all be improved if more ideas were copied from video games.

Comment author: mattnewport 10 September 2010 05:36:17PM *  13 points [-]

Games often fall into the trap of optimizing for addictiveness which is not quite the same thing as pleasure. Jonathan Blow has talked about this and I think there is a lot of merit in his arguments:

He clarified, "I’m not saying [rewards are] bad, I’m saying you can divide them into two categories – some are like foods that are naturally beneficial and can increase your life, but some are like drugs."

Continued Blow, "As game designers, we don’t know how to make food, so we resort to drugs all the time. It shows in the discontent at the state of games – Radosh wanted food, but Halo 3 was just giving him cheap drugs."

...

Blow believes that according to WoW, the game's rules are its meaning of life. "The meaning of life in WoW is you’re some schmo that doesn’t have anything better to do than sit around pressing a button and killing imaginary monsters," he explained. "It doesn’t matter if you’re smart or how adept you are, it’s just how much time you sink in. You don’t need to do anything exceptional, you just need to run the treadmill like everyone else."

I work in the games industry and I see this pattern at work a lot from many designers.

Comment author: luminosity 11 September 2010 09:32:33AM 3 points [-]

Interestingly, some of the best mathematical analysis I've ever seen happens in WoW, and to a limited extent in other MMOs. When you want to be in the top 25, 100 or even 1000 out of 13 million, you need to squeeze out every advantage you can. Often the people testing game mechanics have a better understanding of them than the game designers. Similarly, the first people to defeat new bosses do so because they have a group of people they can depend upon, but also because they have several people capable of analysing boss abilities and iterating through different strategies until they find one that works.
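The arithmetic such players do is expected-value "theorycraft": for instance, deciding which stat upgrade raises average damage more. A minimal sketch with entirely made-up numbers, not actual WoW mechanics:

```python
# Toy "theorycraft": expected damage per attack under invented combat rules.
# None of these numbers come from any real game.
def expected_damage(base, hit_chance, crit_chance, crit_multiplier=2.0):
    """E[damage] = P(hit) * base * (1 + P(crit) * (multiplier - 1))."""
    return hit_chance * base * (1 + crit_chance * (crit_multiplier - 1))

# Which upgrade is better: +3% hit chance or +5% crit chance?
current   = expected_damage(1000, hit_chance=0.92, crit_chance=0.20)
with_hit  = expected_damage(1000, hit_chance=0.95, crit_chance=0.20)
with_crit = expected_damage(1000, hit_chance=0.92, crit_chance=0.25)
print(current, with_hit, with_crit)
```

Under these invented rules the crit upgrade wins; real theorycraft runs the same comparison over far more variables and interacting abilities, which is where the huge spreadsheets come from.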

It's unfortunate that there's so much sharing in the community; players who aren't striving to be the first to finish a fight can just obtain strategies from other people. People who don't care to analyse gameplay changes, or new items, can rely upon those who do to tell them what to wear, what abilities to choose, and what order to use them in. Back when I played, one of my biggest frustrations was that nearly everybody in the game outside of the top few thousand simply lacked the ability to react and strategise on the fly. Throw an unexpected situation at them and maybe 1 in 10 will cope with it.

Comment author: msironen 11 September 2010 11:46:51AM 6 points [-]

Seconded. It seems to be a rather unfortunate video game meme in itself that MMOs (WoW particularly, since it somewhat defines the genre currently) massively reward time spent over skill. No amount of grinding low-level content will make you capable of taking down, say, the Lich King in heroic mode (both skill- and gear-wise), and to claim otherwise just shows that the extent of the claimant's knowledge is limited to a single South Park episode. The most "celebrated" players are exactly the people who master the most difficult content first, without the benefit of shared tactics (and usually with the highest gear handicap), not the first guy who kills a billion sewer rats (even if there were such an Achievement).

It has been said by the WoW developers responsible for generating new high-difficulty content that most of the challenge comes from the fact that the best players are so much better than the average player (even more so than the player community itself is aware of) that making content which is not trivial to the top guilds but is also beatable for the average joe has become somewhat impossible without certain gimmicks. Certainly, you can become, say, the richest player on your server just by investing massive amounts of time (though manipulating the Auction House nowadays seems a better strategy than grinding), but that just means you'll be known as the guy who spent the most time gaming the AH (we actually have just such a player on our realm). If anyone thinks that's the game rewarding you for time spent instead of skill, I seriously suggest they spend a little more time researching the subject before pontificating on it.

Finally, I apologize for the slightly combative tone of my first post, but I hope it's an excusable reaction, especially on this site, to a piece of near-"accepted wisdom" that doesn't really even survive the slightest scrutiny.

Comment author: yrff_jebat 17 September 2010 07:41:42PM *  0 points [-]

(Not meant as a rhetorical question): Does "mathematical analysis" really mean that someone with an IQ of 170 has (on average) a real advantage over someone with an IQ of 160 (if you don't count effects on information-processing ability and reaction time) in solving really hard mathematical problems, or is it rather a combination of clicking fast, knowing how the monsters will react, and calcing through what will happen if you do X?

Comment author: patrissimo 12 September 2010 04:46:59AM 8 points [-]

And that great mathematical analysis is being directed at solving meaningless made-up problems that generate no value for the world. It's pure consumption, zero production. Yet it's complicated, goal-oriented consumption, it feels like doing work, and hence scratches the productive itch for many people...without actually doing any good in the world.

It's a powerful opiate (a drug which makes the time pass pleasantly and which users wish to take all the time, as opposed to psychedelics, which are used occasionally and make the rest of your life better). Which, I believe, puts it on the side of evil, not of good.

Comment author: yrff_jebat 17 September 2010 07:22:19PM *  1 point [-]

At least the first part could be said word-for-word of modern-day astrophysics, except that astrophysics is socially accepted and the guys and gals doing it are (in most cases) being paid for it (and even people who see fundamental knowledge of the universe as a goal in itself will agree that there are far more important things to divert the workforce to).

Comment author: yrff_jebat 27 October 2010 06:57:04PM 1 point [-]

I also find it funny when mathematicians pejoratively speak of "recreational mathematics" (problem solving) as opposed to theory building: "If I build a lego hat, that's just for fun, but if I build a lego Empire State Building, that's serious business!"

Comment author: luminosity 12 September 2010 02:17:52PM *  1 point [-]

I don't disagree much with your post (my only complaint is that fun is a reasonable goal in and of itself, and if someone chooses that, then so be it). However my objection is to Blow's (amongst many others') characterisation of the game and the players. Contrary to his thesis, being smart and adept are actually massively rewarded in WoW by comparison to other games; nearly everybody who plays the game is aware of the best players. There is a lot of status up for grabs just by being the best on a server, let alone best in the world.

Accepting his analysis at face value would lead you to conclude that there are no lessons you can take from WoW or other MMOs. In fact, to me WoW demonstrates ways in which people can be motivated to work on hard, mathematical problems. It would be a shame if people were to dismiss it offhand, when it has the potential to demonstrate how to structure hard work to make it more palatable and attractive to tackle.

Comment author: katydee 12 September 2010 03:21:35PM 1 point [-]

I know several people who used to be the best players on a particular WoW server-- they said it was generally boring and not really as prestigious as one might expect, since the sheer number of servers out there means that being the best on one doesn't even necessarily mean you're all that good at the game as a whole.

Comment author: luminosity 13 September 2010 01:19:21AM 1 point [-]

I suppose it would depend on the makeup of your particular server. Though we were nowhere near world best, my guild had decent competition on our server and there was always need to strive to be the first to win an encounter. Both groups were reasonably well known on the server, and I would reasonably often have people messaging me out of the blue.

To try to generalise the post a bit better, I think the lesson from this is that to encourage rational analysis and quick thinking in important areas, it's important to have good competition, an easily verified criterion for 'winning', preferably milestones towards the ultimate goal, and a reward for winning, whether status or monetary. Off the top of my head, the people behind the X-Prizes seem to have used this model well to encourage innovation in select areas.

Comment author: Baughn 10 September 2010 07:34:25PM 0 points [-]

Where do visual novels such as Ever17 fit on this scale? Do you count them as games at all?

Comment author: mattnewport 10 September 2010 08:32:18PM *  1 point [-]

I'd never heard of Ever17. Based on the description on Wikipedia I'd say it's borderline whether it qualifies as a game. I'm not sure it meets the minimum level of interactivity required. Non-game entertainment can fall into the same trap of addictiveness vs. pleasure, however - some TV, for example.

Comment author: ata 10 September 2010 03:15:10PM *  18 points [-]

I paint question marks on boxes and leave hallucinogenic mushrooms in them.

Comment author: Spurlock 10 September 2010 03:48:00PM 29 points [-]

I just leave handgun ammunition everywhere.

Comment author: SilasBarta 10 September 2010 03:31:23PM 21 points [-]

For my part, I force rabid dogs to swallow gold coins. Defeating an aggressive dog ought to earn you both experience and precious metals.

Comment author: NihilCredo 10 September 2010 03:29:46PM 22 points [-]

I randomly assault people who wander outside of the city centre, but only if they look strong enough to kill me easily.

Comment author: thomblake 10 September 2010 02:57:33PM 11 points [-]

I do that too. Also, I had a sign by my front door that read "no plot here" so that wandering adventurers wouldn't be tempted to investigate too deeply and make a mess of my house.

I once decided to search my sofa and found a pair of nunchaku there.

Comment author: prase 10 September 2010 12:22:20PM 4 points [-]

Thank FSM that you don't leave weapons there.

Comment author: spencerth 10 September 2010 07:58:06PM 6 points [-]

Good post. One other thing that should be said has to do with the /why/. Why do we design many games like this? There are some obvious reasons: it's easier, it's fun, it plays on our natural reward mechanisms, etc. A perhaps less obvious one: it reflects the world as many /wish it could be/. Straightforward; full of definite, predefined goals; having well known, well understood challenges; having predictable rewards that are trivial to compare to others; having a very linear path for "progression" (via leveling up, attribute increases, etc.) A world with a WHOLE lot fewer variables.

Comment author: luminosity 11 September 2010 09:48:09AM 4 points [-]

If you're not aware of Jane McGonigal you might be interested in her works. Her basic position is that games are better than reality, mostly because they have a far superior feedback system. She tries to apply game design to the real world to stimulate people's problem solving.

Comment author: PhilGoetz 10 September 2010 07:03:54PM 6 points [-]

Back around 1990, there was a school of game design that said that a game should be immersive, and to be immersive, it should stop reminding you that it's a game by making you throw away all real-life conventions. So this school of game design said things like:

  • You should not have to examine everything in the game. You should do just fine in the game by examining only objects that a reasonable person would examine.

  • You should not have to die in order to learn something needed for the game.

  • You should usually be punished for theft, breaking and entering, and other crimes.

  • You should not need to pick up lots of junk and carry it around because it's going to be needed in the endgame for reasons no one could have foreseen.

Unfortunately, this school of game design died. It was only ever really popular with non-commercial text-adventure designers.

Comment author: NancyLebovitz 11 September 2010 12:28:04PM 4 points [-]

Exactly-- and I don't think it's just structural. A lifestyle of killing sentients and taking their stuff might or might not be a pleasure in the real world, but it seems to satisfy the imagination.

Women's Work: The First 20,000 Years: Women, Cloth, and Society in Early Times is about what can be deduced about the early tech whose products don't survive for millennia.

Even then, people were looking to fill time as well as to use it.

You'd think people would evolve towards maximum-reproduction utilitarianism, but I'm not seeing it happen.

Comment author: luminosity 11 September 2010 09:24:46AM *  4 points [-]

Deus Ex is the last good example I can think of, of a game immersive in this sense. Depending on how the prequel goes, it might not be dead just yet.

Edit: As pointed out downthread, there are of course Bethesda's RPGs too.

Comment author: DanielLC 10 September 2010 04:14:48AM 6 points [-]

In the real world, getting rid of junk costs money in effort and disposal fees instead.

In the real world, you can sell your old stuff. People just don't. Perhaps games can teach them that it is a good idea, even if it's for a fraction of the price you bought it for.

Comment author: gwern 10 September 2010 04:41:08PM *  3 points [-]

Well, sometimes you can sell them. I'm having trouble unloading my GeForce 8600 on Craigslist for $20, which I thought was a pretty low price. And nobody has been interested in my 24-inch TV, even at a nominal $15.

EDIT: I managed to sell the graphics card, but got not a single expression of interest in the TV even after dropping it down to $4, at which point I gave up.

Comment author: Zvi 12 September 2010 12:09:09AM 2 points [-]

I think games teach a valuable lesson the moment you realize that everything you buy has lost three quarters of its value when you try to turn around and sell it.

They also teach a valuable lesson when you realize you have a limited amount of inventory space and that you're going to have to get rid of most of your junk.

Video games do teach us to sell our junk when we can rather than throw it away. However, I strongly feel that in general far too much time is spent trying to sell or even give away things we no longer have a use for rather than throwing them away, and often the underlying reason is that throwing them away feels wasteful and therefore wrong. My parents taught me this explicitly, and it was a hard lesson to unlearn.

Comment author: CronoDAS 12 September 2010 05:14:32AM 2 points [-]

Then there are also those video games that reward you for holding onto "useless" junk because you'll end up needing it later for some optional reward, even if the game lets you sell it for much-needed cash before then. Lost Forever can be one of the more annoying game tropes.

Comment author: mattnewport 10 September 2010 05:45:23PM *  1 point [-]

In the real world, you can sell your old stuff. People just don't.

You can sometimes sell your old stuff, but for many people it's not worth it for most items - the return doesn't justify the time investment compared to just throwing it out.

Even giving stuff away for free is generally too much effort to be worth it over throwing stuff out though you might think people who had a use for free stuff would have an interest in making it easier to give it to them than to drop it in the garbage.

Comment author: scotherns 16 September 2010 12:05:39PM 1 point [-]

Freecycle exists specifically to assist in giving things away.

Comment author: Konkvistador 10 September 2010 08:00:39PM 0 points [-]

http://www.listia.com/

Sites like the above do make this easier.

Comment author: mattnewport 10 September 2010 08:39:08PM *  0 points [-]

Listia seems like a really terrible idea to me - from what I can tell it's like a much smaller eBay where money is replaced with 'credits', which the company hopes to make money from by selling to people. It's possible they might make a profitable business out of it, but I see no benefit to the users other than the misguided idea that they're getting something for free.

Comment author: michaelcurzi 03 March 2011 09:56:13AM 5 points [-]

On the other hand, good videogames can be a cool tool for low-risk self-improvement.

I've historically had a lot of trouble focusing on one thing at a time - choosing a major, minimizing my areas of focus. I recently played KOTOR, and realized that I play videogames the exact same way. I can never commit to one class/alignment/weapon specialization at a time, and I suffer for it.

Recognizing the similarities, I decided to play KOTOR as a specialist in one alignment, one class, and one weapon type, ignoring tantalizing opportunities to generalize whenever possible. I ended up enjoying the game a lot more than I usually do.

Three weeks later I chose my major, and I honestly believe KOTOR helped.

Comment author: MBlume 11 September 2010 08:12:06AM 5 points [-]

I'm rather amused to be reading this for the first time while wearing my 'Things You Learn From Video Games' shirt...

Comment author: dclayh 10 September 2010 07:23:52AM 5 points [-]

Similarly, if you're on a quest to save the world, you do side-quests to put it off as long as possible

I've explicitly made note of this fact, that one should do quests in exactly reverse order of importance, in every cRPG I've ever played. Often, making progress on major quests will change the game (lock you out of an area, say, or kill an NPC) such that you can no longer complete some minor quests if you haven't done them already.

Comment author: NihilCredo 10 September 2010 04:51:02PM *  8 points [-]

Modern designers have finally started to take account of this. In Mass Effect 2, you do almost all of your side-questing while you wait for your employer to gather information about the main problem. Once the party does get started, the game makes it emphatically clear that waiting any more than absolutely necessary is going to severely compromise your primary mission.

Comment author: dclayh 10 September 2010 06:35:49PM 1 point [-]

But does it actually punish you for waiting, or just threaten to? (I haven't gotten around to playing Mass Effect 2 yet.)

Comment author: NihilCredo 10 September 2010 07:34:11PM *  7 points [-]

jim answered quite thoroughly. I'll add that I was hinting mainly at the fact that the BioWare developers knew that most players would, by habit, take their sweet time no matter how many universes were at stake, and planned accordingly.

If your most trusted ally tells you "We must hurry, or we will fail!", a veteran gamer knows to ignore him and go rescue a kitten. If a pop-up window tells you to hurry up or you will fail, you do hurry up. Some messages can only be given on this side of the fourth wall.

Comment author: jimrandomh 10 September 2010 06:46:40PM 2 points [-]

Yes; if you're too slow, it kills off some minor characters who would otherwise survive. The ending to that game is quite well done. It also has you assign NPCs to tasks, and kills a character for each assignment you get wrong, including some non-obvious and unstated requirements, like you can't put someone in charge of a squad if their backstory doesn't mention leadership experience.

However, the early game still has the usual timing incentive problem. Side-quests fall into major and minor categories, and the clock doesn't start ticking until you've done all the major ones.

Comment author: thomblake 10 September 2010 03:01:18PM 5 points [-]

I have a friend, Rit, who refuses to play cRPGs this way. Towards the end of Final Fantasy 8 (don't expect spoilers ahead), you are supposed to do all your sidequests before rescuing a friend in trouble; by FF tradition, this should be obvious to the player, since you just got free rein of the world map. Rit said, "Screw that, she's in trouble, I'm going straight there!"

Comment author: Konkvistador 10 September 2010 07:55:16PM 1 point [-]

The original Fallout is an exception, since it had a time limit. The world changed as time went on regardless of whether you did anything, and if you were slow enough (500 in-game days, I think) you could lose the game.

Comment author: William 11 September 2010 12:55:06AM 2 points [-]

Star Control II did something very similar--as time went on, the world changed, and eventually one of the villains would start their omnicidal rampage.

Comment author: Zvi 12 September 2010 12:15:52AM 3 points [-]

And in both of these games I had to restart, because you can use a huge amount of time traveling the world map to go places, and spending game time rather than playing time makes perfect sense, especially for the Luck 10 character I was playing, until you realize you've lost. Star Control 2 gives you fair warning, though I didn't realize it at the time, but Fallout doesn't, and I was pretty mad about it.

Having a time limit without being deeply explicit about it is a crime against gaming.

Comment author: CronoDAS 12 September 2010 05:16:48AM 1 point [-]

Having a time limit without being deeply explicit about it is a crime against gaming.

Seconded.

Comment author: Konkvistador 12 September 2010 06:52:23PM 1 point [-]

However getting a nasty surprise like that might just help shed light on a Video game meme you didn't even know you internalized.

Also Fallout was explicit about the time limit. The pipboy clock, as well as the manual.

Comment author: James_K 10 September 2010 07:24:31PM 1 point [-]

It's interesting that the designers shook up the formula for FF 13. You basically don't do any sidequests until you finish the game: after defeating the final bosses, it puts you at the last save point and lets you go back and do all those sidequests you walked past earlier in the game. The incentive to play this way comes from the fact that you can't finish levelling up until you finish the game.

Comment author: jefallbright 13 September 2010 03:58:33PM 13 points [-]

The most insidious of these misguiding heuristics have, apparently due to their transparency (like water to a fish), gone unmentioned so far in this thread.

Typical game play shares much in common with typical schooling. Children are inculcated with impressions of a world of levels that can (and should) be ascended through mastery of skills corresponding to challenges presented to them at each level, with right action leading to convergence on right answers, within an effectively fixed and ultimately knowable context.

Contrast this with the "real world", where challenges are not presented but encountered, where it's generally better to do the right thing than to do things right, within a diverging context of increasing uncertainty.

Comment author: James_Miller 10 September 2010 01:16:14PM 4 points [-]

I bet video games make you a better driver by forcing you to develop situational awareness.

Comment author: Matt_Stein 10 September 2010 03:30:15PM 7 points [-]

In my experience, it was much easier to learn to drive thanks to my experience with videogames. After years of picking up new control systems, learning to drive an actual car posed little challenge. Same thing when I made the transition from automatic to manual transmission. It'd be interesting to see some research into how easily people pick up and learn new interfaces. I think it's also part of what separates "computer people" from "non-computer people".

(Sorry, bit of a tangent there)

Comment author: NihilCredo 10 September 2010 05:13:16PM 6 points [-]

I think it continuously ingrains a certain type of "testing the waters" process:

1) Find an operation you can perform
2) Is it likely to cause permanent damage? If yes, go to 1.
3) Perform that operation a few times
4) See how it combines with the other operations you have already mastered
5) Repeat

I don't think it's something inherent to video games as a medium, it's just the most common activity that requires you to learn a new interface every few weeks if not more often. Professional tools of any kind will strive to retain a familiar feeling, and everyday tools like household appliances, cars, or cellphones don't get replaced nearly as fast.

(Sorry, bit of a tangent there)

It's OK, this is rapidly turning into LW's General Videogame Thread anyway.

Comment author: PeerInfinity 10 September 2010 05:43:36PM 7 points [-]

This thread wouldn't be complete without a link to this Ctrl+Alt+Del comic

Comment author: CronoDAS 10 September 2010 10:05:47PM *  4 points [-]
Comment author: scotherns 16 September 2010 11:24:29AM 3 points [-]

'Chore Wars' (http://www.chorewars.com/) is designed to motivate you to get chores done by providing XP / Gold / Treasure for completing chores, and by tracking it to induce competition amongst your housemates.

It works for me as a more interesting to-do list, and has caused my kids to argue about who gets to clean the toilet and level up.

Comment author: NancyLebovitz 11 September 2010 12:16:35PM *  3 points [-]

Cook's Illustrated might be a real-world example-- they take recipes through a bunch of conscious variations to perfect them.

Comment author: thomblake 10 September 2010 06:47:53PM 3 points [-]

I'd never thought of grinding real-life skills - brilliant!

Comment author: CronoDAS 10 September 2010 10:06:53PM 7 points [-]

Learning to play musical instruments is basically grinding.

Comment author: NancyLebovitz 11 September 2010 12:19:51PM *  7 points [-]

Getting good isn't-- see Talent Is Overrated for details about the 10,000 hours to mastery theory.

People tend to prefer grinding over developing relevant sub-skills by experimentation, but the latter is what works.

Comment author: CronoDAS 11 September 2010 05:45:45PM 3 points [-]

I think we need to decompose what we mean by "grinding". When I practice a segment of a song on the piano over and over until my hands move the proper way to hit the notes, that's grinding, right?

Comment author: jimrandomh 11 September 2010 05:49:30PM 6 points [-]

I think grinding would be if you kept on practicing the song even after you could consistently play it correctly. Otherwise, the positive connotations of "practice" versus the negative connotations of "grinding" wouldn't make sense.

Comment author: NancyLebovitz 11 September 2010 06:01:11PM *  2 points [-]

Maybe "grinding" isn't the right word, but playing something over and over until it smooths out is the way most people practice.

Thinking about what might be causing problems or what might lead to improvement, and then working on the piece to make specific changes is what you need to do to get really excellent.

Comment author: PeerInfinity 17 September 2010 09:12:31PM 5 points [-]

I guess I might as well post about my own experiences, even though I'm probably not a typical game player:

I noticed myself developing the habit of seeking the dead ends first in video games, but I thought that it was just a bad habit that I developed, and that most other people don't play like that. My brother doesn't play like that. But I continue using this strategy even in games where there isn't a reward at the dead ends. I deliberately choose the path that's more likely to be a dead end first, just for my own peace of mind, to know that it's a dead end, and that I don't have to mentally keep track of another decision branch that I might have to come back and try later. And I also apply this strategy in real life, in any situation that involves a branching set of decisions, where if I take the wrong branch, I'll need to go back to a previous branch, and there is some cost involved in backtracking, even if the cost is just a trivial amount of time or mental bookkeeping. And I've found that this is actually a good and helpful strategy. Or maybe it just feels helpful, despite often being counterproductive. If, for example, you're trying to track down a bug in your source code, and... actually, rather than trying to explain in words, I'll just link to this XKCD comic, which illustrates what happens when you follow the opposite of this strategy.

I'm not afraid of spoilers. And I have the bad (or maybe not so bad?) habit of always playing to win, as opposed to playing for fun. If there is a guide for the game, I'll read it. Unless reading the guide would actually be more work than figuring things out for myself, or if reading the guide would prevent me from learning or practicing a skill that's actually important. Guides can save you from having to do lots of pointless searching, or experimenting, and can prevent you from making wrong choices early in a game that have big harmful consequences later in a game. And they can contain other important information that you wouldn't have had much chance of figuring out on your own. Also, I often find reading the guide to be more fun than actually playing the game.

I apply the same paranoia to games that I apply to real life. I err on the side of spending too much effort researching which choice to make, rather than risking making the wrong choice by deciding arbitrarily on a whim. And this is often necessary even in video games, due to imbalanced classes, or classes that just don't fit my playing style.

Another example of this paranoia: I'm constantly expecting that at any moment, the game might contain a challenge that's impossible, or that is only possible if I haven't made any other mistakes or wasted any rare items until I reached that challenge. And this is sometimes necessary even in video games.

I also apply the same frugality in video games that I apply in real life, even if I know that the game's currency inflates dramatically as the game continues, making this strategy counterproductive. I just can't bring myself to act in any other way. I avoid buying any items I probably don't need. I avoid using up nonrenewable items, often to the point where it's useless to keep them in my inventory because I never use them. And yet I still try to collect as many of these nonrenewable items as possible, even though I know I'm probably never going to use them. The problem with this strategy was illustrated in this 8-bit-theater comic. Specifically, the quote "I merely realized that my reluctance to sacrifice spell slots or use items for the express purpose of maintaining my peak level of versatility was a vicious cycle of stagnation."

Actually, this strategy sometimes causes problems in real life. Usually with food going bad because I was reluctant to use it up... just because of some irrational inhibition about using up any resource that can't be easily replaced.

I really hate playing "deathmatch mode" in FPS games, because the usual rules about trying to survive at all costs no longer apply, because you're competing for the highest number of kills, not trying to be the last person standing. After years of trying, I still can't get used to playing like that. And so I've mostly stopped trying.

I suspect that video games have made me more risk averse than I would have been otherwise, by constantly providing me with examples of ways that things can go terribly wrong. I'm still undecided about whether this has an overall positive effect. For example, when I'm driving, it makes me constantly be on the lookout for objects that could potentially move into my path or otherwise collide with my car. But sometimes it makes me panic, with dangerous results.

I guess I had better stop writing now, this comment has already grown too long.

Comment author: mattnewport 17 September 2010 09:47:53PM *  2 points [-]

I also apply the same frugality in video games that I apply in real life, even if I know that the game's currency inflates dramatically as the game continues, making this strategy counterproductive. I just can't bring myself to act in any other way. I avoid buying any items I probably don't need. I avoid using up nonrenewable items, often to the point where it's useless to keep them in my inventory because I never use them. And yet I still try to collect as many of these nonrenewable items as possible, even though I know I'm probably never going to use them.

I used to have some of the same tendencies when playing games, but in an effort to improve my play (particularly in competitive multiplayer games) I learned that it's often a bad strategy. I feel learning this actually helped me overcome an unproductive real-life tendency towards hoarding or excessive caution. I have a much reduced tendency to do this now.

A related habit which I unlearned to some extent from games (particularly competitive RTS games) was the tendency to try to build up impenetrable defenses before engaging in any combat (excessive turtling, in RTS speak). This is another example of a tendency which can be ineffective or counterproductive in real life, and which I've found lessons from game strategy helpful in overcoming. It is, I think, a similar problem to how you describe your tendency to "err on the side of spending too much effort researching which choice to make, rather than risking making the wrong choice by deciding arbitrarily on a whim".

Note that in certain circumstances both of these tendencies can be good winning strategies. If you have a personality type that inclines you to overuse this type of strategy even when it is not a good approach it can be detrimental to your success. I personally found games helpful in appreciating this.

Comment author: PeerInfinity 17 September 2010 10:08:46PM 0 points [-]

hmm, I just realized that this confession that I deliberately use a strategy that inefficiently uses in-game currency... kinda conflicts with my previous claim that I always play to win.

a random thought: am I playing to win, or am I playing to "not lose"?

also, sometimes it turns out that I actually did need to save up the in-game currency for an important item in the next town, and so I shouldn't always just spend all the currency as soon as I get it, with the excuse that inflation makes frugality counterproductive.

I also have a tendency to turtle. If there's ever a choice between offense and defense, I choose defense. Or maybe higher speed, for better dodging. Or better yet, the ability to heal. I usually pick the class with the best healing ability. My overly defensive strategies kinda make me no fun to play against, but they generally result in me losing less often.

And yes, I have found games to be useful for showing me when my strategy is suboptimal, and I've been making some attempt to change the bad habits. Though I don't seem to have made much progress at this. I have at least allowed myself to go on a big spending spree when I'm at the last town, and the currency has stopped inflating. And I've allowed myself to use all those rare items in the battle with the final boss, since there's nothing left to save the items for... except maybe later in the battle with the final boss...

So I know that my strategy is suboptimal, and I'm trying to change it, but I'm failing to actually make any significant changes, due to... psychological inertia?

But I still make sure to buy stuff that is actually necessary, or that is obviously a good deal, and I actually do use items that are obviously a good idea to use. And my strategy does work well enough for me actually win often enough, so maybe I'm being too critical...

Comment author: NancyLebovitz 21 September 2010 01:10:27AM 0 points [-]

To what extent are you playing to fill time?

Comment author: PeerInfinity 21 September 2010 01:49:20PM 0 points [-]

When I start playing, it's because I don't feel like I have the energy to do something more useful, or if I just feel like I need a break. And so my original purpose for starting the game is just to fill time, and maybe even have fun, or recharge energy. But once I start playing, I almost always end up taking the game way too seriously, and I end up burning energy I didn't think I had, and ending up more tired when I'm finished than when I started. Once I start a game, I have a really hard time stopping. That's bad. And yet playing games is still my default activity, when I don't feel like doing anything else.

Comment author: mattnewport 17 September 2010 10:36:46PM 0 points [-]

hmm, I just realized that this confession that I deliberately use a strategy that inefficiently uses in-game currency... kinda conflicts with my previous claim that I always play to win.

This can be a manifestation of a lost purpose. Money / one-use items are useful to accumulate for the purposes of beating the game (or your opponent) but focusing on maximizing them is to lose sight of your goal (winning the game).

And my strategy does work well enough for me actually win often enough, so maybe I'm being too critical...

It's not clear to me whether you are primarily talking about single player games or not but I have generally found competitive multiplayer much more effective than single player for encouraging winning strategies and punishing losing strategies. Good human opponents often also devise creative strategies which can be educational in themselves.

Comment author: thomblake 17 September 2010 09:43:01PM 0 points [-]

Another example of this paranoia: I'm constantly expecting that at any moment, the game might contain a challenge that's impossible, or that is only possible if I haven't made any other mistakes or wasted any rare items until I reached that challenge. And this is sometimes necessary even in video games.

The reason I hate Final Fantasy Tactics.

The problem with this strategy was illustrated in this 8-bit-theater comic.

I've had the same problem. I basically came to an epiphany similar to Red Mage's. It applied to both my behavior in life and in RPGs.

Comment author: Vladimir_M 11 September 2010 05:56:21PM *  5 points [-]

I'd say the worst habit of thought promoted by computer games is that if you do something disastrously foolish or clumsy, you can conveniently restart from a recently saved position. Clearly, that doesn't help one develop a good attitude towards the possibility of blunders in real life. (Though I should add that I haven't played any computer games in almost a decade, and I don't know if the basic concepts have changed since then.)

Comment author: madair 11 September 2010 08:07:17PM 4 points [-]

I don't find that's so for myself. War games give me a sense of my own mortality and ease of finding death no matter how many health packs and energy shields are provided. I wonder whether others experience the same?

Comment author: Zvi 12 September 2010 12:17:36AM 5 points [-]

I find it makes me long for this ability rather than fool me into thinking I have it. In fact, reminding me that I could die or make the game unwinnable at any moment tends to have the opposite effect of making me more risk averse than I should be.

Comment author: jimrandomh 10 September 2010 02:25:08AM *  6 points [-]

Thanks to Ralith, Lark, neptunepink and nshepperd for their feedback on the first draft of this article in the #lesswrong IRC channel. The IRC channel is a good way to get early feedback on posts, and discussing it there revealed several important flaws in the writing that have been fixed.

Comment author: Clippy 10 September 2010 03:55:21PM 1 point [-]

Which internet has the #lesswrong IRC channel?

Comment author: jimrandomh 10 September 2010 03:58:56PM 1 point [-]

Less Wrong IRC channel details are on the wiki.

Comment author: NihilCredo 10 September 2010 04:41:59PM 0 points [-]

This one.

Comment author: Larks 10 September 2010 05:19:04AM *  1 point [-]

Yes - there normally seem to be a good number of people there too.

ETA: here is the channel in question.

Comment author: Alexei 10 September 2010 09:44:39PM 2 points [-]

Hmm, I've never really confused my life with in-game life before, but I wonder if I maybe do it on the subconscious level. An interesting note: when I tried playing Morrowind (for those who don't know, that game has a huge open world with many huge areas that are there for no reason other than to add realism), I had a sort of paralysis, because I had to explore every room and open every door, but that's simply impossible in that game.

Comment author: Sniffnoy 10 September 2010 02:34:08AM 2 points [-]

It should be noted that some of these seem specific to games with a levelling/upgrade system, and in particular ones that you don't know in advance / are not really intended for replay.

Comment author: ShardPhoenix 10 September 2010 04:32:56AM *  2 points [-]

If you think people believe that RPG classes are balanced, you obviously haven't spent much time reading game forums! "Imba"ness, real or perceived, is probably the #1 topic of discussion for most multiplayer games.

Comment author: jmmcd 11 September 2010 07:51:11AM 1 point [-]

Applying this reasoning to the real world would mean choosing a career without bothering to find out what sort of salary and lifestyle it supports; but things in the real world are almost never balanced in this sense. (Many people, in fact, do not do this research, which is why colleges turn out so many English majors.)

It would have been worth noting that there are valid criteria for choice of university courses other than these. As it is, this section looks rather philistine.