Have I ever remarked on how completely ridiculous it is to ask high school students to decide what they want to do with the rest of their lives and give them nearly no support in doing so?
Support like, say, spending a day apiece watching twenty different jobs and then another week at their top three choices, with salary charts and projections and probabilities of graduating that subject given their test scores? The more so considering this is a central allocation question for the entire economy?
At my high school the gifted program required a certain number of hours of internship at a company in the area, and indeed even those outside the gifted program were encouraged to meet with their counselors for advice on finding internships. 'Course, that program, along with AP classes, the arts, and half of science, was cut starting this school year. I think it's 'cuz Arizona realized that since they were already by far the worst state in the nation when it came to education they might as well heed the law of comparative advantage and allocate more resources to the harassment of cryonics institutions.
One possible solution is to have education financed by equity rather than loans: the third party who pays for your education does so in return for some share of your future income. Besides the obvious effect of funding profitable education, this has the totally awesome side-effect of giving an organization a strong incentive to figure out exactly how much each person's income will be increased by each job - which includes predicting salary, probability of graduating, future macro trends, etc.
The third party wouldn't have much incentive to predict what jobs will be most fun (only whether you will hate it so much you quit), but at least a big chunk of the problem would be solved. Personally I think the solution would involve "higher education is rarely worth it", and direct people towards vocational training or just getting a damn job. But I could be wrong - the great thing about a mechanism is that I don't have to be right about the results to know that it would make things more efficient :).
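To make the funder's incentive concrete, here is a minimal sketch of the calculation such a third party would be forced to run for every applicant. Every number and parameter name below is invented purely for illustration; the point is only that an equity funder has to price graduation odds and salary effects explicitly.

```python
# Toy sketch; all figures are hypothetical, not taken from the comment above.

def expected_repayment(p_graduate, salary_if_grad, salary_if_not,
                       income_share=0.08, years=30, discount=0.05):
    """Expected present value of the funder's income share."""
    def pv(salary):
        # Discounted sum of the salary stream over the working career.
        return sum(salary / (1 + discount) ** t for t in range(1, years + 1))
    expected_income = (p_graduate * pv(salary_if_grad)
                       + (1 - p_graduate) * pv(salary_if_not))
    return income_share * expected_income

# A hypothetical applicant: 60% chance of graduating, $55k salary if they do,
# $35k if they don't, against $40k of tuition the funder fronts.
tuition = 40_000
offer_value = expected_repayment(0.6, 55_000, 35_000)
print(f"expected repayment: {offer_value:,.0f}, tuition: {tuition:,}")
print("fund it" if offer_value > tuition else "decline or raise the income share")
```

Whoever holds the equity has to get these predictions roughly right to stay in business, which is exactly the forecasting work that nobody in the current system is paid to do.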
Indeed, some of us spend 9 more years in school to postpone this decision. (In case you were wondering, it doesn't help.)
Have I ever remarked on how completely ridiculous it is to ask high school students to decide what they want to do with the rest of their lives and give them nearly no support in doing so?
Do we actually do that all that much? When I was in high school, the vast majority of students had no idea what they wanted to do, and that was considered ok. Heck, a large fraction of people well into their undergraduate educations still didn't know what they wanted to do, and that was also considered ok. And as far as I can tell, the general trend in high school education has been less emphasis on job-specific classes as time has gone on.
What we really need is a "brain plasticity in adulthood" pill. Because really the only reason we force these impossible choices on teens is that we're racing against their diminishing ability to learn.
This argument may hold for things like languages or thinking habits, or other skills that take root early, but having tackled an undergrad maths syllabus at both ages 18 and 28, I've found an adult work ethic beats the pants off youthful 'plasticity' any day of the week. Any skillset mandatory to a specialised vocation will probably mostly be learned well into adulthood anyway.
I suspect you would find it very difficult if you continued to adjust your speech patterns to accommodate every irregular use of the English language you'd heard since the day you were born. Your ability to rapidly learn language stopped for a reason.
I'm tempted to call this a just-not-so story.
Not only do I disagree with the general point (about "rigidity" being advantageous), but my sense is that language is probably one of the worst examples you could have used to support this position.
It strikes me as wrong on at least 4 different levels, which I shall list in increasing order of importance:
(1) I don't think it would be particularly difficult at all. (I.e. I see no advantage in the loss of linguistic ability.)
(2) People probably do continue to adjust their speech patterns throughout their lives.
(3) Children do not "accommodate every irregular use [they have] heard since the day [they] were born". Instead, their language use develops according to systematic rules.
(4) There is a strong prior against the loss of an ability being an adaptation -- by default, a better explanation is that there was insufficient selection pressure for the ability to be maintained (since abilities are usually costly).
So, unless you're basing this on large amounts of data that I don't know about, I feel obliged to wag my finger here.
how completely ridiculous it is to ask high school students to decide what they want to do with the rest of their lives
One common answer to that is to become a dropout, try a career or two to find out where your talents really lie, and then go for that. You can usually go back to school for an education once you've figured out which one you need.
It doesn't even seem as if it would be very hard to build that right into the system. Doing it the artisanal way takes longer, generates more stress, loses more income.
Tentatively, thinking of my own experience, I'd point to the competitiveness of the system as the driving force. I had some smarts but school didn't suit me much. There were a bunch of things I was interested in - computers, AI, writing sci-fi, evolutionary biology - and I had no clear idea what I should do when I turned 18.
My parents' reasoning was "Most of your interests are scientific, so, the best way to keep your options open is to enrol in the top engineering schools, then you can have your pick of careers later". One problem with that is that these schools aren't a place for learning while you keep your options open. They are, basically, a sorting process, get...
When I get insignificant amounts of change, like a nickel, I leave it on the nearest outdoor object, thus teaching people that they can collect small amounts of money by searching random objects.
I randomly assault people who wander outside of the city centre, but only if they look strong enough to kill me easily.
For my part, I force rabid dogs to swallow gold coins. Defeating an aggressive dog ought to earn you both experience and precious metals.
I do that too. Also, I have a sign by my front door that reads "no plot here" so that wandering adventurers aren't tempted to investigate too deeply and make a mess of my house.
I once decided to search my sofa and found a pair of nunchaku there.
Whenever anyone comes up to me but doesn't say anything, I repeat the same phrase over and over again.
Games often fall into the trap of optimizing for addictiveness, which is not quite the same thing as pleasure. Jonathan Blow has talked about this, and I think there is a lot of merit in his arguments:
He clarified, "I’m not saying [rewards are] bad, I’m saying you can divide them into two categories – some are like foods that are naturally beneficial and can increase your life, but some are like drugs."
Continued Blow, "As game designers, we don’t know how to make food, so we resort to drugs all the time. It shows in the discontent at the state of games – Radosh wanted food, but Halo 3 was just giving him cheap drugs."
...
Blow believes that according to WoW, the game's rules are its meaning of life. "The meaning of life in WoW is you’re some schmo that doesn’t have anything better to do than sit around pressing a button and killing imaginary monsters," he explained. "It doesn’t matter if you’re smart or how adept you are, it’s just how much time you sink in. You don’t need to do anything exceptional, you just need to run the treadmill like everyone else."
I work in the games industry and I see this pattern at work a lot from many designers.
And that great mathematical analysis is being directed at solving meaningless made-up problems that generate no value for the world. It's pure consumption, zero production. Yet it's complicated, goal-oriented consumption, it feels like doing work, and hence scratches the productive itch for many people...without actually doing any good in the world.
It's a powerful opiate (a drug that makes the time pass pleasantly and that users want to take all the time, as opposed to psychedelics, which are used occasionally and make the rest of your life better). Which, I believe, puts it on the side of evil, not of good.
Here's one that's particularly sinister, and shows up in nearly every RPG and MMORPG:
Progress is tied to presence, not performance.
In these games, as long as you're there, in front of the screen, you're making progress. Skill is almost never involved in success - if you can't beat that boss you just need to go kill a few thousand boars and level up a little bit, and if you want that sweet equipment you just have to put in the hours to grind for gold, or faction, or honor, or whatever.
In the real world, getting better at something generally takes actual work, and only occurs under specific conditions of deliberate practice and proper feedback. But it's so easy to fall into the trap of "hey, I'm doing something tangentially related to goal x or skill z, I must be making progress at it".
Well, that's not entirely unrealistic. As Woody Allen said, half of life is just showing up. (Ask Eliezer what he thinks about school...)
Blizzard's own Starcraft is competitive and very fast-paced, and yet it has continent-wide servers all the same.
A better reason for the perverse nature of MMOs is that the promise of guaranteed progress, especially combined with social obligations, is much more effective at keeping people paying their monthly fees than the hope of personal improvement.
It's interesting to see what happens when videogames behave more like real life. For instance, in Oblivion (and Fallout 3), you can't just take things unless you're in the middle of nowhere. If someone sees you, they cry out "stop, thief!". Equally, attacking people who didn't attack you first in civilised areas will draw the guard or vigilantes down on your head, and most of the stuff you find lying around is worthless trash that isn't worth the effort to haul away and sell.
I remember how jarring it was when I first tried to take something in Oblivion, only for a bystander to call for the guard. And then I realised that this is how NPCs should react to casual theft.
The most insidious of these misguiding heuristics have, apparently due to their transparency (like water to a fish), gone unmentioned so far in this thread.
Typical game play shares much in common with typical schooling. Children are inculcated with impressions of a world of levels that can (and should) be ascended through mastery of skills corresponding to challenges presented to them at each level, with right action leading to convergence on right answers, within an effectively fixed and ultimately knowable context.
Contrast this with the "real world", where challenges are not presented but encountered, where it's generally better to do the right thing than to do things right, within a diverging context of increasing uncertainty.
One thing this essay does not address is whether humans actually are likely to learn heuristics from playing videogames or whether a large enough fraction of the population plays videogames for this to be a real concern.
Let's briefly address that: there's a fair bit of evidence that much of "play" behavior across a wide variety of species exists specifically to learn behaviors and rules for actual life events. For example, wolf cubs engage in mock fights which prepare them for more serious encounters. Some species of corvids (crows, ravens, jays, etc.) will actively play with the large predators in their area (pecking at their tails, for example, or dropping objects on their faces), in an apparent attempt to learn the predators' general behavior, which matters because these corvids get much of their food from scavenging. It is likely that humans engage in play behavior in part for similar reasons. If so, there's a real danger of people learning bad heuristics from videogames.
What percentage of the population plays videogames? A quick Google search turns up various numbers which disagree but it seems that they vary from around a third to slightly over half. See for example here. Given that, this seems like a common enough issue to be worth discussing.
I agree with everything you said. We should be especially cautious about playing so-called casual games, where the modal player very quickly reaches a plateau beyond which he or she will learn nothing, much less learn anything useful.
The difference of course is that the learning process in Real Life is slooooow. In game X, after 30 hours of play, the modal player may be one or two orders of magnitude better at a given skill (one that is at least somewhat unique to the game) than someone who has been playing for two hours. Some games (e.g., some first-person shooters) require no unique skills; I suspect the skill curve looks similar, but most players are hundreds of hours along it rather than just a few, so the curve appears flatter and the differences proportionally smaller.
Contrast that to life: in mine, the skills I am trying to cultivate are the same ones that I've been trying to cultivate for years, and my improvement is sometimes so glacial that I feel doubt as to whether I'm getting better at all. I could just be thousands of hours along similarly shaped curves, but I have certainly reached the point where I no longer see incremental improvement: all I see anymore are occasio...
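One way to see why the in-game curve feels so much steeper: the toy sketch below models skill as a power law of hours invested, with an exponent I made up purely for illustration. The same curve produces dramatic multipliers over the first few dozen hours and nearly invisible ones a few thousand hours in.

```python
# Toy sketch: skill as a power law of hours practiced. The exponent 0.4 is
# invented; only the shape of the curve matters for the argument.

def skill(hours, exponent=0.4):
    return hours ** exponent

# An early-game jump vs. same-length stretches thousands of hours later.
for start, end in [(2, 30), (1000, 1028), (5000, 5028)]:
    multiplier = skill(end) / skill(start)
    print(f"hours {start:>4} -> {end:>4}: {multiplier:.2f}x better")
```

With these made-up numbers, the first interval gives roughly a 3x improvement while the later ones give barely more than 1x, which matches the feeling of glacial progress described above even though the curve has the same shape throughout.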
As I was reading this, I realized that many of the points here apply heartily to single-player games, but the reverse is often true of MMOs.
A while back I spent a few years playing World of Warcraft, and ended up doing mid to high level raids.
When leveling, or completing a raid, you do know your purpose, and it is handed down from on high. This is unrealistic, but possibly one of the most relaxing aspects of escapism.
You DO NOT delay or take forever! While leveling or raiding, it is important to do things efficiently to meet your goals quickly. You want to hit max level ASAP, not see the whole low-level world; you want to see the whole high-level world.
When leveling or raiding, there is usually a specific build that is more powerful than the others. You have choices between various builds, but never more than 3 per character class, and usually the 3 are vastly different and you must choose one of them. For example, every rogue ever would take a talent that gives them +5 attack speed, but taking a bonus to speed while in stealth would get you kicked out of a hardcore guild.
In raiding, the difficulty isn't (strictly) progressive. Some fights are easier, some are harder. Some a...
I don't play a lot of video games, but I'm quite fond of strategy, and have recently become besotted with Starcraft 2. Something that struck me while looking through the online strategy community was how ruthlessly empirical that community was.
It shouldn't be too surprising. Players are provided with an environment governed by the immutable laws of the game engine, and the only objective is to win. You can accomplish this however you like: efficient micromanagement, complementary unit selection, economic superiority, stealth tactics, mind games, aggressive map dominance, off-the-wall strategy your opponent can't plan for...however you manage it, provided you're the one left standing at the end, you win.
As a result, players continually test the behaviour of the environment, see what works and throw away what doesn't. This often involves setting up contrived scenarios to explicitly test certain theories. The result is a massive body of knowledge on how to effectively win the game.
I would say it's kind of heartening to find that, when given proper incentive, even people with (presumably) no formal scientific training can apply systematic methods to test the behaviour of their environment, though I don't know how much overlap there is between the SC community and the scientific community.
Getting good isn't-- see Talent Is Overrated for details about the 10,000 hours to mastery theory.
People tend to prefer grinding over developing relevant sub-skills by experimentation, but the latter is what works.
On the other hand, good videogames can be a cool tool for low-risk self-improvement.
I've historically had a lot of trouble focusing on one thing at a time - choosing a major, minimizing my areas of focus. I recently played KOTOR, and realized that I play videogames the exact same way. I can never commit to one class/alignment/weapon specialization at a time, and I suffer for it.
Recognizing the similarities, I decided to play KOTOR as a specialist in one alignment, one class, and one weapon type, ignoring tantalizing opportunities to generalize whenever possible. I ended up enjoying the game a lot more than I usually do.
Three weeks later I chose my major, and I honestly believe KOTOR helped.
I would like to commend you for taking the time to include the penultimate paragraph. I think it extremely worth pointing out that not everything that happens in games is likely to manifest in seemingly-analogous real world decisions.
The good news about most of these biases is that they are quite testable. I would love to see some research about the decision making processes of video game enthusiasts (particularly those who started at an early age) and a control group.
Thanks to Ralith, Lark, neptunepink and nshepperd for their feedback on the first draft of this article in the #lesswrong IRC channel. The IRC channel is a good way to get early feedback on posts, and discussing it there revealed several important flaws in the writing that have been fixed.
Hello, player character, and welcome to the Mazes of Menace!
I'm surprised that you didn't mention NetHack, and that nobody else has either, given that it contains the Mazes of Menace and provides counterexamples to many of your points.
Because your goal is not to reach the end quickly
In NetHack, the goal of beginners is to ascend, i.e. win, and it is very difficult. (I have not yet ascended; the furthest I've gotten is level 27, with 422434 points.) The goal of intermediate players is to ascend quickly. And the goal of advanced players is to ascend u...
Similarly, if you're on a quest to save the world, you do side-quests to put it off as long as possible
I've explicitly made note of this fact, that one should do quests in exactly reverse order of importance, in every cRPG I've ever played. That's because making progress on major quests will often change the game (lock you out of an area, say, or kill an NPC) such that you can no longer complete some minor quests if you haven't done them already.
Modern designers have finally started to take account of this. In Mass Effect 2, you do almost all of your side-questing while you wait for your employer to gather information about the main problem. Once the party does get started, the game makes it emphatically clear that waiting any more than absolutely necessary is going to severely compromise your primary mission.
I guess I might as well post about my own experiences, even though I'm probably not a typical game player:
I noticed myself developing the habit of seeking the dead ends first in video games, but I thought that it was just a bad habit that I developed, and that most other people don't play like that. My brother doesn't play like that. But I continue using this strategy even in games where there isn't a reward at the dead ends. I deliberately choose the path that's more likely to be a dead end first, just for my own peace of mind, to know that it's a dea...
Good post. One other thing that should be said has to do with the /why/. Why do we design many games like this? There are some obvious reasons: it's easier, it's fun, it plays on our natural reward mechanisms, etc. A perhaps less obvious one: it reflects the world as many /wish it could be/. Straightforward; full of definite, predefined goals; having well known, well understood challenges; having predictable rewards that are trivial to compare to others; having a very linear path for "progression" (via leveling up, attribute increases, etc.). A world with a WHOLE lot fewer variables.
Back around 1990, there was a school of game design that said that a game should be immersive, and to be immersive, it should stop reminding you that it's a game by making you throw away all real-life conventions. So this school of game design said things like:
You should not have to examine everything in the game. You should do just fine in the game by examining only objects that a reasonable person would examine.
You should not have to die in order to learn something needed for the game.
You should usually be punished for theft, breaking and entering.
In the real world, getting rid of junk costs money in effort and disposal fees instead.
In the real world, you can sell your old stuff. People just don't. Perhaps games can teach them that it is a good idea, even if it's for a fraction of the price you bought it for.
I'd say the worst habit of thought promoted by computer games is that if you do something disastrously foolish or clumsy, you can conveniently restart from a recently saved position. Clearly, that doesn't help one develop a good attitude towards the possibility of blunders in real life. (Though I should add that I haven't played any computer games in almost a decade, and I don't know if the basic concepts have changed since then.)
I'm rather amused to be reading this for the first time while wearing my 'Things You Learn From Video Games' shirt...
Hmm, I've never really confused my life with in-game life before, but I wonder if I maybe do it on the subconscious level. An interesting note: when I tried playing Morrowind (for those who don't know, that game has a huge open world with many huge areas that are there for no reason other than to add realism), I had a sort of paralysis, because I had to explore every room and open every door, but that's simply impossible in that game.
It should be noted that some of these seem specific to games with a levelling/upgrade system, and in particular ones that you don't know in advance / are not really intended for replay.
For most games, there's a guide that explains exactly how to complete your objective perfectly, but to read it would be cheating. Your goal is not to master the game, but to experience the process of mastering the game as laid out by the game's designers, without outside interference. In the real world, if there's a guide for a skill you want to learn, you read it.
This doesn't sound like how people actually use them?
If it's a puzzle, then sure, figuring it out yourself can be fun. But if you get stuck and want to move on...then don't you pull out a guide?
(...
Applying this reasoning to the real world would mean choosing a career without bothering to find out what sort of salary and lifestyle it supports; but things in the real world are almost never balanced in this sense. (Many people, in fact, do not do this research, which is why colleges turn out so many English majors.)
It would have been worth noting that there are valid criteria for choice of university courses other than these. As it is, this section looks rather philistine.
If you think people believe that RPG classes are balanced, you obviously haven't spent much time reading game forums! "Imba"ness, real or perceived, is probably the #1 topic of discussion for most multiplayer games.
Hello, player character, and welcome to the Mazes of Menace! Your goal is to get to the center and defeat the Big Bad. You know this is your goal because you received a message from a very authoritative source that said so. Alas, the maze is filled with guards and traps that make every step dangerous. You have reached an intersection, and there are two doors before you. Door A leads towards the center; it probably takes you to your destination. Door B leads away from the center; it could loop back, but it's probably a dead end. Which door do you choose?
The correct answer, and the answer which every habitual video game player will instinctively choose, is door B: the probable dead end. Because your goal is not to reach the end quickly, but to search as much of the maze's area as you can, and by RPG genre convention, dead ends come with treasure. Similarly, if you're on a quest to save the world, you do side-quests to put it off as long as possible, because you're optimizing for fraction-of-content-seen, rather than probability-world-is-saved, which is 1.0 from the very beginning.
If you optimize for one thing, while thinking that you're optimizing something else, then you may generate incorrect subgoals and heuristics. If seen clearly, the doors represent a trade-off between time spent and area explored. But what happens if that trade-off is never acknowledged, and you can't see the situation for what it really is? Then you're loading garbage into your goal system. I'm writing this because someone reported what looks like a video game heuristic leaking into the real world. While this hasn't been studied, it could plausibly be a common problem. Here are some of the common memetic hazards I've found in video games.
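To make the garbage-in-the-goal-system point concrete, here is a minimal sketch with invented probabilities and room counts: the same pair of doors, ranked under the objective the quest-giver announced versus the objective the genre actually rewards.

```python
# Minimal sketch; both probabilities and room counts are hypothetical.
# The same action ranks differently depending on which quantity you are
# actually maximizing.

doors = {
    "A (toward the center)": {"p_reach_goal": 0.8, "expected_new_rooms": 3},
    "B (probable dead end)": {"p_reach_goal": 0.1, "expected_new_rooms": 8},
}

def best_door(objective):
    return max(doors, key=lambda name: doors[name][objective])

print(best_door("p_reach_goal"))        # the stated goal: defeat the Big Bad
print(best_door("expected_new_rooms"))  # the goal the genre convention rewards
```

If the player never notices that the second objective is the one driving their choices, they get exactly the unexamined trade-off described above.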
For most games, there's a guide that explains exactly how to complete your objective perfectly, but to read it would be cheating. Your goal is not to master the game, but to experience the process of mastering the game as laid out by the game's designers, without outside interference. In the real world, if there's a guide for a skill you want to learn, you read it.
Permanent choices can be chosen arbitrarily on a whim, or based solely on what you think best matches your style, and you don't need to research which is better. This is because in games, the classes, skills, races and alignments are meant to be balanced, so they're all close to equally good. Applying this reasoning to the real world would mean choosing a career without bothering to find out what sort of salary and lifestyle it supports; but things in the real world are almost never balanced in this sense. (Many people, in fact, do not do this research, which is why colleges turn out so many English majors.)
Tasks are arranged in order of difficulty, from easiest to hardest. If you try something and it's too hard, then you must have taken a wrong turn into an area you're not supposed to be in yet. When playing a game, level ten is harder than level nine, and a shortcut from level one to level ten is a bad idea. Reality is the opposite: most of the difficulty comes up front, and things get easier as you learn. When writing a book, chapter ten is easier to write than chapter nine. Games teach us to expect an easy start and a tough finale; this makes the tough starts that reality actually offers more discouraging.
You shouldn't save gold pieces, because they lose their value quickly to inflation as you level. Treating real-world currency that way would be irresponsible. You should collect junk, since even useless items can be sold to vendors for in-game money. In the real world, getting rid of junk costs money in effort and disposal fees instead.
These oddities are dangerous only when they are both confusing and unknown, and to illustrate the contrast, here is one more example. There are hordes of creatures that look just like humans, except that they attack on sight and have no moral significance. Objects which are not nailed down are unowned and may be claimed without legal repercussions, and homes which are not locked may be explored. But no one would ever confuse killing an NPC for real murder, nor clicking an item for larceny, nor exploring a level for burglary; these actions are so dissimilar that there is no possible confusion.
But remember that search is not like exploration, manuals are not cheats, careers are not balanced, difficulty is front-loaded, and dollars do not inflate like gold pieces. Because these distinctions are tricky, and failing to make them can have consequences.