A Gamification Of Education: a modest proposal based on the Universal Decimal Classification and RPG skill trees

13 Ritalin 07 July 2013 06:27PM

While taking inventory of my personal library and classifying it according to the Universal Decimal Classification, I found myself discovering a systematized classification of the fields of knowledge, nested and organized and intricate, many of which I didn't even know existed. I couldn't help but compare how information was classified therein with how it was imparted to me in engineering school. I also thought about how software engineers and computer scientists are often mostly self-taught, with even college largely consisting of "here's a problem: go forth and figure out a way to solve it". This made me wonder whether another way of certified and certifiable education couldn't be achieved, and a couple of ideas sort of came to me.

It's still pretty nebulous in my mind, but the crux of the concept would be a modular structure of education, where the academic institution essentially establishes precisely what information you need from each module and lets you get on with the activity of learning, with periodic exams that you can sign up for, which certify your level and area of proficiency in each module.

A recommended tree of learning can be established, but it should be possible to skip intermediate tests if passing the final test proves that you would have passed all the ones before it (this would allow people coming from different academic systems to certify their knowledge quickly and easily, thus avoiding the classic "Doctor of Physics in the former Soviet Union, current taxi driver in New York" scenario).

Thus, a universal standard of how much you have proven to know about what topics can be established.

Employers would then be free to request profiles in the format of such a tree. It need not be a binary "you need to have done all these courses, and only these courses, to work for us": they would be free to write their utility function for this or that job however they saw fit, with whichever weights and restrictions they needed.
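The idea of an employer-written utility function over a certification tree can be sketched concretely. This is a minimal illustration with invented UDC-style module codes, weights, and minimum levels; nothing here reflects any real employer's requirements or any real certification scale:

```python
# Hypothetical sketch: an employer scores candidates by weighting their
# certified proficiency levels (0-10) in UDC-style modules.
# Module codes, weights, and minimums are invented for illustration.

REQUIREMENTS = {
    "621.3": {"weight": 3.0, "minimum": 6},  # electrical engineering
    "519.2": {"weight": 1.5, "minimum": 0},  # probability & statistics
    "005.9": {"weight": 1.0, "minimum": 0},  # management
}

def score(candidate):
    """Return a weighted score, or None if a hard minimum is unmet."""
    total = 0.0
    for module, spec in REQUIREMENTS.items():
        level = candidate.get(module, 0)
        if level < spec["minimum"]:
            return None  # hard restriction: candidate is disqualified
        total += spec["weight"] * level
    return total

alice = {"621.3": 8, "519.2": 5, "005.9": 2}
bob = {"621.3": 4, "519.2": 9}
print(score(alice))  # 33.5
print(score(bob))    # None: below the hard minimum in module 621.3
```

The point of the sketch is that "weights and restrictions" need not be a rigid course checklist: a hard minimum expresses a restriction, while the weights express preferences over everything else.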

Students and other learners would be free to advance in whichever tree they required, depending on what kind of profile they want to end up with at what age or point in time. One would determine what to learn based on statistical studies of which elements are, by and large, most desired by employers in, or most predictive of professional success in, the field one wants to work in.

One would find, for example, that mastering the peculiar field of railway engineering is great for becoming a proficient railway engineer, but also that having studied, say, subjects involving people skills (from rhetoric to psychology to management) correlates positively with success in that field.

Conversely, a painter may find that learning about statistics, market predictions, web design, or cognitive biases correlates with a more successful career (whether it be on terms of income, or in terms of copies sold, or of public exposure... each one may optimize their own learning according to their own criteria).

One might even be able to calculate whether such complementary education is actually worth one's time, and which subjects are the most cost-efficient.

I would predict that such a system would help society overall optimize how many people know what skills, and facilitate the learning of new skills and the updating of old ones for everyone, thus reducing structural unemployment, and preventing pigeonholing and other forms of professional arthritis.

I would even dare to predict that, given the vague, statistical, cluster-ish nature of this system, people would be encouraged to learn quite a lot more, and across a much wider range of fields, than they do now, when one must jump through a great many hoops and endure a great many constraints in space and time and coin to get access to some types of education (and to acknowledgement of having acquired them).

Acquiring access to the actual sources of knowledge, a library (virtual or otherwise), lectures (virtual or otherwise), and so on, would be a private matter, up to the learner:

  • some of them already have the knowledge and just need to get it certified;
  • others can buy the books they want or need, especially if keeping them around as reference will be useful in the future;
  • others can subscribe to one or many libraries, of the on-site sort or by correspondence;
  • others can buy access to pre-recorded lectures, peruse lectures that are available for free, or enroll in academic institutions whose ostensible purpose is to give lectures and/or otherwise guide students through learning, more or less closely;
  • the same applies to finding study groups with whom you can work on a topic together: I can easily imagine dedicated social networks being created for that purpose, helping people pair up based on mutual distance, predicted personal affinity, mutual goals, backgrounds, and so on. Who knows what amazing research teams might be born of the intellectual equivalent of OkCupid?

One thing I would like very much about this system is that it would do away with the strange conflicts of interest that hamper the functioning of traditional educational institutions.

When the ones who teach you are also the ones who grade you, the effort they invest in you can feel like a zero-sum game, especially if they are only allowed to let a percentage of you pass.

When the ones who teach you have priorities other than teaching (usually research, but some teachers are also involved in administrative functions, or even private interests completely outside the university's ivory tower1), this can and often does reduce the energy and dedication they can or will allocate to the actual function of teaching, as opposed to the others.

By separating these functions, and the contradictory incentives they provide, the organizations performing them are free to optimize for each: 

  • Testing is optimized for predicting current and future competence in a subject: the testers whose tests are the most reliable have more employers requiring their certificates, and thus more people requesting to be tested by them.
  • Teaching is optimized for getting the knowledge through whatever the heck the students want, whether it be to succeed at the tests or to simply master the subject (I don't know much game theory, but I'd naively guess that the spontaneous equilibrium between the teaching and testing institutions would lead to both goals becoming identical).
  • Researching is optimized for research (researchers are not teachers; dang it, those are very different skill-sets!). However, researchers and other experts get to have a pretty big say in what the tests test for and how, because their involvement makes the tests more trustworthy for employers, and because they, too, are employers.
  • And of course entire meta-institutions can spring from this, whose role is to statistically verify, over the long term,
    • how well passing the corresponding test predicts professional success in this or that field,
    • how well being taught by this or that teaching institution predicts passing the test, and
    • how well the input of these or those researchers and experts predicts the test's reliability.
  • It occurs to me now that, if one wished to be really nitpicky about who watches the watchmen, there would have to be institutions testing the reliability of those meta-institutions, and so on and so forth... Where does it stop? How do we avoid vested interests and little cheats and manipulations pulling an academic equivalent of the AAA certification of sub-prime junk debt in 2008?

Another discrepancy I'd like to see solved is the difference between the official time it is supposed to take to obtain this or that degree, or to learn this or that subject, and the actual statistical distribution of that time. Nowadays, a degree that's supposed to take you five years ends up taking up eight or ten years of your life. You find yourself having to go through the most difficult subjects again and again, because they are explained in an extremely rushed way, the materials crammed into a pre-formatted time. Other subjects are so exceedingly easy and thinly spread that you find that going to class is a waste of time, and that you're better off preparing for them one week before finals. Now, after having written all of the above, my mind is quite spent, and I don't feel capable of either anticipating the effect of my proposed idea on this particular problem or offering any solutions. Nevertheless, I wish to draw attention to it, so I'm leaving this paragraph in until I can amend it to something more useful/promising.

I hereby submit this idea to the LW community for screening and sound-boarding. I apologize in advance for your time, just in case this idea appears to be flawed enough to be unsalvageable. If you deem the concept good but flawed, we could perhaps work on ironing out those kinks together. If, afterwards, this seems to you like a good enough idea to implement, know that good proposals are a dime a dozen; if there is any interest in seeing something like this happen, we would need to move on to properly understanding the current state of secondary and higher education, and figuring out what incentives/powers/leverages are needed to actually get it implemented.


1By ivory tower I simply mean the protected environment where professors teach, researchers research, and students study, with multiple buffers between it and the ebb and flow of political, economic, and social turmoil. No value judgement is intended.


EDIT: And now I look upon the title of this article and realize that, though I had comparisons to games in mind, I never got around to writing them down. My inspirations here were mostly Civilization's Research Trees, RPG Skill Scores and Perks, and, in particular, Skyrim's skills and perks tree.

Basically, your level at whatever skill improves by studying and by practising it rather than merely by levelling up, and, when you need to perform a task that's outside your profile, you can go and learn it without having to commit to a class. Knowing the right combination of skills at the right level lets you unlock perks or access previously-unavailable skills and applications. What I like the most about it is that there's a lot of freedom to learn what you want and be who you want to be according to your own tastes and wishes, but, overall, it sounds sensible and is relatively well-balanced. And of course there's the fact that it allows you to keep a careful tally of how good you are at what things, and the sense of accomplishment is so motivating and encouraging!
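The perk-unlocking mechanic described above can be sketched as a small data structure. All skill names, perks, and level thresholds here are invented for illustration; the point is only that "knowing the right combination of skills at the right level" is a simple, checkable condition:

```python
# Hypothetical sketch of a Skyrim-style skill/perk tree for learning:
# a perk unlocks when the learner's certified skill levels meet all of
# its prerequisites. Names and thresholds are invented.

PERK_TREE = {
    "read_schematics": {"math": 2, "physics": 1},
    "design_filters":  {"math": 5, "electronics": 3},
    "build_radios":    {"electronics": 4, "physics": 3},
}

def unlocked_perks(skills):
    """Return the set of perks whose every prerequisite level is met."""
    return {
        perk for perk, prereqs in PERK_TREE.items()
        if all(skills.get(s, 0) >= lvl for s, lvl in prereqs.items())
    }

learner = {"math": 5, "physics": 3, "electronics": 3}
print(sorted(unlocked_perks(learner)))
# ['design_filters', 'read_schematics']
```

Raising any one skill can unlock several perks at once, which is exactly the "combination" effect that makes game skill trees feel rewarding.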

Speaking of which, several networks' and consoles' Achievement systems also strike me as motivators: for keeping track of what one has achieved so far, to look back and be able to say "I've come a long way" (in an effect similar to that of gratitude journals), and also to accomplish a task and have immediate and universal acknowledgement that you did it, dammit (and, for those who care about that kind of thing, the chance to rub it in the face of those who haven't).

I would think our educational systems could benefit from this kind of modularity and from this ability to keep track of things in a systematic way. What do you guys think?

Complexity: inherent, created, and hidden

8 Swimmer963 14 September 2011 02:33PM

Related to: inferential distance, fun theory sequence.

“The arrow of human history…points towards larger quantities of non-zero-sumness. As history progresses, human beings find themselves playing non-zero-sum games with more and more other human beings. Interdependence expands, and social complexity grows in scope and depth.” (Robert Wright, Nonzero: The Logic of Human Destiny.)

What does it mean for a human society to be more complex? Where does new information come from, and where in the system is it stored? What does it mean for everyday people to live in a simple versus a complex society?

There are certain kinds of complexity that are inherent in the environment: that existed before there were human societies at all, and would go on existing without those societies. Even the simplest human society needs to be able to adapt to these factors in order to survive. For example: climate and weather are necessary features of the planet, and humans still spend huge amounts of resources dealing with changing seasons, droughts, and the extremes of heat and cold. Certain plants grow in certain types of soil, and different animals have different migratory patterns. Even the most basic hunter-gatherer groups needed to store and pass on knowledge of these patterns. 

But even early human societies had a lot more than the minimum amount of knowledge required to live in a particular environment. Cultural complexity, in the form of traditions, conventions, rituals, and social roles, added to technological complexity, in the form of tools designed for particular purposes. Living in an agricultural society with division of labour and various different social roles required children to learn more than if they had been born to a small hunter-gatherer band. And although everyone in a village might have the same knowledge about the world, it was (probably) no longer possible for all the procedural skills taught and passed on in a given group to be mastered by a single person. (Imagine learning all the skills to be a farmer, carpenter, metalworker, weaver, baker, potter, and probably a half-dozen other things.)

This would have been the real beginning of Robert Wright’s interdependence and non-zero-sum interactions. No individual could possess all of the knowledge/complexity of their society, but every individual would benefit from its existence, at the price of a slightly longer education or apprenticeship than their counterparts in hunter-gatherer groups. The complexity was hidden; a person could wear a robe without knowing how to weave it, and use a clay bowl without knowing how to shape it or bake it in a kiln. There was room for that knowledge in other people’s brains. The only downside, other than slightly longer investments in education, was a small increase in inferential distance between individuals.

Writing was the next step. For the first time, a significant amount of knowledge could be stored outside of anyone’s brain. Information could be passed on from one individual, the writer, to a nearly unbounded number of others, the readers. Considering the limits of human working memory, significant mathematical discoveries would have been impossible before there was a form of notation. (Imagine solving polynomial equations without pencil and paper.) And for the first time, knowledge was cumulative. An individual no longer had to spend a number of years mastering a particular, specific skill in an apprenticeship, having to laboriously pass on any new discoveries one at a time to their own apprentices. The new generation could start where the previous generation had left off. Knowledge could stay alive indefinitely, almost, in writing, without having to pass through a continuous line of minds. (Without writing, even if the ancient Greek society had possessed equivalent scientific and mathematical knowledge, it could not have later been rediscovered by any other society.) Conditions were ripe for the total sum of human knowledge to explode, and for complexity to increase rapidly.

The downside was a huge increase in inferential distance. For the first time, not only could individuals lack a particular procedural skill, they might not even know that the skill existed. They might not even benefit from the fact of its existence. The stock market contains a huge amount of knowledge and complexity, and provides non-zero-sum gains to many individuals (as well as zero-sum gains to some individuals). But to understand it requires enough education and training that most individuals can’t participate. The difference between the medical knowledge of professionals versus uneducated individuals is huge, and I expect that many people suffer because, although someone knows how they could avoid or solve their medical problems, they don’t.  Computers, aside from being really nifty, are also incredibly useful, but learning to use them well is challenging enough that a lot of people, especially older people, don’t or can’t.

(That being said, nearly everyone in Western nations benefits from living here and now, instead of in an agricultural village 4000 years ago. Think of the complexity embodied in the justice system and the health care system, both of which make life easier and safer for nearly everyone regardless of whether they actually train as professionals in those domains. But people don’t benefit as much as they could.)

Is there any way to avoid this? It’s probably impossible for an individual to have even superficial understanding in every domain of knowledge, much less the level of understanding required to benefit from that knowledge. Just keeping up with day-to-day life (managing finances, holding a job, and trying to socialize in an environment vastly different from the ancestral one) can be trying, especially for individuals on the lower end of the IQ bell curve. (I hate the idea of intelligence, something not under the individual’s control and thus unfair-seeming, being that important to success, but I’m pretty sure it’s true.) This might be why so many people are unhappy. Without regressing to a less complex kind of society, is there anything we can do?

I think the answer is quite clear, because even as societies become more complex, the arrow of daily-life-difficulty-level doesn’t always go in the same direction. There are various examples of this; computers becoming more user-friendly with time, for example. But I’ll use an example that comes readily to mind for me: automated external defibrillators, or AEDs.

A defibrillator uses electricity to interrupt an abnormal heart rhythm (ventricular fibrillation is the typical example, thus de-fibrillation). External means that the device acts from outside the patient’s body (pads with electrodes on the skin) rather than being implanted. Most defibrillators require training to use and can cause a lot of harm if they’re used wrong. The automated part is what changes this. AEDs will analyze a patient’s heart rhythm, and they will only shock if it is necessary. They have colorful diagrams and recorded verbal instructions. There’s probably a way to use an AED wrong, but you would have to be very creative to find it. Needless to say, the technology involved is ridiculously complex and took years to develop, but you don’t need to understand the science involved in order to use an AED. You probably don’t even need to read. The complexity is neatly hidden away; all that matters is that someone knows it. There weren't necessarily any ground-breaking innovations involved, just the knowledge of old inventions in a user-friendly format.

The difference is intelligence. An AED has some limited artificial intelligence in it, programmed in by people who knew what they were talking about, which is why it can replace the decision process that would otherwise be made by medical professionals. A book contains knowledge, but has to be read and interpreted in its entirety by a human brain. A device that has its own small brain doesn’t. This is probably the route where our society is headed if the arrow of (technological) complexity keeps going up. Societies need to be livable for human beings.

That being said, there is probably such a thing as too much hidden complexity. If most of the information in a given society is hidden, embodied by non-human intelligences, then life as a garden-variety human would be awfully boring. Which could be the main reason for exploring human cognitive enhancement, but that’s a whole different story.

Eric Drexler on Learning About Everything

30 Vladimir_Nesov 27 May 2009 12:57PM

Related to: The Simple Math of Everything, Your Strength as a Rationalist, Teaching the Unteachable.

Eric Drexler wrote a couple of articles on the importance and methods of obtaining interdisciplinary knowledge:

Note that the title above isn't "how to learn everything", but "how to learn about everything". The distinction I have in mind is between knowing the inside of a topic in deep detail — many facts and problem-solving skills — and knowing the structure and context of a topic: essential facts, what problems can be solved by the skilled, and how the topic fits with others.

This knowledge isn't superficial in a survey-course sense: It is about both deep structure and practical applications. Knowing about, in this sense, is crucial to understanding a new problem and what must be learned in more depth in order to solve it.

This topic was discussed intermittently on Overcoming Bias. Basic understanding of many fields allows one to recognize how well a problem is understood by science and to see its place in the structure of scientific knowledge; to develop a better intuitive grasp of what's possible and what's not; and to adequately perceive the natural world.

The advice he gives for obtaining general knowledge feels right, even for studying the topics that you intend to eventually understand in depth:

Don't drop a subject because you know you'd fail a test — instead, read other half-understandable journals and textbooks to accumulate vocabulary, perspective, and context.