Testing is optimized for predicting current and future competence in a subject:
What if this finds that the best predictors for most employers consist of two tests: (1) a <10 minute IQ test, and (2) a test showing a student's willingness to submit to authority and learn what he was told, in the manner in which he was told, without getting too far ahead of or behind his classmates?
I am willing to bet money (up to €100; I am a student) that these will not, in fact, be the best predictors, just as college grades have proven not to be good predictors of how well a Google employee does, IQ tests have proven not to be good predictors of how well a salesman does, and higher salaries have proven, past a certain point, to be detrimental to productivity for employees whose jobs require creativity, imagination, or other intellectual exertion.
My hypothesis is that a <10 min IQ test will be so inaccurate as to be useless, and that the current school system already favors people who do what they are told, as they are told, without getting too far ahead or behind, with the destructive and wasteful results we all know: they are the ones it grades best, and yet, time and again, when tested against the workplace, those grades have proven to mean little.
IQ test scores are massively correlated with workplace performance.
I find it plausible that, among the set of people Google has hired, grades don't predict workplace success, but I bet that if Google were to randomly select employees from the U.S. population (or even just from among computer science majors), grades would be hugely correlated with performance.
in my own experience and what I could get from the people that frequent LessWrong and TVTropes, a high IQ commonly results in [...] a miserable social life, and a boatload of akrasia [...] Perhaps there's a selection bias and only the incompetent brains have the time to hang out here[?]
There is massive selection bias going on here.
I don't think it has as much to do with free time as with target audience, though. LW attracts a few different clusters of people, but the ones you're seeing in this context are those who feel their thinking is flawed in some way, and who believe they have a decent chance of fixing it with a cognitive science toolset and vocabulary. The site's native idiom and interaction style -- basically a founder effect -- impose a few more filters, underrepresenting some problems and overrepresenting others. Akrasia and social problems are precisely the issues I'd expect to see a lot of, given those constraints.
TV Tropes... well, that might have more to do with free time. Almost everyone likes media, but if you want to make many original contributions, you need unusual knowledge of media and an analytical attitude towards it. Moreover, the media best represented there tend to be the most time-consuming ones -- TV, anime, doorstopper fantasy novels. I don't think I need to go into too much detail regarding the people most likely to share those requirements.
Off-topic post, but a discussion I wish to pursue nonetheless:
Regarding TVT: it used to be so. Nowadays the media studied in school and the "literary canon" are beginning to find their way in... and all those tropes with silly names, built from mass media, are proving their usefulness as tools of analysis. Of course, getting a movie adaptation or a TV miniseries is one of the best ways to draw troper attention to a work, but classics always end up getting those with some regularity. So let's just say that the user base has widened.

Oh, and many classics are as doorstoppery as modern fantasy sagas, especially stuff from the nineteenth century, when novels were published as long-running serials in magazines and authors were paid by the word. When people call Eliezer Yudkowsky a terrible writer because of MoR's lack of tightness or his using it as a vehicle for ideas and lectures, I feel half-tempted to point at the likes of Victor Hugo or Alexandre Dumas or Dickens or Benito Pérez Galdós, just to name a few... surely, if those are the traits of terrible writing, it means that those books have room for improvement, if only by way of abridging them?
Regarding LW: The filters the site imposes...
That is a dangerous state for a rationalist to be in. So would you please be a dear and have a look at that book, to figure out how precisely it junk-scienced me and the rest of its readers? It would really help me out, and I'd be grateful for it.
I'm afraid I haven't read that exact popularization, but if it's drawing on Ericsson's research, as it sounds like it is, the explanation is easy enough: Ericsson's results look stronger than they are largely because the studies are correlational, do not control for underlying factors or Matthew effects, and suffer from heavy range restriction, in that he's already looking at people who are selected or self-selected to be elites.
(i.e. suppose someone studied MIT physicists with a mean IQ of 150 and discovered that, in this group of physicists, Conscientiousness predicted better than IQ who would go on to win Nobels. This is a possible result, and what it has actually demonstrated is "you have to be incredibly brainy to be an MIT physicist in the first place, but once you've gotten that, then other things are also important"; which is another way of saying that if we look at the general population, like all the people from IQ 60 to 150, IQ is the overwhelming...
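To make the range-restriction mechanism concrete, here is a minimal simulation with invented effect sizes (not real data): in the whole population IQ dominates, but in a sample selected for very high IQ its apparent predictive power collapses while Conscientiousness holds up.

```python
# Toy range-restriction demo -- all numbers are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
iq = rng.normal(100, 15, n)                       # population IQ
consc = rng.normal(0, 1, n)                       # Conscientiousness, standardized
# Performance depends more on IQ than on Conscientiousness in this toy model.
perf = 0.6 * (iq - 100) / 15 + 0.4 * consc + rng.normal(0, 0.7, n)

def r(x, y):
    return round(np.corrcoef(x, y)[0, 1], 2)

print("whole population: r(IQ) =", r(iq, perf), " r(consc) =", r(consc, perf))
elite = iq > 140                                  # keep only the "elite" tail
print("IQ > 140 only:    r(IQ) =", r(iq[elite], perf[elite]),
      " r(consc) =", r(consc[elite], perf[elite]))
# Within the restricted group the IQ correlation shrinks drastically, even though
# IQ is what got everyone into the group in the first place.
```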
Nowadays I only believe in working as hard as possible for as long as possible, and it serves me much better.
Imagine a world where 50% of your results are genetically determined and 50% of your results are hard work. What would be the best strategy for success in that world, assuming that you already have decent genes? It would be working hard. Not working 50% hard, but working 100% hard.
Seems like you found the right strategy for the wrong reasons. You can keep the strategy; you don't have to blindly reverse your decisions.
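Spelled out as a toy calculation (the 50/50 split and the numbers are the commenter's hypothetical, not a real model): with the genetic term fixed, the result only goes up with effort, so maximum effort is optimal.

```python
# Toy version of the "50% genes, 50% work" argument; numbers are illustrative only.
genes = 0.7                        # fixed genetic endowment on a 0-1 scale (hypothetical)
for effort in (0.5, 1.0):
    result = 0.5 * genes + 0.5 * effort
    print(f"effort {effort:.0%}: result = {result:.2f}")
# effort 50%: result = 0.60
# effort 100%: result = 0.85 -- working 100% hard always wins while genes are fixed.
```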
Who knows what amazing research teams might be born of the intellectual equivalent of OkCupid.
This sounds like an awesome idea. Does anyone know if such a thing already exists? If not, I would be willing to commit $500 to someone who wants to make it (assuming they already have the appropriate skill set, of course).
I'm reminded of a couple of students at a German university last year who studied all the material by dividing up the classes between them and exchanging notes, took all the exams, and passed in a few months. The university then turned around and sued them for studying too fast.
http://www.thelocal.de/education/20120703-43517.html
It is not in a university's interest to do a good job. It's like any other company: the aim is to extract the most $$$ from you while giving up the least value in return.
In highly competitive markets this leads to fairly marginal profits. But formal education is not a competitive market. Not at all; it is very tightly regulated.
For most people, the competition in education is almost entirely about cost. They view education as an expense, not as an investment - which is quite reasonable when you consider the likely quality they're going to get - and aim to minimise that expense. Offer people a better education and the majority of them won't be willing to pay much for it; offer them a cheaper education, though, or a faster one - which amounts to more or less the same thing....
That's the financial side of things anyway, and one of the reasons I think your id...
I'm reminded of a couple of students at a German university last year who studied all the material by dividing up the classes between them and exchanging notes, took all the exams, and passed in a few months. The university then turned around and sued them for studying too fast.
That in turn reminds me of Nick Bostrom:
As an undergraduate, I studied many subjects in parallel, and I gather that my performance set a national record. I was once expelled for studying too much, after the head of Umeå University psychology department discovered that I was concurrently following several other full-time programs of study (physics, philosophy, and mathematical logic), which he believed to be psychologically impossible.
The main issue in implementing such a system is testing and certification. If you use quizzes, then there is nothing to stop the student from looking up the answers in another window. The system can still be built, but the tests need to be designed so that the answers can't simply be looked up. The solution is applied-knowledge problems.
For example, instead of asking someone about the physical properties of a metal, you would ask them which metal they would use for a certain construction, and why. Student programmers would be required to write short scripts. S...
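As a rough illustration of the kind of applied-knowledge item this might mean for student programmers, here is a hedged sketch (the grading function, the example task, and the hidden cases are all invented): the student has to produce working code, which can't be answered by looking something up in another window.

```python
# Sketch of auto-grading an applied programming item against hidden test cases.

def grade_submission(submitted_fn, hidden_cases):
    """Return the fraction of hidden test cases the submission passes."""
    passed = 0
    for args, expected in hidden_cases:
        try:
            if submitted_fn(*args) == expected:
                passed += 1
        except Exception:
            pass                     # a crash simply counts as a failed case
    return passed / len(hidden_cases)

# Example item: "write a function returning the median of a list of numbers".
hidden_cases = [(([1, 3, 2],), 2), (([4, 1, 2, 3],), 2.5), (([7],), 7)]

def student_answer(xs):              # what a student might submit
    xs = sorted(xs)
    mid = len(xs) // 2
    return xs[mid] if len(xs) % 2 else (xs[mid - 1] + xs[mid]) / 2

print(grade_submission(student_answer, hidden_cases))   # -> 1.0
```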
TL;DR: the difficulty of solving this problem is the availability of good data, NOT the lack of decent skill-set models.
So let's say you finish an outline of skills in these trees, as you suggest. Now, we want to make the statistical models that are at the core of your proposal: "One would determine what to learn based on statistical studies of what elements are, by and large, most desired by employers of/predictors of professional success in a certain field you want to work in." Where exactly do you plan to get the data to actually do this? ...
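For what it's worth, the model itself is indeed the easy part. Here is a minimal sketch, assuming (the big assumption the comment is pointing at) that you already had a table of certified skills per person plus some success outcome; every column name and coefficient below is synthetic.

```python
# Sketch: regress a success outcome on certified skills to estimate which skills
# predict professional success. The data here is simulated, which is exactly the problem.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000
skills = rng.integers(0, 2, size=(n, 3))        # did each person certify each module?
true_w = np.array([1.5, 0.3, 0.8])              # invented "real" importance per module
p_success = 1 / (1 + np.exp(-(skills @ true_w - 1.2)))
success = rng.random(n) < p_success

model = LogisticRegression().fit(skills, success)
names = ["railway_engineering", "statistics", "rhetoric"]
coefs = {name: round(float(c), 2) for name, c in zip(names, model.coef_[0])}
print(coefs)
# The recovered coefficients approximate how strongly each certification predicts
# success -- the quantity that is useless without good real-world data behind it.
```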
I'm starting a job in an "adaptive learning" startup soon, and many of the points you make here remind me of the things this company does or plans to do. The basic idea of the company is that it collects data about the student as he or she interacts with an electronic course, then uses this to personalize the course and make recommendations for the student's education path. This isn't quite the same as what you're suggesting, where a student independently finds educational content and then gets certified in those areas. However, there are several...
How about starting in a supportive domain that you happen to be interested in, where existing taxonomies and skill progressions already seem to exist?
Tech/skill trees help with the visualization process, and that goes a long way toward motivation. Much of my struggle in battling akrasia and learning math & beginner rationality is to try to stop imagining forbidding mountains of concepts, nomenclature and future practice, and instead focus on images ...
Going to college and getting a degree by the standard method serves as signalling; it shows that you are the kind of person who is willing to spend a lot of resources (both time and money) on your skills.
Any method of education which doesn't require as many resources from the student will inherently subvert this, and therefore won't be accepted by employers.
Testing and certification is an extremely difficult thing to get right, and indeed I'm not sure anyone really has done this. It takes a lot of effort to write good tests. It is almost impossible to simultaneously reduce both false positives and false negatives. If the test is standardized or repeated, teachers teach to and students train for the test in preference to the subject matter. Ideally you want the test to be such that someone who prepares for the test is also preparing for the skills you wish to measure.
Are there any existing tests (e.g. bar ex...
"Doctor in Physics from Former Soviet Union, current Taxi Driver in New York" is a problem only because "Doctor in Physics from Current United States, current Taxi Driver in New York" is also a problem. No one doubts the quality of physicists from the former Soviet Union as a group. It's just that there aren't enough jobs in physics for all the PhDs we graduate.
the AAA certification of sub-prime junk debt in 2008?
Nitpick: it wasn't the debt that was AAA-rated, it was certain derivatives based on the debt (which is much less unreasonable: that the AAA rating was undeserved does not follow simply from the fact that the underlying mortgages were sub-AAA).
Employers looking for candidates beyond the entry level tend to be interested in a candidate's experience and work history far more than in any result of a test. Actuaries may be one exception. Off the top of my head it is the only profession I can think of where promotion within the field is gated by tests, and not just admission to the field. However I am not an actuary, so I may not fully understand how this works.
FYI: I regularly recruit, interview, and hire candidates within my field. I would love to have a test I could give to candidates to tell me ...
Absolutely love this idea, just one little comment on the watchdogs.
It occurs to me now that, if one wished to be really nitpicky about who watches the watchmen, there would have to be institutions testing the reliability of those meta-institutions, and so on and so forth... When does it stop? How do we avoid vested interests and little cheats and manipulations pulling an academic equivalent of the AAA certification of sub-prime junk debt in 2008?
Why do the watchdogs exist? Because businesses produce demand for accurate measurements of employability,...
While making an inventory of my personal library and applying the Universal Decimal Classification to it, I found myself discovering a systematized classification of fields of knowledge, nested and organized and intricate, many of which I didn't even know existed. I couldn't help but compare how information was classified there with how it was imparted to me in engineering school. I also thought about how software engineers and computer scientists are often mostly self-taught, with even college mostly consisting of "here's a problem: go forth and figure out a way to solve it". This made me wonder whether another way of certified and certifiable education couldn't be achieved, and a couple of ideas sort of came to me.
It's pretty nebulous in my mind so far, but the crux of the concept would be a modular structure of education, where the academic institution essentially establishes precisely what information you need from each module and lets you get on with the activity of learning, with periodic exams that you can sign up for, which certify your level and area of proficiency in each module.
A recommended tree of learning could be established, but it should be possible to skip intermediate tests if passing the final test proves that you've mastered everything behind it (this would allow people coming from different academic systems to certify their knowledge quickly and easily, thus avoiding the classic "Doctor in Physics from Former Soviet Union, current Taxi Driver in New York" scenario).
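As a minimal sketch of that skipping rule (the module names and the tree itself are invented for illustration): certifying a module implicitly certifies everything upstream of it in the prerequisite tree.

```python
# Toy prerequisite tree: passing a module's exam also certifies its prerequisites.
prereqs = {
    "signals":    ["calculus_2"],
    "calculus_2": ["calculus_1"],
    "calculus_1": ["algebra"],
    "algebra":    [],
}

def certified_by(passed_module, prereqs):
    """All modules implicitly certified by passing `passed_module`."""
    certified, stack = set(), [passed_module]
    while stack:
        m = stack.pop()
        if m not in certified:
            certified.add(m)
            stack.extend(prereqs.get(m, []))
    return certified

print(sorted(certified_by("signals", prereqs)))
# -> ['algebra', 'calculus_1', 'calculus_2', 'signals']
```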
Thus, a universal standard of how much you have proven to know about what topics can be established.
Employers would then be free to request profiles in the format of such a tree. It need not be a binary "you need to have done all these courses and only these courses to work for us"; they would be free to write their utility function for this or that job however they see fit, with whatever weights and restrictions they need.
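A minimal sketch of what such an employer utility function could look like (modules, levels, and weights are all invented): hard restrictions filter candidates out, and the remaining certified levels are scored with whatever weights the employer chooses.

```python
# Toy employer utility function over a candidate's certified profile.

def score_candidate(profile, required, weights):
    """profile: {module: certified_level}. Returns None if a hard requirement fails."""
    for module, min_level in required.items():
        if profile.get(module, 0) < min_level:
            return None                          # restriction violated: filtered out
    return sum(w * profile.get(module, 0) for module, w in weights.items())

required = {"railway_engineering": 3}            # must-have, at level 3 or above
weights  = {"railway_engineering": 1.0, "project_management": 0.5, "rhetoric": 0.2}

candidate = {"railway_engineering": 4, "rhetoric": 2}
print(score_candidate(candidate, required, weights))   # -> 4.4
```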
Students and other learners would be free to advance in whichever tree they require, depending on what kind of profile they want to end up with at what age or point in time. One would determine what to learn based on statistical studies of what elements are, by and large, most desired by employers of/predictors of professional success in a certain field you want to work in.
One would find, for example, that mastering the peculiar field of railway engineering goes a long way toward being a proficient railway engineer, but also that having studied, say, things involving people skills (from rhetoric to psychology to management) correlates positively with success in that field.
Conversely, a painter may find that learning about statistics, market predictions, web design, or cognitive biases correlates with a more successful career (whether in terms of income, of copies sold, or of public exposure... each person may optimize their own learning according to their own criteria).
One might even be able to calculate whether such complementary education is actually worth their time, and which options are the most cost-efficient.
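A toy version of that cost-efficiency calculation, with every figure invented purely to show the shape of the comparison:

```python
# Rank complementary modules by (estimated career-income lift) per study hour.
modules = {
    # module: (estimated lifetime income lift in EUR, estimated hours to certify)
    "statistics":       (15_000, 300),
    "web_design":       (8_000, 150),
    "cognitive_biases": (3_000, 60),
}
for name, (lift, hours) in sorted(modules.items(), key=lambda kv: -kv[1][0] / kv[1][1]):
    print(f"{name:18s} {lift / hours:6.1f} EUR per study hour")
```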
I would predict that such a system would help society overall optimize how many people know what skills, and facilitate the learning of new skills and the updating of old ones for everyone, thus reducing structural unemployment, and preventing pigeonholing and other forms of professional arthritis.
I would even dare to predict that, given the vague, statistical, cluster-ish nature of this system, people would be encouraged to learn quite a lot more, and in a much wider range of fields, than they do now, when one must jump through a great many hoops and endure a great many constraints in space and time and coin to get access to some types of education (and to official acknowledgement of having acquired them).
Acquiring access to the actual sources of knowledge -- a library (virtual or otherwise), lectures (virtual or otherwise), and so on -- would be a private matter, up to the learner.
A thing that I would like very much about this system is that it would do away with the strange conflicts of interest that hamper the functioning of traditional educational institutions.
When the ones who teach you are also the ones who grade you, the effort they invest in you can feel like a zero-sum game, especially if they are only allowed to let a percentage of you pass.
When the ones who teach you have priorities other than teaching (usually research, but some teachers are also involved in administrative functions, or even private interests completely outside of the university's ivory tower[1]), this can and often does reduce the energy and dedication they can or will allocate to the actual function of teaching, as opposed to the others.
By separating these functions, and the contradictory incentives they provide, the organizations performing them are free to optimize for each.
Another discrepancy I'd like to see solved is the difference between the official time it is supposed to take to obtain this or that degree, or to learn this or that subject, and the actual statistical distribution of that time. Nowadays, a degree that's supposed to take you five years ends up taking eight or ten years of your life. You find yourself having to go through the most difficult subjects again and again, because they are explained in an extremely rushed way, the materials crammed into a pre-formatted time. Other subjects are so exceedingly easy and thinly spread that you find that going to class is a waste of time, and that you're better off preparing for them one week before finals.

Now, after having written all of the above, my mind is quite spent, and I don't feel capable of either anticipating the effect of my proposed idea on this particular issue, or of offering any solutions. Nevertheless, I wish to draw attention to it, so I'm leaving this paragraph in until I can amend it to something more useful/promising.
I hereby submit this idea to the LW community for screening and sound-boarding. I apologize in advance for your time, just in case this idea appears to be flawed enough to be unsalvageable. If you deem the concept good but flawed, we could perhaps work on ironing out those kinks together. If, afterwards, this seems to you like a good enough idea to implement, know that good proposals are a dime a dozen; if there is any interest in seeing something like this happen, we will need to move on to properly understanding the current state of secondary and higher education, and to figuring out what incentives/powers/leverage are needed to actually get it implemented.
[1] By "ivory tower" I simply mean the protected environment where professors teach, researchers research, and students study, with multiple buffers between it and the ebb and flow of political, economic, and social turmoil. No value judgement is intended.
EDIT: And now I look upon the title of this article and realize that, though I had comparisons to games in mind, I never got around to writing them down. My inspirations here were mostly Civilization's Research Trees, RPG Skill Scores and Perks, and, in particular, Skyrim's skills and perks tree.
Basically, your level at whatever skill improves by studying and by practising it rather than merely by levelling up, and, when you need to perform a task that's outside your profile, you can go and learn it without having to commit to a class. Knowing the right combination of skills at the right level lets you unlock perks or access previously-unavailable skills and applications. What I like the most about it is that there's a lot of freedom to learn what you want and be who you want to be according to your own tastes and wishes, but, overall, it sounds sensible and is relatively well-balanced. And of course there's the fact that it allows you to keep a careful tally of how good you are at what things, and the sense of accomplishment is so motivating and encouraging!
Speaking of which, the Achievement systems of several networks and consoles also strike me as motivators: for keeping track of what one has achieved so far, to look back and be able to say "I've come a long way" (in an effect similar to that of gratitude journals), and also to accomplish a task and have this immediate and universal acknowledgement that you did it, dammit (and, for those who care about that kind of thing, the chance to rub it in the face of those who haven't).
I would think our educational systems could benefit from this kind of modularity and from this ability to keep track of things in a systematic way. What do you guys think?