Robin Hanson has identified a breakdown in the metaphor of rationality as martial art: skillful violence can be more or less entirely deferred to specialists, but rationality is one of the things that everyone should know how to do, even if specialists do it better. Even though paramedics are better trained and equipped than civilians at the scene of a heart attack, a CPR-trained bystander can do more to save the victim's life simply because of the time it takes the paramedics to arrive. Prediction markets are great for governments, corporations, or communities, but if an individual's personal life has gotten bad enough to need the help of a professional rationalist, a little training in "cartography" could have nipped the problem in the bud.
To put it another way, thinking rationally is something I want to do, not have done for me. I would bet that Robin Hanson, and indeed most people, respect the opinions of others in proportion to the extent that they are rational. So the individual impulse toward learning to be less wrong is not only a path to winning, but a basic value of a rationalist community.
One can think that individuals can profit from being more rational, while also thinking that improving our social epistemic systems or participating in them actively will do more to increase our welfare than focusing on increasing individual rationality.
Yes, it would be silly to think of ourselves as isolated survivalists in a society where so many people are signed up for cryonics, where Many-Worlds was seen as retrospectively obvious as soon as it was proposed, and no one can be elected to public office if they openly admit to believing in God. But let us be realistic about which Earth we actually live in.
I too am greatly interested in group mechanisms of rationality - though I admit I put more emphasis on individuals; I suspect you can build more interesting systems out of smarter bricks. The obstacles are in many ways the same: testing the group, incentivizing the people in it. In most cases if you can test a group you can test an individual and vice versa.
But any group mechanism of that sort will have the character of a band of survivalists getting together to grow carrots. Prediction markets are lonely outposts of light in a world that isn't so much "gone dark" as having never been illuminated to begin with; and the Policy Analysis Markets were burned by a horde of outraged barbarians.
We have always been in the Post-Apocalyptic Rationalist Environment, where even scientists and academics are doing it wrong and Dark Side Epistemology howls through the street; I don't even angst about this, I just take it for granted. Any proposals for getting a civilization started need to take into account that it doesn't already exist.
Sounds like you do think of yourself as an isolated survivalist in a world of aliens with which you cannot profitably coordinate. Let us know if you find those more interesting systems you suspect can be built from smarter bricks.
It's pretty hard to be isolated in a world of six billion people. The key question is rather the probability of coordinating with any randomly selected person on a rationalist topic of fixed difficulty, and the total size of the community available to support some number of institutions.
To put it bluntly, if you built the ideal rationalist institution that requires one million supporters, you'd be in trouble because the 99.98th percentile of rationality is not adequate to support it (and also such rationalists may have other demands on their time).
But if you can build institutions that grow starting from small groups even in a not-previously-friendly environment, or upgrade rationalists starting from the 98th percentile to what we would currently regard as much higher levels, then odds look better for such institutions.
We both want to live in a friendly world with lots of high-grade rationalists and excellent institutions with good tests and good incentives, but I don't think I already live there.
Even in the most civilized civilizations, barbarity takes place on a regular basis. There are some homicides in dark alleys in the safest countries on earth, and there are bankruptcies, poverty, and layoffs even in the richest countries.
In the same way, we live in a flawed society of reason, which has been growing and improving with starts and fits since the scientific revolution. We may be civilized in the arena of reason in the same way you could call Northern Europe in the 900s civilized in the arena of personal security: there are rules that nearly everyone knows and that most obey to some extent, but they are routinely disrespected, and the only thing that makes people really take heed is the theater of enforcement, whether that's legally-sanctioned violence against notorious bandits or a dressing-down of notorious sophists.
Right now we are only barely scraping together a culture of rationality; it may have a shaky foundation and many dumber bricks, but it seems a bit much to say we don't have one.
I'm not sure I can let you make that distinction without some more justification.
Most people think they're truth-seekers and honestly claim to be truth-seekers. But the very existence of biases shows that thinking you're a truth-seeker doesn't make it so. Ask a hundred doctors, and they'll all (without consciously lying!) say they're looking for the truth about what really will help or hurt their patients. But give them your spiel about the flaws in the health system, and in the course of what they consider seeking the truth, they'll dismiss your objections in a way you consider unfair. Build an institution that confirms your results, and they'll dismiss the institution as biased or flawed or "silly". These doctors are not liars or enemies of truth or anything. They're normal people whose search for the truth is being hijacked in ways they can't control.
The solution: turn them into rationalists. They don't have to be black belt rationalists who can derive Bayes' Theorem in their sleep, but they have to be rationalist enough that their natural good intentions towards truth-seeking correspond to actual truth-seeking and allow you to build your institutions without interference.
This sounds like you're postulating people who have good taste in rationalist institutions without having good taste in rationality. Or you're postulating that it's easy to push on the former quantity without pushing on the latter. How likely is this really? Why wouldn't any such effort be easily hijacked by institutions that look good to non-rationalists?
Putting so much work into talking about these things isn't the act of an isolated survivalist, though.
For example, with subsidized prediction markets, we can each specialize on the topics where we contribute best, relying on market consensus on all other topics.
If no one person has a good grasp of all the material, then there will be significant insights that are missed. Science in our era is already dominated by dumb specialists who know everything about nothing. EY's work has been so good precisely because he took the effort to understand so many different subjects. I'll bet at long odds that a prediction market containing an expert on evo-psych, an expert on each of five narrow AI specialisms, an expert on quantum mechanics, an expert on human biases, an expert on ethics and an expert on mathematical logic would not even have produced FAI as an idea to be bet upon.
We don't each need to train to identify and fix each possible kind of bias; each bias can instead have specialists who look for where that bias appears and then correct it.
If people could see inside each others' heads and bet on (combinations of) people's thoughts, this would work.
In reality, what will happen is that a singly debiased single subject specialist will simply not produce any ideas for the predict...
Leibniz, Da Vinci, Pascal, Descartes, and John von Neumann spring immediately to mind for me.
There's also Poincaré, often considered the last universalist. Kant is famous as a philosopher, but also worked in astronomy. Bertrand Russell did work in philosophy as well as mathematics, and was something of a generalist. Noam Chomsky is the linguist of the 20th century, and if you consider any of his political and media analysis outside of linguistics to be worthwhile, he's another. Bucky Fuller. Charles Peirce. William James. Aristotle. Goethe. Thomas Jefferson. Benjamin Franklin. Omar Khayyám.
Just thought of Gauss, who in addition to his work in mathematics did considerable work in physics.
Herbert Simon: psychology and computer science (got an economics Nobel).
Alan Turing: don't know how I could have forgotten him.
Norbert Wiener.
Massive hindsight bias. Whether we, as a race, are proud of it or not, it wasn't until Darwin, only 150 years ago, that someone seriously suggested and developed it.
One problem with trusting the experts rather than trying to think things through for yourself is that you need a certain amount of expertise just to understand what the experts are saying. The experts might be able to tell you that "all symmetric matrices are orthogonally diagonalizable," and you might have perfect trust in them, but without a lot of personal study and inquiry, the mere words don't help you very much.
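To make the quoted expert claim concrete (it is the spectral theorem for real symmetric matrices), here is a small NumPy sketch of what the words cash out to; the matrix is random and purely illustrative:

```python
import numpy as np

# "All symmetric matrices are orthogonally diagonalizable" means a real
# symmetric matrix A can be written as Q @ D @ Q.T, where Q's columns are
# orthonormal eigenvectors and D is diagonal. np.linalg.eigh handles the
# symmetric case directly.
rng = np.random.default_rng(0)
m = rng.standard_normal((4, 4))
a = (m + m.T) / 2                  # symmetrize an arbitrary matrix

eigenvalues, q = np.linalg.eigh(a)
d = np.diag(eigenvalues)

assert np.allclose(q @ d @ q.T, a)        # the diagonalization holds
assert np.allclose(q.T @ q, np.eye(4))    # Q's columns are orthonormal
```

Of course, being able to run this check is itself the kind of background knowledge the comment is pointing at: trust in the sentence is useless until you know what `Q`, `D`, and "orthonormal" mean.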
Following the martial arts analogy, I guess that makes Robin a supporter of "Rationalist Gangs".
One of the ways I think OB could have been better, and LW could be more helpful, is to put a greater emphasis on practice and practical techniques for improving rationality in the writings here, and to give many more real-life examples than we do.
When making a post that hints at any kind of practical technique, posters could really make an effort to clearly identify the practical implications and techniques, and to put all the practical parts together in the essay rather than mixing them throughout 15 paragraphs of justification and rea...
Robin was kind enough not to say what overemphasizing the heroic individual rationalist implies about our true motivations.
I love your thesis and metaphor, that the goal is for us all jointly to become rational, seek, and find truth. But I do not "respect the opinions of enough others." I have political/scientific disagreements so deep and frequent that I often just hide them and worry. I resonated best with your penultimate sentence: "humanity's vast stunning cluelessness" does seem to be the problem. Has someone written on the consequences of taking over the world? The human genome, presumptively adapted to forward its best interests in a co...
Maybe personal finance is a better analogy than Martial Arts. It's useful for nearly anybody to know about personal finance, yet many people are lacking even in the basics. Some high-falutin stock market concepts may not be useful to the average joe, the same way advanced rationality ("better than Einstein") may not be needed, but still, education about the basics is useful.
For whatever reason, the community here (so-called "rationalists") is heavily influenced by overly-individualistic ideologies (libertarianism, or in its more extreme forms, objectivism). This leads to ignoring entire realms of human phenomena (social cognition) and the people who have studied them (Vygotsky, sociologists of science, ethnomethodology). It's not that social approaches to cognition provide a magic bullet -- they just provide a very different perspective on how minds work. Imagine if you stop believing that beliefs are in the head ...
I am guilty as charged in being much more familiar with individualistic than socially oriented ideologies.
Why don't you write some posts about techniques or discoveries from socially-oriented science that could help rationalists?
...Martial arts can be a good training to ensure your personal security, if you assume the worst about your tools and environment. If you expect to find yourself unarmed in a dark alley, or fighting hand to hand in a war, it makes sense. But most people do a lot better at ensuring their personal security by coordinating to live in peaceful societies and neighborhoods; they pay someone else to learn martial arts. Similarly, while "survivalists" plan and train to stay warm, dry, and fed given worst case assumptions about the world around them, most people achieve these goals by participating in a modern economy.
We should learn how to identify trustworthy experts. Is there some general way, or do you have to rely on specific rules for each category of knowledge?
Two examples of such rules: never trust someone's advice about which specific stocks to buy unless the advisor has material non-public information, and be extremely skeptical of statistical evidence presented in Women's Studies journals. Although both rules are probably true, you obviously couldn't trust financial advisers or Women's Studies professors to give them to you.
Another good example is the legal system. Individually it serves many participants poorly on a truth-seeking level; it encourages them to commit strongly to an initial position and make only those arguments that advance their cases, while doing everything they can to conceal their cases' flaws short of explicit misrepresentation. They are rewarded for winning, whether or not their position is correct. On the other hand, this set-up (combined with modern liberalized disclosure rules) works fairly well as a way of aggregating all the relevant evidence ...
The rationality dojo seems to be part of a world where "we" work together for truth, at least if you don't take the dojo metaphor too seriously. I assume that training individuals to be more rational is part of your optimal strategy. So I take it that your argument is that we should emphasize individual training less relative to designing institutions which facilitate truth-finding despite our biases. Am I understanding you correctly?
Robin wrote: "Martial arts can be a good training to ensure your personal security, if you assume the worst about your tools and environment." But this does not mean that martial arts cannot also be good training if you assume a more benign environment. Environments are known to be unpredictable.
One of the most important insights a person gains from martial arts training is to understand one's limits--which relates directly to the bias of overconfidence. If martial arts training enables a person to project an honestly greater degree of self confidence, then the signaling benefit alone may merit the effort. Does rationality training confer analogous signaling benefits?
An attempt to even find Einsteins is doomed unless the number of them is large enough as a fraction of the population. (cf: Eliezer's introduction to Bayes.)
On the other hand, a purely aggregate approach is a dirty hack that somehow assumes no (irrational) individual is ever able to be a bottleneck to (aggregate) good sense. It's also fragile to societal breakdown.
It seems evident to me that what's really urgent is to "raise the tide" and have it "lift all boats". Because then, tests start working and the individual bottleneck is rational.
I predict that aggregate approaches are going to be more common in the future than waiting around for an Einstein-level intelligence to be born.
For example, Timothy Gowers recently began a project (Polymath1) to solve an open problem in combinatorics through distributed proof methods. Current opinion is that they were probably successful; unfortunately, the math is too hard for me to render judgment.
Now, it's possible that they were successful because the project attracted the notice of Terence Tao, who probably qualifies as an Einstein-level mathematician. If you look at the discussion, Tao and Gowers both dominate it. On the other hand, many of the major breakthroughs in the project didn't come from either of them directly, but from other anonymous or pseudonymous comments.
The time of an Einstein or Tao is too valuable for them to do all the thinking by themselves. We agree that raising the tide is absolutely necessary for this kind of project to grow.
"How can we join together to believe truth?"
Yes!
I am being deluged here on LW by all the posts and comments. Spending so much time in front of the screen does not seem sensible or rational.
What to do?
For example, with subsidized prediction markets, we can each specialize on the topics where we contribute best, relying on market consensus on all other topics.
I was just about to respond by asking if you would advocate a website in which the beliefs of the members are aggregated based on their reliability, then I remembered: prediction markets.
I'm guessing you're not pushing a real prediction market due to legal issues, but why not create one that uses token points instead of real money?
My first thought was slightly different: have testable predictions, as in a market, but the system treats each person's likelihood ratios as evidence (as well as the tags for the prediction, to account for each person's area of expertise).
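A minimal sketch of what treating each person's likelihood ratios as evidence could look like, under the heroic assumption that forecasters' reports are independent (the function and its interface are illustrative, not part of any existing system):

```python
import math

def aggregate(prior, likelihood_ratios):
    """Combine a prior probability with each forecaster's likelihood ratio.

    Each ratio is P(report | H) / P(report | not-H). Treating the reports
    as independent evidence, the posterior odds are the prior odds times
    the product of the ratios -- equivalently, log-odds simply add.
    """
    log_odds = math.log(prior / (1 - prior))
    for lr in likelihood_ratios:
        log_odds += math.log(lr)
    odds = math.exp(log_odds)
    return odds / (1 + odds)

# Prior of 50%; one forecaster reports 4:1 evidence for, another 2:1
# against. Posterior odds: 1 * 4 * 0.5 = 2, i.e. probability 2/3.
p = aggregate(0.5, [4.0, 0.5])
```

The independence assumption is exactly where this toy version breaks down in practice: forecasters who read the same sources would get double-counted, which is one reason a market mechanism (where you only profit by moving the price toward information it doesn't already contain) is attractive.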
It seems to me that the real issue still is a supply of testable problems.
Aren't we the supposed martial rationalists of humanity? Aren't we the ones being paid (I wish) to protect the neighborhoods from the marauding apologists? Aren't we the ones to go to the wild places and battle dragons?
I would guess that martial arts are so frequently used as a metaphor for things like rationality because their value is in the meta-skills learned by becoming good at them. Someone who becomes a competent martial artist in the modern world is:
Patient enough to practice things they're not good at. Many techniques in effective martial arts require some counter-intuitive use of body mechanics that takes non-trivial practice to get down, and involve a lot of failure before you achieve success. This is also true of a variety of other tasks.
Possessing the fi
For example, with subsidized prediction markets, we can each specialize on the topics where we contribute best, relying on market consensus on all other topics.
Why do these prediction markets have to be subsidized? In the U.S., online prediction markets are currently considered internet gambling and are hampered. Is there a reason legal, laissez-faire prediction markets couldn't take hold?
The martial arts metaphor for rationality training seems popular at this website, and most discussions here about how to believe the truth seem to assume an environmental worst case: how to figure out everything for yourself given fixed info ...
A very good point!
But I can't easily explain why it is a good point without violating the ban on mention of AI.
This observation doesn't invalidate Less Wrong. Someone still has to study these things. But the emphasis on individualism here can diminish awareness of the big picture.
I think it was just brainstorming based on Eliezer's post; he also wrote about the sanity water line, which I see your rational society approach fitting in with. Maybe a dojo is a bit extreme, but I think a zendo isn't implausible, with people working on rationality koans. Or maybe rationality group therapy, where people can express potential irrationality that they can receive non-judgemental feedback on. Grassroots bottom up approaches could work with larger top down approaches to create the rational society, or whatever word Yvain might find less taboo :)
If one goes off the notions of others without coming to conclusions for oneself, one is just as blind as an evangelical Christian. True insight can only come from within. That's why reason is of premium importance.
It is important to note the difference between insight and belief, however; for insight is based on rationality and logic, whereas belief is based on primal emotions and instincts.
I should also add that I think group rationality techniques are important. We've already seen that being a good group rationalist means acting differently than just trying to be individually as accurate as possible. [in particular, you should not be swayed by what the rest of the group thinks].
Martial arts can be a good training to ensure your personal security, if you assume the worst about your tools and environment. If you expect to find yourself unarmed in a dark alley, or fighting hand to hand in a war, it makes sense. But most people do a lot better at ensuring their personal security by coordinating to live in peaceful societies and neighborhoods; they pay someone else to learn martial arts. Similarly, while "survivalists" plan and train to stay warm, dry, and fed given worst case assumptions about the world around them, most people achieve these goals by participating in a modern economy.
The martial arts metaphor for rationality training seems popular at this website, and most discussions here about how to believe the truth seem to assume an environmental worst case: how to figure out everything for yourself given fixed info and assuming the worst about other folks. In this context, a good rationality test is a publicly-visible personal test, applied to your personal beliefs when you are isolated from others' assistance and info.
I'm much more interested in how we can can join together to believe truth, and it actually seems easier to design institutions which achieve this end than to design institutions to test individual isolated general tendencies to discern truth. For example, with subsidized prediction markets, we can each specialize on the topics where we contribute best, relying on market consensus on all other topics. We don't each need to train to identify and fix each possible kind of bias; each bias can instead have specialists who look for where that bias appears and then correct it.
Perhaps martial-art-style rationality makes sense for isolated survivalist Einsteins forced by humanity's vast stunning cluelessness to single-handedly block the coming robot rampage. But for those of us who respect the opinions of enough others to want to work with them to find truth, it makes more sense to design and field institutions which give each person better incentives to update a common consensus.