A social norm against unjustified opinions?
An existing social norm says, in effect, that everyone has the right to an opinion on anything, no matter how little they happen to know about the subject.
But what if we had a social norm saying that by default, people do not have the right to an opinion on anything? To earn that right, they would have to have familiarized themselves with the topic. The familiarization wouldn't necessarily have to be very deep, but on, say, controversial political issues, they'd have to have read at least a few books' worth of material discussing the question (preferably material from both sides of the political fence). For scientific questions requiring more advanced knowledge, they'd have to have studied the field at least somewhat. Extensive personal experience with a subject would also be a way to become qualified, even if they hadn't studied the issue academically.
The purpose of this would be to enforce epistemic hygiene. Conversations on things such as public policy are frequently overwhelmed by loud declarations of opinion from people who, quite honestly, don't know anything about the subject they hold a strong opinion on. If we had a social norm demanding an adequate amount of background knowledge before anyone voiced an opinion they expected to be taken seriously, the signal-to-noise ratio might improve somewhat. This kind of social norm already seems to be somewhat in place in many scientific communities, but it would be good to spread it to the general public.
At the same time, there are several caveats. As I am myself a strong advocate of freedom of speech, I find it important to note that this must remain a *social* norm, not a government-mandated one or anything codified into law. Also, the standards must not be set *too* high: even amateurs should be able to join the conversation, provided they know at least the basics. Likewise, one must be careful that the principle isn't abused, with "you don't have a right to an opinion on this" becoming a generic argument for dismissing any opposing claim.
Comments (158)
Yes, yes, it would be nice.
-1 Porn. The article is nothing but "what if": no suggestion of how we might bring about the goal, nor even a question to the audience. I didn't get any smarter from reading it.
+1; it prompted useful comments. Sometimes a push to get us started in a direction is a good thing, even if the poster doesn't actually have anything new to add.
I think this problem, if it exists, is abetted by the often-misguided fact/opinion distinction commonly taught in schools. It's surely useful to some extent in the context of journalism or law, but it leads to all sorts of problems in fields like ethics and science. If "everybody's entitled to an opinion" and there are bright lines between fact and opinion, then people will feel indignant when their opinions are declared false.
Example:
"Abby: I think it's okay to set cats on fire"
"Bob: But that's clearly wrong; (insert argument here)"
"Abby: How can it be wrong; it's just my opinion!"
Clearly your doctor's opinion on how to treat your cancer is more valuable than your barber's. And the geologist's opinion on whether Europe is getting farther from North America is better than the chemist's.
Actually, I have a workaround for the kind of situation you used as your example:
Abby: I think it's okay to set cats on fire
Me: You have a very strange definition of 'okay'. (Accompanied by appropriate social signaling that it's not the good kind of strange - depending on the person I may just say outright that I don't approve of it.)
Abby's statement is actually true, assuming she's not joking or being sarcastic or something odd like that. What she thinks is not correct, if she's using a definition of 'okay' that this society would consider normal, but that's not a very good assumption to make in situations like this, and pointing out the incongruity will get farther than trying to argue that a true statement is false.
I think the causality has to run: X-Rationalists raise the standards for ordinary rationalists and scientists -> people connected to the scientists raise their standards -> everyone else.
Sort of top down by osmosis rather than decree. Everyone gets slightly better, but most ordinary people won't have to unrealistically become X-Rationalists.
People are entitled to make mistakes, provided they are not overly detrimental to others. What is offensive is not having a mistaken opinion (particularly when this is a freshly formed mistaken impression rather than an entrenched bias), but attempting to spread it far and wide.
Existing safeguards against this include our concept of expertise. More people will listen to someone who has advanced understanding of an area of knowledge, versus a novice. Usually an expert in a subject really can provide better guidance to form valid opinions.
The trouble arises when you have an expert in an area that is not rationally mappable to reality, e.g. the Bible or religious experience. A preacher can claim expertise on the issue of origins because of biblical knowledge and claims of personal religious experience, without needing to prove that he knows enough biology or basic science to criticize the theory of evolution.
So perhaps we need a norm that criticizes use of authority in one area to make claims in an unrelated area. A preacher's opinion carries little weight in biology, just as biologists do not typically do much to define religious rhetoric.
But that would also mean that nobody but an authority in the religion could criticize the religion.
These rules always have to be symmetrical.
It's possible to be an authority on religion without being an authority in the religion, much as someone can be an authority on computers without being one.
See Expert At Versus Expert On by Robin Hanson.
"Everyone has a right to his/her opinion" is a social standard because it helps people get along. The total solution to this problem is not telling people that they aren't entitled to their opinions, so much as making it so that when you show someone their opinion is ill-founded, they say, "Hey, you're right. I should change my opinion!"
Given that, in reality, that almost never happens, the expression exists as a way of maintaining civility. That way the person pointing out that no, the earth is not 6000 years old becomes the "bad guy" if he keeps pushing the point after this defense is invoked. Not good for truth, but good for short-term social stability.
On the note of what an opinion is, the expression is entirely accurate with respect to matters of taste, e.g. "Chocolate is the best ice cream flavor." But with fact-entangled should-statements, e.g. "The federal government should raise taxes on the top income bracket," the speaker is usually also expressing a factual claim, such as that this is an effective way to reduce the budget deficit. To the extent that these are factual claims, you are not entitled to them.
How useful are debates in general for changing the opinions of the person you are debating with? Most debates are implicitly or explicitly framed as contests with opponents: a zero-sum game. The right thing might be to focus more on the third-party onlookers; some consequences might be:
*Seek bigger debates (more viewers)
*Feel less sad when your opponent "beats" you, using twisted logic.
*Present more and different types of arguments.
*Do wrestle pigs, if the debate is entertaining and public.
*Focus more on getting new info in a debate, then isolate yourself to perform belief updates when you are not in contest mode anymore.
*If the last point applies equally to your opponent, stop before they get annoyed with you. Allow them to perform calm private reasoning later instead.
Can someone think of others?
I think a norm is likely to be a product of the solution, not the solution itself.
So the problem is we have a lot of people who don't appreciate what constitutes a reasonable foundation for an opinion. They think they can just say what they feel. To put it one way, they have a poor understanding of the nature of evidence.
I don't think a norm like you describe could have much effect on anyone with a poor understanding of evidence. Such people would just think the norm was wrong or ridiculous.
If they were to come to better understand the nature of evidence, they would be more receptive to the norm. But if they were to understand evidence better, then from that fact alone you'd get the desired result of people not mouthing off as much with "ignorant opinions".
So the solution has to involve getting people to better understand the nature of evidence (or however you want to describe what is missing from their mental toolkit).
If you were to get enough people to understand the nature of evidence, that could lead to the creation of such a norm. I doubt it could happen the other way around.
Caveat: I'm not 100% confident the above story is true, but I think there's at least an element of truth in it.
People should be able to consider what they feel; it's valid rational evidence. It just may not be the best that can be done in a given situation, when better evidence is available.
I agree, I probably just didn't explain myself very well. I was just trying to talk about the situations when people express an opinion without really giving any consideration to why they think it is true.
I would say people should even explain why they think something is true, which would of course force them to consider it. And then, of course, those who disagree can and should explain in detail what they think is wrong.
Do you mean something like:
If being merely informed becomes the norm before rational reasoning does, you just end up with more-informed political partisans who become more polarized and more certain of their views: badly calibrated and worse off than when they started.
Unfortunately that'd skew things towards the status quo.
Advances in knowledge often come from taking a very different angle on a problem, by someone who isn't immersed in (and thus not necessarily knowledgeable about) the existing viewpoints on the problem, e.g. amateurs or people from different fields.
Ultimately a person's view should be judged just on its own merits.
Actually, counting personal experience as relevant research would counteract this effect in most situations.
I'd like to hear a workable way of making that happen, though. In my experience, in any controversial situation, admitting to basing an opinion on personal experience just opens yourself up to personal attacks.
IAWY, but
It's easy to overestimate the size of this effect.
I get what you're saying, but I don't think that's quite the problem.
The real problem is the social norm that says "you aren't allowed to be critical of someone else's view *because everyone has the right to an opinion*".
The italicized bit is the problem. I think everyone should have the right to an opinion, but also that everyone should have the right to express criticism of others' opinions.
(I think the "you can't criticise others' views" thing stems from relativism.)
I think the way we conduct debating has become stuck in a bad place.
In a debate we want to win quickly, all else equal. But all else is not equal. If you make a point of being nice, and compliment the person on the things they do get right, they will have an easier time accepting criticism. In any social situation other than a purely factual debate, would you even think of being purely adversarial?
This general climate is the aggregate consequence of every debate we have.
If the approach is "Everything about you sucks, now CHANGE!", the reception will not be "Okay, I will change X and Y, but not Z" but "My opinions shall be immune to criticism".
The internet has enabled this polarization by making the rationalist crowd (rightfully) more fundamentalist about their epistemic skill.
When you see that logic and evidence works to clear up so much confusion and falsity in your beliefs, you think that you can cure the "sick" person of all his diseases in one fell swoop.
Thinking of the dilemma as one of opposing "rights" also doesn't help: [My right to criticize your beliefs] vs [Your right to have them not be criticized]
When they refuse to listen to your criticism you feel angry about your rights not being respected, rather than sad that you cannot help them towards better beliefs.
Disclaimer: The "You" in this comment is the "We as rationalists"
put another way, I think the problem is a norm that says "the right to have an opinion means the right to not have it criticised"
I think this is a good distinction, and anyone somehow trying to shift social norms (perhaps within a subcommunity) might be well-advised to shift the norms in order: First, teach people that others have a right to criticize their opinion; then, teach them that they have no right to an opinion.
"teach them that they have no right to an opinion."
I know people throw the term around (I try not to), but this is maybe the most fascist thing I've seen on this board. They have no right to an opinion? You might want to rephrase this, as many of my opinions are somewhat involuntary.
http://www.overcomingbias.com/2006/12/you_are_never_e.html
That article is entitled "You Are Never Entitled to Your Opinion".
I don't think Robin really means that people aren't entitled to their opinions. I think what he really means is people aren't allowed to say "I'm entitled to my opinion" - that is, to use that phrase as a defense.
There's a big difference. When people use that defense they don't really mean "I'm entitled to have an opinion", but instead "I'm entitled to express my opinion without having it criticised".
In other words "I'm entitled to my opinion" is really a code for "all opinions are equally valid and thus can't be criticised".
That said, I do think it is valid to say "I am entitled to an opinion" in situations where your right to expression is being attacked.
I'm not saying you always do have a right to freely and fully express yourself. But in situations when you do have some measure of this, it can be unfairly stomped on.
For example, you might be in a business meeting where you should be able to have input on a matter but one person keeps cutting you off.
Or say you're with friends and you're outlining your view on some topic and, though you're able to get your view out there, someone else always responds with personal attacks.
Sometimes people are just trying to shut you down.
I don't see how "I'm entitled to my opinion" is a particularly optimal or meaningful response to these situations. What about "it's unfair not to give me a chance to express my position" in the former situation, and "concluding I'm an asshole because I'm pro-X isn't justified" in the latter?
Right. "Opinion" is so overloaded with meaning that to determine whether a use of "I'm entitled to my opinion" or "You are not entitled to your opinion" is virtuous, one should taboo "opinion", and probably "entitled" as well, and express the thought in a way that is specific to the situation, as in your examples. And of course, having gone through the mental exercise of validating that what you say makes sense, you should give everyone else the benefit of this thought process and actually communicate the alternate form, so they can also tell whether it is virtuous.
It seems that in this article, Robin is co-defining "opinion" with "belief". This isn't, exactly, incorrect, but I don't think it maps completely onto the common use, which may be causing misunderstanding. If I say "it's my opinion that [insert factual proposition here]", then Robin's remarks certainly apply. But if it's my opinion that chocolate chip cookie dough ice cream is delicious - which is certainly a way people often use the word "opinion" - then in what way might I not be entitled to that? Unless I turn out to be mistaken in my use of the term "chocolate chip cookie dough ice cream", or something, but assume I'm not.
Robin was clear about what he meant by "opinion" in his first paragraph.
Though I agree that it can cause problems to use "opinion" in an unusual way, even in the context of explicitly stating one's unusual definition, when people are going to quote the conclusion as a slogan out of the clarifying context.
On the other hand, "You are entitled to your utility function but not your epistemology" would not make an effective slogan. (Well maybe, if it has enough "secret knowledge" appeal to motivate people to figure out what it means.)
Thank you. An opinion is a thought. What does it mean to say that you are not entitled to a thought?
In this case, it means that you're not entitled to refuse to change a belief that's been proven wrong.
If you think "everyone likes chocolate ice cream", and I introduce you to my hypothetical friend Bill who doesn't like chocolate ice cream, you're not entitled to still believe that 'everyone' likes chocolate ice cream. You could still believe that 'most people' like chocolate ice cream, but if I was able to come up with a competent survey showing that 51% of people do not like chocolate ice cream, you wouldn't be entitled to that belief, either, unless you could point me to an even more definitive study that agreed with you.
Even the belief "I like chocolate ice cream" could be proven false in some situations: people's tastes do change over time, and you could try it one summer and discover that you just don't enjoy it any more.
It also implies that you're supposed to go looking for proof of your claims before you make them - that you're not 'entitled' to have or spread an opinion, but instead must earn the right by doing or referencing research.
(And I agree with the two posters in the other comment-branches who pointed out that it's a poor wording.)
Agreed, absolutely. I have nothing against hearing about people's half-baked theories - something about the theory or their logic may turn out to be useful, or give me an idea about something else, even if the theory is wrong. But it'd be nice to be able to ask "so why do you think that?" without risking an unpleasant reaction. It might even lead me to figure out that some idea that I would have otherwise dismissed is actually correct!
Most people don't derive their conclusions from reasons. They establish conclusions, then go searching for 'reasons' to cite.
Asking for the reasons for the conclusion, in a way that indicates the conclusion ought to follow from them, is perceived by most people as an attack.
The only way not to risk receiving an unpleasant reaction is to avoid talking to such people.
Yes, but maybe if there were a social norm such that, when I asked that and they couldn't answer, they took the social-status hit instead of me, they wouldn't act that way.
Social pressure is pretty much the only thing that can force normal people to acknowledge failures of rationality, in my experience. In a milieu in which a rationalization of that failure will be accepted or even merely tolerated, they'll short-circuit directly to explaining the failure away rather than forcing themselves to acknowledge the problem.
Yeah, it'd be nice, but it's probably not going to happen.
Yes, I was giving people too much credit again, wasn't I?
It took me years to even recognize that I was doing that, and I still haven't managed to stop completely.
One obstacle: as long as they aren't expected to produce obvious results to meet your expectations, people really, really like being given too much credit. And they really, really dislike being given precisely enough credit when they're nothing special, even if it lets them off the hook.
Many of my social 'problems' began once I recognized that other people didn't think like I did, and were usually profoundly stupid. That's not a recognition that lends itself to frictionless interaction with others.
This little tidbit highlights so much of what's wrong with this community:
"Many of my social 'problems' began once I recognized that other people didn't think like I did, and were usually profoundly stupid. That's not a recognition that lends itself to frictionless interaction with others."
You'd think a specimen of your gargantuan brainpower would have the social intelligence to handily conceal your disdain for the commonfolk. Perhaps it's some sort of signaling?
Or perhaps simply the recognition that it's sometimes impossible to fluff other people's egos and drive discussion along rational paths at the same time.
If people become offended when you point out weaknesses in their arguments - if they become offended if you even examine them and don't automatically treat their ideas as inherently beyond reproach - there's no way to avoid offending them while also acting rationally. It becomes necessary to choose.
I agree here: Reading stuff like this totally makes me cringe. I don't know why people of above average intelligence want to make everyone else feel like useless proles, but it seems pretty rampant. Some humility is probably a blessing here, I mean, as frustrating as it is to deal with the 'profoundly stupid', at least you yourself aren't profoundly stupid.
Of course, they probably think given the same start the 'profoundly stupid' person was given, they would have made the best of it and would be just as much of a genius as they are currently.
It's a difficult realization, when you become aware you're more intelligent than average, to be dropped into the pool with a lot of other smart people and realize you really aren't that special. I mean, in a world of some six billion, even if you are a one-in-a-million genius, there are still thousands of people as smart as you: you likely aren't in the top thousand smartest people in the world, let alone the top hundred. It kind of reminds me of grad school stories I've read, with kids who think they are going to be a total gift to their chosen subject ending up extremely cynical and disappointed.
I think people online like to exaggerate their eccentricity and disregard for societal norms in an effort to appeal to the stereotypes for geniuses. I've met a few real geniuses IRL and I know you can be a genius without being horribly dysfunctional.
I think you're underestimating the degree of social intelligence required. To pull that off while still keeping the rationalistic habits that such people find offensive, you'd have to:
You'd also probably have to at least to some degree integrate the idea that it's 'okay' (not correct, just acceptable) to be irrational into your general thought process, to avoid unintentional signaling that you think poorly of them. If anything, irrational people are more likely to notice such subtle signals, since so much of their communication is based on them.
No kidding.
My sanity-saver (though obviously not rationality-saver) has been to learn to encourage the people I'm dealing with to be more rational, at least when dealing with me. My inner circle of friends is now made up almost entirely of people who ask themselves and each other that kind of question as a matter of course, and who dissect the answers to make sure they're correct, rational, and well-integrated with the other things we know about each other.
That doesn't help at all when I'm trying to think about society in general, though.
And worse, they can cite completely incoherent "reasons", which can be observed by noting that the sequence resulting from repeated application of "what do you mean by X?" essentially diverges. It reminds me of the value "bottom" in a lifted type system: it denotes an informationless "result", such as that of a non-terminating computation.
I don't think you could rely on people having read a bit about the matter.
The reasons we do or don't believe something aren't so simple.
Behind my belief that quantum physics is a respectable field is not some specific evidence I have that its descriptions of the world are actually true.
My belief in it is derived from a complex "web of trust": I place a certain amount of trust in the channels through which I hear that there is evidence for it, because I trust the social and scientific means by which that evidence is gathered, disseminated and evaluated. And so on.
Also because quantum mechanics is the result of a mechanism (the scientific method) that I trust to generally produce reasonable results.
In general it would be great if people were more knowledgeable, our conversations were more enlightened etc. etc. - no-one will disagree with that.
But I think there is a dangerous elitist fallacy lurking here somewhere. The validity of your opinion about subject X isn't always strongly dependent on your knowledge of subject X. Let me make up a simple example. We have a choice between getting a red rock or a green rock. Some of us have a utility function that prefers red to green. Others have a utility function that prefers green to red. But a further group derives utility from complex molecular properties of rocks. Some of them have put forward elaborate arguments for why the red rock would satisfy that utility function better. Others have put forward elaborate arguments for the green rock. By contrast, the red-utility people and the green-utility people haven't put forward any elaborate arguments. In fact they haven't studied the issue much at all as it's extremely straightforward to them. This irritates the teams with the elaborate arguments and they feel the first two groups aren't really entitled to an opinion on which rock to pick. Yet, it seems to me that they are perfectly rational in how they approach the issue and have as much right to an opinion as the sophisticates.
And I don't think this is a particularly contrived example - I see a similar thing in the debate on whether my European home country should use the euro as a currency or not. Some people oppose European integration on principle (they're in favor of local government or they're nationalists or whatever you want to call it) while others support integration on principle (they're one-worlders or internationalists or whatever you want to call it). But then there are elaborate economic arguments on whether using the euro is of economic benefit to the country. The sophisticates really do tend to think that the people who haven't immersed themselves in this economic debate aren't entitled to an opinion on whether the euro should be used. I think that's wrong.
I don't see how it makes sense to reference nationalism or internationalism as terminal values. They are instrumental values, and their effects on a population's happiness/fun/quality of life should be studied and discussed. There are other valid perspectives besides economics for evaluating the issue, such as the sense of community a system promotes, but when people state simply that they value a certain system as a terminal value, their only contribution is the anecdotal data about their own preference.
I think the analogy breaks down in that the utility of economic systems can be reduced to the utility of properties about people (the actual entities with values that could be represented by utility functions), which makes more sense than reducing the utility of reflected colors to the utility of the molecular structures that reflect those colors, rather than the utility of the effect of people perceiving those colors. (Of course, even the rock example can have problems, if the molecular structures have bigger impacts on people than perception of color, for example, if green rocks are toxic.)
I don't exactly disagree but you're upping the subtlety a bit. You're arguing that people should not regard certain things as terminal values when making political decisions, things which I think in fact they do regard that way. But if I get to dress my sockpuppets up a bit I can have them say, "Fine, I agree that I'll be a good utilitarian and try to maximize population happiness over time - but I think the long-term benefits of nationalism/integrationism clearly outweigh the marginal effects of short-term economic developments." Then someone could argue against that point - and we would almost certainly have improved the political discourse. But the people with the original elaborate analyses of short-term economic effects would still have been just as wrong in their claimed superiority.
In any case, the general point I wanted to make can certainly be made in a formally correct way as long as you don't insist that every rational intelligent being must have the same utility function.
Certainly, when there is a genuine disagreement in utility functions, it is not reasonable to claim one's own terminal values as privileged (in debate, that is; it is reasonable to personally act on one's own values), but this does not apply when one's stated values are not their terminal values. And it is reasonable to question whether this is so, to try to distinguish the two cases.
The way you build such a norm is by annoyingly pointing out when people violate your proposed norm.
http://en.wikipedia.org/wiki/Consciousness_raising
Focusing on the listeners, the important thing is for arguments and opinions to be helpful. Unjustified opinions are one example of unhelpful declarations: there is no point in listening to them. A justified opinion can also be unhelpful, if it isn't understood, or if the reason it's justified isn't understood. Worse yet, a justified opinion, or valid knowledge, can have a negative effect on the listener.
Also, a person still needs to hold unjustified opinions on every subject; that's how decisions are bootstrapped. But voicing those opinions is usually of no use to anyone else.
Often, someone who presents an incorrect or poorly supported argument can learn from their mistake and sometimes even fix the argument if they are asked for clarification.
The solution to information being harmful out of context is not to withhold the information, but to provide the context. Teach people about biases, and that they need to inspect arguments they like for biases as well as arguments they don't like.
I don't understand the point you are making here, or the relevance of the link. What do you mean by "how the decisions are bootstrapped"? Perhaps an example would help illustrate what you are talking about.
Withholding the information is also a solution. Whether you can construct a better one for a given situation is a separate issue.
I'm talking about priors, or what passes for them at the first step of plausibility elicitation, when you consult your gut feeling on a single question of fact. Even when you decide to seek out additional information for a decision, you need to start from a sufficient expectation that the discovered information will improve your decision. Maybe you are already convinced that astrology is bunk, and don't need to research the Encyclopedia of Astrology in Twelve Volumes to improve the precision of your conclusion. Decisions like this are made often and without conscious notice; in fact, they may determine what receives conscious attention at all.
I agree that the end result would be valuable, but I think that changing norms for a whole society would be very hard.
Although it might still be easier than the alternative of raising the rationality of a whole society: being informed has higher status in society than being rational. It is more associated with being a professor, journalist or talking head, whereas rationality is more associated with being a nerd, scientist or economist.