
You have a set amount of "weirdness points". Spend them wisely.

Post author: peter_hurford 27 November 2014 09:09PM 57 points

I've heard of the concept of "weirdness points" many times before, but after a bit of searching I can't find a definitive post describing the concept, so I've decided to make one.  As a disclaimer, I don't think the evidence backing this post is all that strong and I am skeptical, but I do think it's strong enough to be worth considering, and I'm probably going to make some minor life changes based on it.

-

Chances are, if you're reading this post, you're a bit weird in some way.

No offense, of course.  In fact, I actually mean it as a compliment.  Weirdness is incredibly important.  If people weren't willing to deviate from society and hold weird beliefs, we wouldn't have had the important social movements that ended slavery and pushed back against racism, that created democracy, that expanded social roles for women, and that made the world a better place in numerous other ways.

Many of the things we now take for granted as making our current society great were once... weird.

 

Joseph Overton theorized that policy develops through six stages: unthinkable, then radical, then acceptable, then sensible, then popular, then actual policy.  We can see this happen with many policies -- currently same-sex marriage is making its way from popular to actual policy, but not too long ago it was merely acceptable, and not too long before that it was pretty radical.

Some good ideas are currently in the radical range.  Effective altruism itself is such a collection of beliefs that typical people would consider pretty radical.  Many people think donating 3% of their income is a lot, let alone the 10% that Giving What We Can asks for, or the 50%+ that some people in the community give.

And that's not all.  Others suggest that everyone become vegetarian, advocate for open borders and/or a universal basic income, call for the abolition of gendered language, want more resources put into mitigating existential risk, focus on research into Friendly AI, promote cryonics and curing death, etc.

While many of these ideas might make the world a better place if made into policy, all of these ideas are pretty weird.

 

Weirdness, of course, is a drawback.  People take weird opinions less seriously.

The absurdity heuristic is a real bias that people -- even you -- have.  If an idea sounds weird to you, you're less likely to try to believe it, even if there's overwhelming evidence.  And social proof matters -- if fewer people believe something, others will be less likely to believe it.  Lastly, don't forget the halo effect -- if one part of you seems weird, the rest of you will seem weird too!

(Update: apparently this concept is, itself, already known to social psychology as idiosyncrasy credits.  Thanks, Mr. Commenter!)

...But we can use this knowledge to our advantage.  The halo effect can work in reverse -- if we're normal in many ways, our weird beliefs will seem more normal too.  If we have a notion of weirdness as a kind of currency that we have a limited supply of, we can spend it wisely, without looking like a crank.

 

All of this leads to the following actionable principles:

Recognize you only have a few "weirdness points" to spend.  Trying to convince all your friends to donate 50% of their income to MIRI, become vegan, get a cryonics plan, and demand open borders will be met with a lot of resistance.  But -- I hypothesize -- if you pick just one of these ideas and push it, you'll have a lot more success.

Spend your weirdness points effectively.  Perhaps it's really important that people advocate for open borders.  But, perhaps, getting people to donate to developing world health would overall do more good.  In that case, I'd focus on moving donations to the developing world and leave open borders alone, even though it is really important.  You should triage your weirdness effectively the same way you would triage your donations.

Clean up and look good.  Lookism is a problem in society, and I wish people could look "weird" and still be socially acceptable.  But if you're a guy wearing a dress in public, or a punk rocker vegan advocate, recognize that you're spending your weirdness points fighting lookism, which leaves fewer weirdness points to spend promoting veganism or something else.

Advocate for more "normal" policies that are almost as good.   Of course, allocating your "weirdness points" on a few issues doesn't mean you have to stop advocating for other important issues -- just consider being less weird about it.  Perhaps universal basic income truly would be a very effective policy to help the poor in the United States.  But reforming the earned income tax credit and relaxing zoning laws would also both do a lot to help the poor in the US, and such suggestions aren't weird.

Use the foot-in-door technique and the door-in-face technique.  The foot-in-door technique involves starting with a small ask and gradually building it up, such as suggesting people donate a little bit effectively, and then gradually getting them to take the Giving What We Can Pledge.  The door-in-face technique involves making a big ask (e.g., join Giving What We Can) and then, when it's refused, substituting a smaller ask, like the Life You Can Save pledge or Try Out Giving.

Reconsider effective altruism's clustering of beliefs.  Right now, effective altruism is associated strongly with donating a lot of money and donating effectively, and less strongly with impactful career choice, veganism, and existential risk.  Of course, I'm not saying that we should drop some of these memes completely.  But maybe EA should disconnect a bit more and compartmentalize -- for example, leaving AI risk to MIRI and not talking about it much, say, on 80,000 Hours.  And maybe instead of asking people to both give more AND give more effectively, we could focus more exclusively on asking people to donate what they already give more effectively.

Evaluate the above with more research.  While I think the evidence base behind this is decent, it's not great and I haven't spent that much time developing it.  I think we should look into this more with a review of the relevant literature and some careful, targeted, market research on the individual beliefs within effective altruism (how weird are they?) and how they should be connected or left disconnected.  Maybe this has already been done some?

-

Also discussed on the EA Forum and EA Facebook group.

Comments (94)

Comment author: complexmeme 29 November 2014 05:25:52AM 26 points [-]

after a bit of searching I can't find a definitive post describing the concept

The idiom used to describe that concept in social psychology is "idiosyncrasy credits", so searching for that phrase produces more relevant material (though as far as I can tell nothing on Less Wrong specifically).

Comment author: peter_hurford 29 November 2014 05:35:38PM 2 points [-]

Wow, that's amazing! Thanks!

Comment author: Salemicus 27 November 2014 11:48:15PM 21 points [-]

This post makes some great points. As G.K. Chesterton said:

A man must be orthodox upon most things, or he will never even have time to preach his own heresy.

Fundamentally, other people's attention is a scarce resource, and you have to optimise whatever use of it you can get. Dealing with someone with a large inferential gap can be exhausting and you are liable to be tuned out if you make too many different radical points.

I would also add that part of being persuasive is being persuadable. People do not want to be lectured, and will quickly pick up if you see them as just an audience to be manipulated rather than as equals.

Comment author: Metus 29 November 2014 07:29:48PM 2 points [-]

Personally, I find that people who do plenty of things weirdly come off as trying way too hard. Depending on your environment, choose one thing that you care about and do that. I wouldn't be caught dead at a formal dinner in washed-out jeans and a dirty t-shirt, but I would be willing to experiment with neck pieces different from formal ties.

Comment author: ete 28 November 2014 12:58:22AM 11 points [-]

I believe the effect you describe exists, but I think there are two effects which make it unclear that implementing your suggestions is an overall benefit to the average reader. Firstly, to summarize your position:

Each extra weird belief you have detracts from your ability to spread other, perhaps more important, weird memes. Therefore normal beliefs should be preferred to some extent, even when you expect them to be less correct or less locally useful on an issue, in order to improve your overall effectiveness at spreading your most highly valued memes.

  1. If you have a cluster of beliefs which seem odd in general, then you are more likely to share a "bridge" belief with someone. When you meet someone who shares at least one strange belief with you, you are much more likely to seriously consider their other beliefs, because you share some common ground and are aware of their ability to find truth against social pressure. For example, an EA vegan may be vastly more able to introduce the other EA memes to a non-EA vegan than an EA non-vegan could. Since almost all people have at least some weird beliefs, and those whose weird beliefs have literally no overlap with yours are likely not good targets for you to spread positive memes to, increasing your collection of useful and justifiable weird memes may well give you more opportunities to usefully spread the memes you consider most important.

  2. Losing the absolute focus on forming an accurate map by making concessions to popularity/not standing out in too many ways seems epistemologically risky and borderline dark arts. I do agree that in some situations not advertising all your weirdness at once may be a useful strategic choice, but I am very wary of the effect that putting too much focus on this could have on your actual beliefs. You don't want to strengthen your own absurdity heuristic by accident and miss out on more weird but correct and important things.

While I can imagine situations where the advice given is correct (especially when interacting with domain-limited policymakers, or people whose likely reactions to extra weirdness you have a good read on), recommending it in general seems insufficiently justified, and I believe it would have significant drawbacks.

Comment author: Raiden 28 November 2014 04:44:56AM 5 points [-]

Regarding point 2, while it would be epistemologically risky and borderline dark arts, I think the idea is more about what to emphasize and openly signal, not what to actually believe.

Comment author: ete 28 November 2014 01:30:38PM *  5 points [-]

True, perhaps I should have been more clear in my treatment of the two, and explained how I think they can blur together unintentionally. I do think being selective with signals can be instrumentally effective, but I think it's important to be intentionally aware when you're doing that and not allow your current mask to bleed over and influence your true beliefs unduly.

Essentially I'd like this post to come with a "Do this sometimes, but be careful and mindful of the possible changes to your beliefs caused by signaling as if you have different beliefs." warning.

Comment author: Nepene 30 November 2014 04:34:24AM 2 points [-]

There is a definite likelihood that acting out a belief will cause you to believe it due to your brain poorly distinguishing signalling and true beliefs.

That can be advantageous at times. Some beliefs may be less important to you, and worthy of being sacrificed for the greater good. If you, say, believe that forcing people to wear suits is immoral and that eating meat is immoral, then it may be worth sacrificing your belief in the unethical nature of suits so you can better stop people eating animals.

A willingness to do this is beneficial for most people who want to join organizations. Organizations normally have a set of arbitrary rules on social conduct, dress, who to respect more and who less, how to deal with sickness and weakness, what media to watch, and who to escalate issues to in the event of a conflict. If you don't genuinely adopt these rules you'll find it tricky to gain much power, because people can spot those who fake them.

Comment author: Capla 09 December 2014 07:12:02PM *  2 points [-]

Some beliefs may be less important to you, and worthy of being sacrificed for the greater good. If you, say, believe that forcing people to wear suits is immoral and that eating meat is immoral, then it may be worth sacrificing your belief in the unethical nature of suits so you can better stop people eating animals.

No. I will make concessions about which beliefs to act on in order to optimize for "Goodness", but I'm highly concerned about sacrificing beliefs about the world themselves. Doing this may be beneficial in specific situations, but at a cost to your overall effectiveness in other situations across domains. Since the range of possible situations you might find yourself in is infinite, there is no way to know whether you've made a change to your model with catastrophic consequences down the line. Furthermore, we evaluate the effectiveness of strategies on the basis of the model we have, so every time your model becomes less accurate, your estimate of the best option in a given situation becomes less accurate. (Note that your confidence in your estimate may rise, fall, or stay the same, but I doubt that having a less accurate model will lead to better credence calibration.)

Allowing your beliefs to change for any reason other than to better reflect the world, only serves to make you worse at knowing how best to deal with the world.

Now, changing your values - that's another story.

Comment author: Nepene 13 December 2014 02:20:32AM 1 point [-]

You can easily model beliefs and work out if they're likely to have good or bad results. They could theoretically have any of an infinite variety of impacts, but most probably have a fairly small and limited effect. Humans have lots of beliefs; they can't all have a major impact.

For the catastrophic consequences issue, have you read this?

http://lesswrong.com/lw/ase/schelling_fences_on_slippery_slopes/

The slippery slope issue of potentially catastrophic consequences from a model can be limited by establishing arbitrary lines beforehand that you refuse to cross. Whether you should sacrifice your beliefs, as with Gandhi, depends on the value gained from the sacrifice, how valuable the sacrificed belief is to your models, and the likelihood of catastrophic failure. You can swear an oath not to cross those lines, or give valuable possessions to people to destroy if you cross them, to heavily limit the chance of catastrophic failure.

Allowing your beliefs to change for any reason other than to better reflect the world, only serves to make you worse at knowing how best to deal with the world.

Yeah, your success rate drops, but your ability to socialize can rise since irrational beliefs are how many think. If your irrational beliefs are of low importance, not likely to cause major issues, and unlikely to cause catastrophic failure they could be helpful.

Comment author: Lumifer 29 November 2014 04:38:07AM 5 points [-]

It would be helpful to point out that your post is within the context of trying to convince other people, aka memetic warfare. Your "actionable principles" serve a specific goal which you do not identify.

Comment author: peter_hurford 29 November 2014 06:44:27AM *  0 points [-]

Fair point. I concede I'm only writing here in the context of public advocacy.

Comment author: 27chaos 27 November 2014 10:05:52PM 5 points [-]

Notions of weirdness vary a lot. Also, individual instances of weirdness will be visible to different people. Both these challenge the idea we should bother having aggregated measurements of weirdness at all. People's sensitivity to weirdness also varies, sometimes in complicated ways. Some people are actually more receptive to ideas that sound weird. Other people will believe if someone is both successful and weird they must know something others don't. Others are willing to ignore weirdness if allied with it. This is all very complex.

I think our social brains already do a good job of keeping track of all these important details. Trying to consciously score different traits, beliefs, and actions for an aggregated weirdness score seems like a recipe for anxiety and disaster to me. I don't think it's worth worrying about. Inauthentic strategies like this are too hard and unpleasant to sustain.

Comment author: ChristianKl 28 November 2014 04:35:41PM *  12 points [-]

Nerds are very often too shy. They are not willing to go to the extreme. Radical feminism has a lot of influence on our society and plenty of members of that community don't hold back at all.

Bending your own views to avoid offending other people leads to being perceived as unconfident. It's not authentic. That's bad for movement building.

I think you are making a mistake if you treat the goal of every project as being about affecting public policy. Quite often you don't need a majority. It's much better to have a small group of strongly committed people than a large group that's only lukewarmly committed.

Mormons who spent 2 years doing their mission are extreme. Mormonism is growing really fast while less extreme Christian groups don't grow. Groups that advocate extreme positions give their members a feeling that they are special. They are not boring.

In the scarce attention economy of the 21st century being boring is one of the worst things you can do if you want to speak to a lot of people.

Comment author: Jiro 29 November 2014 06:43:47PM 8 points [-]

Mormon missions are not primarily there to gain converts. They are there to force the Mormon to make a commitment of time and resources to Mormonism, so that the sunk costs psychologically tie him to the religion.

(Of course, it wasn't necessarily consciously designed for this purpose, but that doesn't prevent the purpose from being served.)

Comment author: ChristianKl 29 November 2014 07:45:15PM 2 points [-]

That's part of the point. If you want strong changes in society then you need to do movement building. That means you don't focus on outsiders but on strengthening the commitment inside the movement.

Comment author: peter_hurford 29 November 2014 06:43:39AM 5 points [-]

Mormons who spent 2 years doing their mission are extreme.

Though they're only really extreme about a few things -- their Mormonism and some personal restraint (e.g., no alcohol, etc.) that serves religious purposes. They're otherwise quite normal people.

And I think religious weirdness is one of the kinds of weirdness that people see past the most easily.

I'm not saying that one shouldn't try to be extreme, but that one should (if one aims at public advocacy) try to be extreme in only a few things.

Comment author: Capla 09 December 2014 07:17:37PM 4 points [-]

personal restraint (e.g., no alcohol, etc.)

It seems borderline-literally insane to me that "personal restraint" is "extreme" and marks one as a radical.

Comment author: TheOtherDave 09 December 2014 07:31:54PM 2 points [-]

It's pretty common for groups to treat individual restraint in the context of group lack-of-restraint as a violation of group norms, though "radical" is rarely the word used. Does that seem insane to you more generally (and if so, can you say more about why)?

If not, I suspect "extreme" has multiple definitions in this discussion and would be best dropped in favor of more precise phrases.

Comment author: Capla 09 December 2014 08:39:58PM *  2 points [-]

It's pretty common for groups to treat individual restraint in the context of group lack-of-restraint as a violation of group norms, though "radical" is rarely the word used. Does that seem insane to you more generally (and if so, can you say more about why)?

Yes. That seems insane to me.

Self restraint is applied self control. It is a virtue and is something to be admired, so long as one is restraining oneself for some benefit, not needlessly (though personally, I have respect for all forms of restraint, even if they are needless, e.g. religiously motivated celibacy, in the same way I have respect for the courage of suicide bombers).

Is alcohol consumption restraint without benefit? No. Alcohol is a poison that limits one's faculties in small amounts and has detrimental health effects in large doses.

A friend was sharing with me the other day that he doesn't like the culture of... I'm not sure what to call it... knowing overindulgence? He gave the example of the half-joking veneration of bacon as something that everyone loves and always wants more of, as if to say "I know it's unhealthy, but that's why we love it so much."

I hear people say, "I don't eat healthy food", and in the culture we live in, that is an acceptable thing to say, where to me it sounds like an admission that you lack self control, but instead of acknowledging it as a problem and working on it, they gloss over it with a laugh.

I am a vegetarian. I once sat down for a meal with a friend and my sister. The friend asked my sister if she was a vegetarian. My sister said she wasn't. The friend said (again, half joking), "Good", as if vegetarianism were a character flaw: real people love meat. I confronted her about it later, and said that it bothered me. I know not everyone is a vegetarian, and it is each person's own choice to weigh the costs and benefits and decide for themselves, but there are many, many good reasons to practice some kind of meat restriction, from the ecological, to the moral, to simple health. I won't tolerate my friend acting as if not eating meat means there is something wrong with you.

It feels to me, and maybe I'm projecting, that not everyone is up for making hard choices*, but instead of owning up to that, we have built a culture that revels in overindulgence. The social pressure pushes in the wrong direction.

It's weird to not drink. It's weird to not eat meat. It's weird to put too much effort into staying healthy. It's weird to give a significant portion of your income to save lives. Those are just obviously (to me) the right things to do.

It seems to me that the way we treat smoking is about right. Mostly, we let smokers make their own choices, and don't hold those choices against them as individuals. However, there is also a social undercurrent of, "smoking is disgusting" or at least "smoking is stupid; if you don't smoke, don't start." There is a mild social pressure for people to stop smoking, as opposed to someone getting weird looks if they turn down a cigarette (the way I do now, if I turn down a cookie).

This is a subjective, semi-rant and I'm expressing my opinion. Consider this an elaboration on the off-hand comment above, and feel free to challenge me if I'm wrong.

  • I'm self conscious about the fact that I'm implicitly saying that I'm strong enough to make those hard choices, but I'm saying it anyway.
Comment author: TheOtherDave 09 December 2014 09:20:51PM 1 point [-]

Consider this an elaboration on the off-hand comment above,

(nods) Which is exactly what I asked for; thank you.

feel free to challenge me if I'm wrong

I think you're using a non-standard definition of "insane," but not an indefensible one.

Comment author: Lumifer 09 December 2014 07:39:31PM 1 point [-]

It seems borderline-literally insane to me that "personal restraint" is "extreme" and marks one as a radical.

Depends on what kind. The one that runs counter to the prevailing social norms does mark one as a radical.

You can treat incels as people who practice "personal restraint" :-/

Comment author: Capla 09 December 2014 08:41:57PM 3 points [-]

I think these fall under the group that I admire the way I admire the courage of suicide bombers. I admire the dedication, but I think they are insane for other reasons.

Comment author: ChristianKl 29 November 2014 04:47:48PM 0 points [-]

Mormon polygamy is not normal. Mormons donating 10% of their income also isn't normal. Mormonism has enough impact on a person that some Mormons can identify other Mormons.

And I think religious weirdness is one of the kinds of weirdness that people see past the most easily.

The thing that distinguishes religious weirdness is that it comes from a highly motivated place and isn't a random whim.

if one aims at public advocacy

I'm not exactly sure what you mean with "public advocacy".

Comment author: Adele_L 30 November 2014 12:03:06AM 6 points [-]

Mormons don't practice polygamy anymore, and they haven't for a long time (except for small 'unofficial' groups). Most Mormons I know feel pretty weird about it themselves.

Comment author: peter_hurford 29 November 2014 05:47:00PM 2 points [-]

Mormon polygamy is not normal. Mormons donating 10% of their income also isn't normal.

Good point. But, if I recall correctly, don't they go to a good amount of length to not talk about these things a lot?

-

The thing that distinguishes religious weirdness is that it comes from a highly motivated place and isn't a random whim.

I don't think it's just a highly motivated place, but rather a highly motivated place that other people can easily verify as highly motivated and relate to.

-

I'm not exactly sure what you mean with "public advocacy".

Bringing up an ingroup idea with people outside your ingroup.

For example, I'd love it if people ate less meat. So I might bring that up with people, as the topic arises, and advocate for it (i.e., tell them why I think not eating meat is better). I still envision it as a two-way discussion where I'm open to the idea of being wrong, but I'd like them to be less affected by certain biases (like weirdness) if possible.

Comment author: ChristianKl 29 November 2014 08:14:05PM *  2 points [-]

I don't think a conversation at a birthday of a friend qualifies as "public" in the traditional sense.

So I might bring that up with people, as the topic arises, and advocate for it (i.e., tell them why I think not eating meat is better).

I think that's seldom the most effective way to change people through personal conversation. It makes much more sense to ask a lot of questions and target your communication at the other person.

Status also matters. Sometimes doing something weird lowers your status; other times it raises it. It always makes sense to look at the individual situation.

Comment author: peter_hurford 29 November 2014 10:12:53PM 1 point [-]

I don't think a conversation at a birthday of a friend qualifies as "public" in the traditional sense.

What did you have in mind? I think this advice applies even more so to "public" venues in the traditional sense (e.g., blogging for general audiences).

Comment author: ilzolende 28 November 2014 01:26:23AM 4 points [-]

I think it's important to consider the varying exchange rates, as well as the possible exchange options, when choosing how to spend your weirdness points.

Real example: Like enough other people on this website to make it an even less representative sample of the general population, I'm autistic, so spending a weirdness point on being openly such is useful, not because it's the best possible way to promote disability rights, but rather because I can save a lot of willpower that I need for other tasks that way.

Fake example: The Exemplars, a band popular with the teenage set, are rationalists who want to promote a political cause. However, none of the causes they care about are standard causes for bands. As teenagers are generally more interested in the end of the world than, for example, cryonics, they decide to sing about x-risk on their East Examplestan tour. This still makes them look just about as weird as if they sang about cryonics, but teenagers are more interested in x-risk, so they get better results. (Until the Church of Driving 50 In A 30 Mile Per Hour Zone gets them convicted of blasphemy, that is.)

Comment author: Kaj_Sotala 28 November 2014 10:23:58AM 15 points [-]

I agree with the general gist of the post, but I would point out that different groups consider different things weird, and have differing opinions about what weirdness is a bad thing.

To use your "a guy wearing a dress in public" example - I do this occasionally, and gauging from the reactions I've seen so far, it seems to earn me points among the liberal, socially progressive crowd. My general opinions and values are such that this is the group that would already be the most likely to listen to me, while the people who are turned off by such a thing would be disinclined to listen to me anyway.

I would thus suggest not trying to limit your weirdness across the board, but rather choosing a target audience and only limiting the kinds of weirdness that this group would consider freakish or negative, while being less concerned about the kinds of weirdness that your target audience considers positive. Weirdness that's considered positive by your target audience may even help your case.

Comment author: Metus 29 November 2014 07:27:16PM 7 points [-]

To make this picture a bit more colourful: I love suits; they look great on me. But I will be damned if I wear suits to university, for people will laugh at me and not take me seriously, because to the untrained eye all suits are business suits. On the other hand, hanging around in a coffee place at any odd time of the day is completely normal to the same group.

Contrast this with the average person working in an environment where they wear a suit: the suit could help me signal that I am on their side, and being in a coffee place at any odd time would then become my cause to be accepted.

The lesson then is to pick the tribe you are in, as you will know their norms best and adhere to them anyhow, and then pick the cause that will produce the most utility within that tribe. It just so happens that there is the extremely large tribe "the public", which sometimes leads people to ignore that they can influence other really big tribes -- like Europeans, British, Londoners, and then the members of their boroughs, to divide by region.

Comment author: Philip_W 09 February 2015 12:24:15AM 3 points [-]

I think I might have been a datapoint in your assessment here, so I feel the need to share my thoughts on this. I would consider myself socially progressive and liberal, and I would hate not being included in your target audience, but for me your wearing cat ears to the CFAR workshop cost you weirdness points that you later earned back by appearing smart and sane in conversations, by acceptance by the peer group, acclimatisation, etc.

I responded positively because it fell within the 'quirky and interesting' range, but I don't think I would have taken you as seriously on subjectively weird political or social opinions. It is true that the cat ears are probably a lot less expensive for me than cultural/political out-group weirdness signals, like a military haircut. It might be a good way to buy other points, so positive overall, but that depends on the circumstances.

Comment author: Kaj_Sotala 09 February 2015 08:30:58AM 0 points [-]

Thank you! I appreciate the datapoint.

Comment author: [deleted] 28 November 2014 03:44:31PM 4 points [-]

This carries the slight problem that people tend to get offended when they realize you're explicitly catering to an audience. If I talked about the plight of the poor and meritocracy to liberals and about responsibility and family to conservatives, advocating the exact same position to each, and then each group found out about the speech I gave to the other, they would both start thinking of me as a duplicitous snake. They might start yelling about "Eli Sennesh's conspiracy to pass a basic income guarantee" or something like that: my policy would seem "eviler" for being able to be upheld from seemingly disjoint perspectives.

Comment author: Kaj_Sotala 28 November 2014 04:35:51PM 2 points [-]

Right, I wouldn't advocate having contradictory presentations, but rather choosing a target audience that best fits your personality and strengths, and then sticking to that.

Comment author: kilobug 29 November 2014 11:37:51AM 9 points [-]

Interesting post (upvoted), but I would add one "correction": the amount of "weirdness points" isn't completely set; there are ways to get more of them, especially by being famous, doing something positive, or helping people. For example, by writing a very popular fanfiction (HPMOR), Eliezer earned additional weirdness points to spend.

Or on my own level, I noticed that by being efficient in my job and helpful with my workmates, I'm allowed a higher number of "weirdness points" before my workmates start considering me a loonie. But then you have to be very careful, because weirdness points earned within a group (say, my workmates) don't extend outside of the group.

Comment author: ike 30 November 2014 05:18:49AM 5 points [-]

For example, by writing a very popular fanfiction (HPMOR)

For anyone who hasn't read HP and thinks fantasy is weird, he lost points for that.

One way to get more points is to listen to other people's weird ideas. In fact, if someone else proposes a weird idea that you already agree with, it may be a good idea not to let on, but publicly "get convinced", to gain points. (Does that count as Dark Arts?)

Comment author: dxu 30 November 2014 05:38:08AM *  4 points [-]

I have actually thought of that, but in relation to a different problem: not that of seeming less "weird", but that of convincing someone of an unpopular idea. It seems like the best way to convince people of something is to act like you're still in the process of being convinced yourself; for instance, I don't remember where, but I do remember reading an anecdote on how someone was able to convince his girlfriend of atheism while in a genuine crisis of faith himself. Incidentally, I should emphasize that his crisis of faith was genuine at the time--but it should work even if it's not genuine, as long as the facade is convincing. I theorize that this may be due to in-group affiliation, i.e. if you're already sure of something and trying to convince me, then you're an outsider pushing an agenda, but if you yourself are unsure and are coming to me for advice, you're on "my side", etc. It's easy to become entangled in just-so stories, so obviously take all of this speculation with a generous helping of salt, but it seems at least worth a try. (I do agree, however, that this seems borderline Dark Arts, so maybe not that great of an idea, especially if you value your relationship with that person enough to care if you're found out.)

Comment author: RichardKennaway 30 November 2014 08:02:40AM 5 points [-]

I should emphasize that his crisis of faith was genuine at the time--but it should work even if it's not genuine, as long as the facade is convincing.

This is called "concern trolling".

I do agree, however, that this seems borderline Dark Arts

It isn't "borderline Dark Arts", it's straight-out lying.

It should work ... as long as the facade is convincing

This imagines the plan working, and uses that as argument for the plan working.

Comment author: dxu 30 November 2014 05:11:51PM *  3 points [-]

This is called "concern trolling".

I was not aware that it had a name; thank you for telling me.

It isn't "borderline Dark Arts", it's straight-out lying.

Agreed. The question, however, is whether or not this is sometimes justified.

This imagines the plan working, and uses that as argument for the plan working.

Well, no. It assumes that the plan doesn't fall prey to an obvious failure mode, and suggests that if it does not, it has a high likelihood of success. (The idea being that if failure mode X is avoided, then the plan should work, so we should be careful to avoid failure mode X when/if enacting the plan.)

Comment author: RichardKennaway 30 November 2014 06:55:55PM 4 points [-]

This imagines the plan working, and uses that as argument for the plan working.

Well, no. It assumes that the plan doesn't fall prey to an obvious failure mode

The failure mode (people detecting the lie) is what it would be for this plan to fail. It's like the empty sort of sports commentary that says "if our opponents don't get any more goals than us, we can't lose", or the marketing plan that amounts to "if we get just 0.001% of this huge market, we'll be rich."

See also. Lying is hard, and likely beyond the capability of anyone who has just discovered the idea "I know, why not just lie!"

Comment author: dxu 30 November 2014 07:16:21PM *  3 points [-]

That the plan would fail if the lie is detected is not in dispute, I think. However, it is, in my opinion, a relatively trivial failure mode, where "trivial" is meant to be taken in the sense that it is obvious, not that it is necessarily easy to avoid. For instance, equations of the form a^n + b^n = c^n have trivial solutions of the form (a,b,c) = (0,0,0), but those are not interesting. My original statement was meant to be applied more as a disclaimer than anything else, i.e. "Well obviously this is an easy way for the plan to fail, but getting past that..." The reason for this was that there might be more intricate/subtle failure modes that I've not yet thought of, and my statement was intended more as an invitation to think of some of these less trivial failure modes than as an argument for the plan's success. This, incidentally, is why I think your analogies don't apply; the failure modes that you mention in those cases are so broad as to be considered blanket statements, which prevents the existence of more interesting failure modes. A better statement in your sports analogy, for example, might be, "Well, if our star player isn't sick, we stand a decent chance of winning," with the unstated implication being that of course there might be other complications independent of the star player being sick. (Unless, of course, you think the possibility of the lie being detected is the only failure mode, in which case I'd say you're being unrealistically optimistic.)

Also, it tends to be my experience that lies of omission are much easier to cover up than explicit lies, and the sort suggested in the original scenario seem to be closer to the former than to the latter. Any comments here?

(I also think that the main problem with lying from a moral perspective is not just that it causes epistemic inaccuracy on the part of the person being lied to, but that it causes inaccuracies in such a way that it interferes with them instrumentally. Lying omissively about one's mental state, which is unlikely to be instrumentally important anyway, in an attempt to improve the other person's epistemic accuracy with regard to the world around them, a far more instrumentally useful task, seems like it might actually be morally justifiable.)

Comment author: Lumifer 30 November 2014 11:56:03PM *  3 points [-]

Lying also does heavy damage to one's credibility. The binary classification of other people into "honest folk" and "liars" is quite widespread in the real world. You get classified into "liars", pretty hard to get out of there.

Comment author: dxu 01 December 2014 04:58:25AM *  2 points [-]

Well, you never actually say anything untrue; you're just acting uncertain in order to have a better chance of getting through to the other person. It seems intuitively plausible that the reputational effects from that might not be as bad as the reputational effects that would come from, say, straight-out lying; I accept that this may be untrue, but if it is, I'd want to know why. Moreover, all of this is contingent upon you being found out. In a scenario like this, is that really that likely? How is the other person going to confirm your mental state?

Comment author: Lumifer 01 December 2014 07:59:52AM 1 point [-]

but if it is, I'd want to know why

YMMV, of course, but I think what matters is the intent to deceive. Once it manifests itself, the specific forms the deception takes do not matter much (though their "level" or magnitude does).

How is the other person going to confirm your mental state?

This is not a court of law, no proof required -- "it looks like" is often sufficient, if only for direct questions which will put you on the spot.

Comment author: RichardKennaway 01 December 2014 12:29:43PM -2 points [-]

Moreover, all of this is contingent upon you being found out. In a scenario like this, is that really that likely?

Yes. It is.

Comment author: ChristianKl 01 December 2014 01:17:11PM 1 point [-]

I should emphasize that his crisis of faith was genuine at the time--but it should work even if it's not genuine, as long as the facade is convincing.

Most people are not able to produce the kind of strength of emotion that comes with a genuine crisis of faith via conscious choice. Pretending to have it might come off as creepy, even if the other person can't exactly pinpoint what's wrong.

Comment author: dxu 01 December 2014 08:57:13PM *  3 points [-]

Fair enough. Are there any subjects about which there might not be as high an emotional backlash? Cryonics, maybe? Start off acting unconvinced and then visibly think about it over a period of time, coming to accept it later on. That doesn't seem like a lot of emotion is involved; it seems entirely intellectual, and the main factor against cryonics is the "weirdness factor", so if there's someone alongside you getting convinced, it might make it easier, especially due to conformity effects.

Comment author: ChristianKl 01 December 2014 09:53:58PM 0 points [-]

The topic of cryonics is about dealing with death. There a lot of emotion involved for most people.

Comment author: dxu 02 December 2014 04:12:35AM *  2 points [-]

It's true that cryonics is about death, but I don't think that necessarily means there's "a lot of emotion involved". Most rejections of cryonics that I've seen seem to be pretty intellectual, actually; there's a bunch of things like cost-benefit analysis and probability estimates going on, etc. I personally think it's likely that there is some motivated cognition going on, but I don't think it's due to heavy emotions. As I said in my earlier comment, I think that the main factor against cryonics is the fact that it seems "weird", and therefore the people who are signed up for it also seem "weird". If that's the case, then it may be to the advantage of cryonics advocates to place themselves in the "normal" category first by acting skeptical of a crankish-sounding idea, before slowly getting "convinced". Compare that approach to the usual approach: "Hey, death sucks, wanna sign up to get your head frozen so you'll have a chance at getting thawed in the future?" Comparatively speaking, I think that the "usual" approach is significantly more likely to get you landed in the "crackpot" category.

Comment author: ChristianKl 02 December 2014 12:05:14PM 0 points [-]

Most rejections of cryonics that I've seen seem to be pretty intellectual, actually; there's a bunch of things like cost-benefit analysis and probability estimates going on, etc

That's really not how most people make their decisions.

Compare that approach to the usual approach: "Hey, death sucks, wanna sign up to get your head frozen so you'll have a chance at getting thawed in the future?"

There are plenty of ways to tell someone about cryonics that don't involve a direct plea for them to take action.

Comment author: dxu 04 December 2014 05:05:35PM 3 points [-]

That's really not how most people make their decisions.

Maybe it's not how most people make their decisions, but I have seen a significant number of people who do reject cryonics on a firmly intellectual basis, both online and in real life. I suppose you could argue that it's not their true rejection (in fact, it almost certainly isn't), but even so, that's evidence against heavy emotions playing a significant part in their decision process.

There are plenty of ways to tell someone about cryonics that don't involve a direct plea for them to take action.

Yes, but most of them still suffer from the "weirdness factor".

Comment author: peter_hurford 29 November 2014 05:34:22PM 0 points [-]

Or on my own level, I noticed that by being efficient in my job and helpful with my workmates, I'm allowed a higher number of "weirdness points" before having my workmates start considering me as a loonie.

Seems like another way you're taking advantage of a positive halo effect!

Comment author: Brian_Tomasik 13 December 2014 07:17:15AM *  3 points [-]

Thanks, Peter. :) I agree about appearing normal when the issue is trivial. I'm not convinced about minimizing weirdness on important topics. Some counter-considerations:

  • People like Nick Bostrom seem to acquire prestige by taking on many controversial ideas at once. If Bostrom's only schtick were anthropic bias, he probably wouldn't have reached FP's top 100 thinkers.
  • Focusing on only one controversial issue may make you appear single-minded, like "Oh, that guy only cares about X and can't see that Y and Z are also important topics."
  • If you advocate many things, people can choose the one they agree with most or find easiest to do.
Comment author: dspeyer 28 November 2014 03:56:21PM 3 points [-]

Have we worked through the game theory here? It feels like negotiating with terrorists.

Comment author: Strange7 28 November 2014 04:34:16PM 3 points [-]

My objection is to the 'set amount.' What about the Bunny Ears Lawyer trope, where someone purchases additional weirdness points with a track record of outstanding competence?

Comment author: VAuroch 01 December 2014 05:56:00AM -1 points [-]

The Bunny Ears Lawyer trope is one of those that never shows up in real life.

Comment author: Lumifer 01 December 2014 08:16:24PM 3 points [-]

Sure they do, you just need to look in the right places. Speaking of lawyers, one place would be the tax department of a top-tier New York law firm. Another place would be sysadmins of small companies.

Comment author: dxu 01 December 2014 06:00:17AM 2 points [-]

Not to the same extent, maybe, but I was under the impression that it does occur. I'm not willing to go check right now due to memetic hazard, but doesn't TV Tropes have a page full of real life examples?

Comment author: VAuroch 01 December 2014 08:07:15PM 0 points [-]

Nope. No Real Life entry for that trope exists.

Comment author: ike 01 December 2014 08:22:05PM 2 points [-]

It appears like it used to, as it is referenced in the text, and can be found on a fork of tvtropes here.

Comment author: arundelo 02 December 2014 03:40:54PM 2 points [-]

My favorite example from that page is Paul Erdős, who spent his life couch-surfing from one mathematical collaborator to the next.

Comment author: someonewrongonthenet 29 November 2014 08:49:49PM *  6 points [-]

Clean up and look good. Lookism is a problem in society, and I wish people could look "weird" and still be socially acceptable. But if you're a guy wearing a dress in public, or some punk-rocker vegan advocate, recognize that you're spending your weirdness points fighting lookism, which means fewer weirdness points to spend promoting veganism or something else.

Caveat - if people already know you are well liked and popular, the weirdness actually functions as counter-signalling, which makes you more popular - similar to how teasing strengthens friendships. You're signalling, "look, I'm so well liked that I can afford to be weird." If you're surrounded by chattering friends, or a straight-A student, or the most skilled person in your field, people see the fact that you go out in your pajamas, suffer from Einstein hair, or are covered in tattoos as a sign that you are unconcerned about social status, which in turn raises your status.

This is something I learned by initially not caring due to social ineptitude, and then slowly starting to care with age... and then noticing that caring about appearance strengthened my reception among total strangers, but not caring about appearance strengthened my reception among friends and among strangers who could plainly see that I had tons of friends.

In another situation, if I was doing poorly in class, dressing like a slob made me look bad... but if I was an impeccable student, it came off as an eccentric-genius sort of thing. Weirdness seems to just magnify whatever you're already seen as.

I think this applies more to neutral stuff like appearance than to deep stuff like beliefs - but it does generally hold true with weirdness in social behavior. Even with ideas, though - I suspect the ideas that made Kurzweil sound kooky would make Einstein seem even more visionary.

Comment author: [deleted] 28 November 2014 03:41:22PM 2 points [-]

Weirdness is incredibly important. If people weren't willing to deviate from society and hold weird beliefs, we wouldn't have had the important social movements that ended slavery and pushed back against racism, that created democracy, that expanded social roles for women, and that made the world a better place in numerous other ways.

Well, there is that. But there's also just the fact that being weird is what makes people interesting and fun, the sort of people I want to hang out with.

While many of these ideas might make the world a better place if made into policy, all of these ideas are pretty weird.

I think we need another adjective for the category of ideas that make people uncomfortable due to being substantially morally superior to the status quo. (And I don't even do all those things or believe all those particular ideas are good ones.)

...But we can use this knowledge to our advantage. The halo effect can work in reverse -- if we're normal in many ways, our weird beliefs will seem more normal too. If we have a notion of weirdness as a kind of currency that we have a limited supply of, we can spend it wisely, without looking like a crank.

Well, when propagandizing, yes.

But maybe EA should disconnect a bit more and compartmentalize -- for example, leaving AI risk to MIRI and not talking about it much on, say, 80,000 Hours.

Yes, this is a very good example of allocating scarce weirdness points in one's propaganda.

Comment author: hawkice 28 November 2014 02:48:23AM 2 points [-]

It might be worth emphasizing the difference between persuading people and being right. The kind of people who care about weirdness points are seldom the ones contributing good new data to any question of fact, nor those posing the best reasoning for judgments of value. I appreciate the impulse to try to convince people of things, but convincing people is extremely hard. I'm not Noam Chomsky; therefore, I have other things to do aside from thinking and arguing with people. And if I have to do one of those two worse in order to save time, I choose to dump the 'convince people' stat and load up on clear thinking.

Comment author: Unknowns 28 November 2014 04:01:32AM 2 points [-]

Weirdness points are evidence of being wrong, since someone who holds positions different from everyone else on almost every point is probably willfully contrarian. So people who care about truth will also care about weirdness points; if someone is too weird (e.g. timecube), it is probably not worth your time listening to them.

Comment author: dxu 28 November 2014 06:27:11AM *  3 points [-]

I at least somewhat disagree with this. Weirdness is not a reliable measure of truth; in fact, I'd argue that it may even slightly anti-correlate with truth (but only slightly--it's not like it anti-correlates so well that you'd be able to get a good picture of reality out of it by reversing it, mind you). After all, not every change is an improvement, but every improvement is a change. Every position that seems like common sense to us nowadays was once considered "weird" or "unusual". So yeah, dismissing positions on the basis of weirdness alone doesn't seem like that great of an idea; see the absurdity heuristic for further details.

Also, people often have reasons for discrediting things outside of striving for epistemic accuracy. Good people/causes can often be cast in a bad light by anyone who doesn't like them; for instance*, RationalWiki's article on Eliezer makes him so weird-sounding as to be absolutely cringeworthy to anyone who actually knows him, and yet plenty of people might read it and be turned off by the claims, just like they're turned off from stuff like Time Cube.

*It is not my intention to start a flame-war or to cause a thread derailment by bringing up RW. I am aware that this sort of thing happens semi-frequently on LW, which is why I am stating my intentions here in advance. I would ask that anyone replying to this comment not stray too far from the main point, and in particular please do not bring up any RW vendettas. I am not a moderator, so obviously I have no power to enforce this request, but I do think that my request would prevent any derailment or hostility if acceded to. Thank you.

Comment author: hawkice 29 November 2014 12:20:02AM *  3 points [-]

I think all three of us are right and secretly all agree.

(1) Weirdness points are Bayesian evidence of being wrong (surely Time Cube doesn't seem more accurate because no one believes it). Normal stuff is wrong quite a lot, but not more wrong than guessing.

(2) weirdness points can never give you enough certainty to dismiss an issue completely. Time Cube is wrong because it is Time Cube (read: insane ramblings), not because it's unpopular. Of course we don't have a duty to research all unlikely things, but if we already are thinking about it, "it's weird" isn't a good/rational place to stop, unless you want to just do something else, like eat a banana or go to the park or something.

and, critically, (3) If you don't have evidence enough to completely swamp and replace the bayesian update from weirdness points, you really don't have enough evidence to contribute a whole lot to any search for truth. That's what I was getting at. It's also pretty unlikely that the weirdness that "weirdness points" refer to would be unknown to someone you're talking with.

Comment author: Will_Lugar 16 April 2015 01:16:07AM 1 point [-]

Ozy Frantz wrote a thoughtful response to the idea of weirdness points. Not necessarily disagreeing, but pointing out serious limitations in the idea. Peter Hurford, I think you'll appreciate their insights whether you agree or not.

https://thingofthings.wordpress.com/2015/04/14/on-weird-points/

Comment author: [deleted] 04 December 2014 02:33:55AM 1 point [-]

Weirdness is a scarce resource with respect to ourselves? Great! Does that mean that we'd benefit from cooperating such that we all take on different facets of the weirder whole, like different faces of a PR operation?

Comment author: Nornagest 04 December 2014 03:32:36AM 2 points [-]

People tend to model organizations as agents, and I expect weirdness in an org's public-facing representatives would be more salient than normality. That implies that representatives' weirdness would be taken as cumulative rather than exclusive.

So, no.

Comment author: Jiro 29 November 2014 06:46:49PM *  -2 points [-]

Use the foot-in-door technique and the door-in-face technique

Using tactics intentionally designed to appeal to people's biases is dark arts. If you try these, you completely deserve having rationalists tell you "Sorry, I've been trying to remove my biases, not encourage them. Go away until you can be more honest."

Comment author: [deleted] 03 August 2015 07:03:14AM *  0 points [-]

Lots I agree with here. I was surprised to see basic income in your clustering above. As much as I think Cubans are the ones doing socialism wrong, and everyone doing socialism less, like Venezuela, isn't socialist enough, I'm right-wing and mindkilled enough to have rejected basic income using general right-wing arguments and assumptions, until I read the consistent run of positive examples on the Wikipedia page. The straw that broke the camel's back was that there is right-wing support for basic income. That being said, I'm confident that I would pass ideological Turing tests.

Comment author: Jiro 03 August 2015 03:07:15PM 2 points [-]

It is generally a bad idea to change your views based on a Wikipedia page. Particularly a Wikipedia page on a politically charged subject. What you see may only mean that nobody happened to stop by the page who was willing to add the negative examples.

Also, be careful that you don't read the article as saying more than it is actually saying. It says that "several people" on the right supported it. Great, at least two, and both of them from far enough in the past that "right wing" doesn't mean what it means today.

Comment author: Good_Burning_Plastic 03 August 2015 08:48:49PM 1 point [-]

It is generally a bad idea to change your views based on a Wikipedia page.

Depends on how much you knew about the topic to begin with.

Comment author: LawrenceC 03 August 2015 03:43:41PM 1 point [-]

That being said, I'm confident that I would pass ideological turing tests.

Cool! You can try taking them here: http://blacker.caltech.edu/itt/