ike comments on You have a set amount of "weirdness points". Spend them wisely. - Less Wrong

Post author: peter_hurford, 27 November 2014 09:09PM (55 points)


Comment author: ike 30 November 2014 05:18:49AM 5 points

For example, by writing a very popular fanfiction (HPMOR)

For anyone who hasn't read HP and thinks fantasy is weird, he lost points for that.

One way to get more points is to listen to other people's weird ideas. In fact, if someone else proposes a weird idea that you already agree with, it may be a good idea not to let on, but publicly "get convinced", to gain points. (Does that count as Dark Arts?)

Comment author: dxu 30 November 2014 05:38:08AM * 4 points

I have actually thought of that, but in relation to a different problem: not that of seeming less "weird", but that of convincing someone of an unpopular idea. It seems like the best way to convince people of something is to act like you're still in the process of being convinced yourself; for instance, I don't remember where, but I do remember reading an anecdote about how someone was able to convince his girlfriend of atheism while in a genuine crisis of faith himself. Incidentally, I should emphasize that his crisis of faith was genuine at the time--but it should work even if it's not genuine, as long as the facade is convincing.

I theorize that this may be due to in-group affiliation, i.e. if you're already sure of something and trying to convince me, then you're an outsider pushing an agenda, but if you yourself are unsure and are coming to me for advice, you're on "my side", etc. It's easy to become entangled in just-so stories, so obviously take all of this speculation with a generous helping of salt, but it seems at least worth a try. (I do agree, however, that this seems borderline Dark Arts, so maybe not that great an idea, especially if you value your relationship with that person enough to care if you're found out.)

Comment author: RichardKennaway 30 November 2014 08:02:40AM 5 points

I should emphasize that his crisis of faith was genuine at the time--but it should work even if it's not genuine, as long as the facade is convincing.

This is called "concern trolling".

I do agree, however, that this seems borderline Dark Arts

It isn't "borderline Dark Arts", it's straight-out lying.

It should work ... as long as the facade is convincing

This imagines the plan working, and uses that as argument for the plan working.

Comment author: dxu 30 November 2014 05:11:51PM * 3 points

This is called "concern trolling".

I was not aware that it had a name; thank you for telling me.

It isn't "borderline Dark Arts", it's straight-out lying.

Agreed. The question, however, is whether or not this is sometimes justified.

This imagines the plan working, and uses that as argument for the plan working.

Well, no. It assumes that the plan doesn't fall prey to an obvious failure mode, and suggests that if it does not, it has a high likelihood of success. (The idea being that if failure mode X is avoided, then the plan should work, so we should be careful to avoid failure mode X when/if enacting the plan.)

Comment author: RichardKennaway 30 November 2014 06:55:55PM 4 points

This imagines the plan working, and uses that as argument for the plan working.

Well, no. It assumes that the plan doesn't fall prey to an obvious failure mode

The failure mode (people detecting the lie) is what it would be for this plan to fail. It's like the empty sort of sports commentary that says "if our opponents don't get any more goals than us, we can't lose", or the marketing plan that amounts to "if we get just 0.001% of this huge market, we'll be rich."

See also. Lying is hard, and likely beyond the capability of anyone who has just discovered the idea "I know, why not just lie!"

Comment author: dxu 30 November 2014 07:16:21PM * 3 points

That the plan would fail if the lie is detected is not in dispute, I think. However, it is, in my opinion, a relatively trivial failure mode, where "trivial" is meant in the sense that it is obvious, not that it is necessarily easy to avoid. For instance, equations of the form a^n + b^n = c^n have trivial solutions of the form (a,b,c) = (0,0,0), but those are not interesting. My original statement was meant more as a disclaimer than anything else, i.e. "Well, obviously this is an easy way for the plan to fail, but getting past that..." The reason is that there might be more intricate/subtle failure modes that I've not yet thought of, and my statement was intended more as an invitation to think of some of those less trivial failure modes than as an argument for the plan's success.

This, incidentally, is why I think your analogies don't apply: the failure modes you mention in those cases are so broad as to be blanket statements, which leaves no room for more interesting failure modes. A better statement in your sports analogy, for example, might be, "Well, if our star player isn't sick, we stand a decent chance of winning," with the unstated implication being that of course there might be other complications independent of the star player being sick. (Unless, of course, you think the possibility of the lie being detected is the only failure mode, in which case I'd say you're being unrealistically optimistic.)

Also, it tends to be my experience that lies of omission are much easier to cover up than explicit lies, and the sort suggested in the original scenario seems to be closer to the former than to the latter. Any comments here?

(I also think that the main problem with lying, from a moral perspective, is not just that it causes epistemic inaccuracy on the part of the person being lied to, but that it causes inaccuracies in such a way that it interferes with them instrumentally. Lying by omission about one's own mental state, which is unlikely to be instrumentally important anyway, in an attempt to improve the other person's epistemic accuracy with regard to the world around them, a far more instrumentally useful matter, seems like it might actually be morally justifiable.)

Comment author: Lumifer 30 November 2014 11:56:03PM * 3 points

Lying also does heavy damage to one's credibility. The binary classification of other people into "honest folk" and "liars" is quite widespread in the real world. Once you get classified as a "liar", it's pretty hard to get out of there.

Comment author: dxu 01 December 2014 04:58:25AM * 2 points

Well, you never actually say anything untrue; you're just acting uncertain in order to have a better chance of getting through to the other person. It seems intuitively plausible that the reputational effects from that might not be as bad as the reputational effects that would come from, say, straight-out lying; I accept that this may be untrue, but if it is, I'd want to know why. Moreover, all of this is contingent upon you being found out. In a scenario like this, is that really that likely? How is the other person going to confirm your mental state?

Comment author: Lumifer 01 December 2014 07:59:52AM 1 point

but if it is, I'd want to know why

YMMV, of course, but I think what matters is the intent to deceive. Once it manifests itself, the specific forms the deception takes do not matter much (though their "level" or magnitude does).

How is the other person going to confirm your mental state?

This is not a court of law, no proof required -- "it looks like" is often sufficient, if only for direct questions which will put you on the spot.

Comment author: dxu 01 December 2014 09:04:48PM * 2 points

This is not a court of law, no proof required -- "it looks like" is often sufficient, if only for direct questions which will put you on the spot.

Well, yes, but are they really going to jump right to "it looks like" without any prior evidence? That seems like a major case of privileging the hypothesis. I mean, if you weren't already primed by this conversation, would you automatically think "They might be lying about being unconvinced" if someone started saying something skeptical about, say, cryonics? The only way I could see that happening is if you let something slip, and when the topic in question is your own mental state, it doesn't sound too hard to keep the fact that you already believe something concealed. It's just like passing the Ideological Turing Test, in a way.

Comment author: RichardKennaway 01 December 2014 12:29:43PM -2 points

Moreover, all of this is contingent upon you being found out. In a scenario like this, is that really that likely?

Yes. It is.

Comment author: dxu 01 December 2014 08:58:38PM 3 points

Yes. It is.

That's not very helpful, though. Could you go into specifics?

Comment author: milindsmart 09 January 2015 06:35:28PM 0 points

It sounds like you're implying that most lies are easily found out, and that consequently most unchallenged statements are true.

That's really, really stretching my capacity to believe. Either you're unique in this ability, or you're committing the typical mind fallacy, i.e. assuming that other people are at best only as good at lying as you are at sniffing lies out.

Comment author: ChristianKl 01 December 2014 01:17:11PM 1 point

I should emphasize that his crisis of faith was genuine at the time--but it should work even if it's not genuine, as long as the facade is convincing.

Most people are not able to summon, by conscious choice, the kind of strong emotions that come with a genuine crisis of faith. Pretending to have them might come off as creepy even if the other person can't exactly pinpoint what's wrong.

Comment author: dxu 01 December 2014 08:57:13PM * 3 points

Fair enough. Are there any subjects where the emotional backlash might not be as strong? Cryonics, maybe? Start off acting unconvinced and then visibly think about it over a period of time, coming to accept it later on. That doesn't seem to involve a lot of emotion; it seems entirely intellectual, and the main factor against cryonics is the "weirdness factor", so having someone alongside you getting convinced might make it easier, especially due to conformity effects.

Comment author: ChristianKl 01 December 2014 09:53:58PM 0 points

The topic of cryonics is about dealing with death. There's a lot of emotion involved for most people.

Comment author: dxu 02 December 2014 04:12:35AM * 2 points

It's true that cryonics is about death, but I don't think that necessarily means there's "a lot of emotion involved". Most forms of rejection of cryonics that I've seen seem to be pretty intellectual, actually; there's a bunch of things like cost-benefit analysis and probability estimates going on, etc. I personally think it's likely that there is some motivated cognition going on, but I don't think it's driven by heavy emotions. As I said in my earlier comment, I think that the main factor against cryonics is the fact that it seems "weird", and therefore the people who are signed up for it also seem "weird". If that's the case, then it may be to the advantage of cryonics advocates to place themselves in the "normal" category first by acting skeptical of a crankish-sounding idea, before slowly getting "convinced".

Compare that approach to the usual approach: "Hey, death sucks, wanna sign up to get your head frozen so you'll have a chance at getting thawed in the future?" Comparatively speaking, I think the "usual" approach is significantly more likely to land you in the "crackpot" category.

Comment author: ChristianKl 02 December 2014 12:05:14PM 0 points

Most forms of rejection of cryonics that I've seen seem to be pretty intellectual, actually; there's a bunch of things like cost-benefit analysis and probability estimates going on, etc.

That's really not how most people make their decisions.

Compare that approach to the usual approach: "Hey, death sucks, wanna sign up to get your head frozen so you'll have a chance at getting thawed in the future?"

There are plenty of ways to tell someone about cryonics that don't involve a direct plea for them to take action.

Comment author: dxu 04 December 2014 05:05:35PM 3 points

That's really not how most people make their decisions.

Maybe it's not how most people make their decisions, but I have seen a significant number of people, both online and in real life, who do reject cryonics on a firmly intellectual basis. I suppose you could argue that it's not their true rejection (in fact, it almost certainly isn't), but even so, that's evidence against heavy emotions playing a significant part in their decision process.

There are plenty of ways to tell someone about cryonics that don't involve a direct plea for them to take action.

Yes, but most of them still suffer from the "weirdness factor".