Several weeks ago I wrote a heavily upvoted post called Don't Be Afraid of Asking Personally Important Questions on LessWrong. I thought it would only be due diligence to also track cases where advice received on LessWrong has backfired. In other words, to keep the record unbiased, we should notice what LessWrong as a community is bad at giving advice about. So, I'm seeking feedback: if you have anecdotes or data about a plan or advice taken directly from LessWrong that backfired, failed, or didn't lead to satisfaction, please share below.

Bringing up EY/LW in a positive way in unrelated online discussions got me labeled a weirdo once or twice. I recall having to leave one forum because of the hostility. I am tempted to say that this was for the best, but it could be just the sour grapes fallacy.

FiftyTwo (9y, 4 points)
Yeah, I've had people complain about the standard basilisk and weird AI speculation stuff. Also the association with neoreactionaries, sexists and HBD people.
[anonymous] (9y, 0 points)
Sometimes you get the opposite - LW seen as an SJW forum because Scott Alexander is okay with referring to his partner as ze/zir/zur on his blog, and if you are not American, or are over 40, or at any rate did not go to a US college in the last 15 years, this comes across as weird. I remember that even on Reddit as late as 2009 the "in" "progressive" thing was to hate Bush, not to understand something about transgenderism or feminism or whatever, so I would say it is a very recent thing in mainstream circles.
Kawoomba (9y, 4 points)
Whatever you do, don't mention the Contract.
Punoxysm (9y, 2 points)
What is this?
Kawoomba (9y, 8 points)
Nothing. Absolutely nothing.
Metus (9y, 2 points)
A joke.
buybuydandavis (9y, 3 points)
"An infidel in our midst! Burn him! BURN!" For LW? How wacky. Harry Browne wrote a pretty good book, "How I Found Freedom in an Unfree World". A couple of themes I took away was "sell to your market", and "it pays to advertise". You want to attract people, and invest your time in people, who actually are your market, and so that your investment has a good chance of paying off over time. That forum aint it, and chasing you out over LW doesn't say flattering things about them. There are always special cases, but in general, a forum with such hostility for what you are is not a good place for you to get attached to.
Error (9y, 1 point)
Was the hostility because of the weirdo-ness, or something else, if you don't mind me asking? Seems to me that if strangeness on its own creates that sort of response, you're probably better off elsewhere...

Eliezer's writing, fiction and non-fiction alike, tends to attract hostility, and all LWers are automatically labeled "Yudkowskians". On a somewhat related note, the idea of AGI x-risk that he has been pushing for years has finally gone mainstream, yet the high-profile people who speak out about it avoid mentioning him, as if he were low-status or something.

Eliezer seems to be really really bad at acquiring or maintaining status. I don't know how aware of this fault he is, since part of the problem is that he consistently communicates as if he's super high status.

Eliezer is kind of a massive dork who also has an unabashedly high opinion of himself and his ideas. So people see him as a low-status person acting as if he is high-status, which is a pattern that for whatever reason inspires hatred in people. LessWrong people don't feel this way, because to us he is a high-status person acting as if he is high-status, which is perfectly fine.

Also, one thing he does that I think works against him is how defensive he gets when facing criticism. On Reddit, he occasionally writes long rants about how he is an unfair target of hate and misrepresentation when someone brings up Roko's basilisk. Which may be true, but feeling the need to defend yourself to such an extent is very low-status behavior. Just the other day I saw him post on Facebook a news story which portrayed the secular solstice in a positive light, with the caption "FINALLY SOME HONEST JOURNALISM!!!!!" or something like that. This is just not a good look. I wonder if he could hire an image consultant or PR person; it seems like that could make FAI more likely.

For some reason this reminds me of a scene from Game of Thrones, where one person says "knowledge is power", and the other person responds by threatening their life and then saying "no, power is power". (Unspecific to avoid spoilers.)

The point is, some kinds of power depend on context, and some don't. Generally, respecting people for their intellectual or artistic skills is context-dependent. You don't get status by being good at maths among people who consider maths low status. You don't get status for writing good fan fiction among people who consider fan fiction low status. You don't get status for being able to debate rationality among people who consider rational debating low status. -- More universal sources of status are money and the ability to harm people, because almost everyone is afraid of harm, and almost everyone needs money.

When dealing with journalists, it is useful to realize that journalists have this kind of destructive power. Dealing with a journalist is like meeting a thug in a dark street: you don't want to make him angry. If you get out alive, you should consider it a success, and not complain about small inconveniences. In the long term, if you li…

Yes, but I don't think the negative press LessWrong receives is simply because journalists are fickle creatures. I think there is something inherent to the culture that turns outsiders off.

My guess is that Eliezer, MIRI, and LWers in general are strange people who believe strange things, and yet they (we) are pretty confident that they are right and everyone else is wrong. Not only that, but they believe that the future of humanity is in their hands. So at best, they're delusional. At worst, they're right... which is absolutely terrifying.

Also, like I said, Eliezer is a big dork who, for example, openly talks about reading My Little Pony fanfiction. The idea that such a goober claims to be in charge of humanity's destiny is off-putting for the same reason. I wonder if to most people Eliezer pattern-matches better to "weird internet celebrity", kind of an Amazing Atheist figure, than to "respectable intellectual" the way e.g. Nick Bostrom might. We can see in presidential elections that Americans don't trust someone who isn't charismatic, tall, in good shape, etc. to run the country. So, of course, the average person will not trust someone who lacks those …

Now I feel like every group that tries to do something faces a trilemma:

1) Deny your weaknesses. Leads to irrationality.

2) Admit your weaknesses. Leads to low status, and then opposition from outsiders.

3) Deny your weaknesses publicly; admit them only among trusted members. Leads to cultishness.

Kaj_Sotala (9y, 7 points)
I wonder: with individuals, it feels like honestly and directly admitting your weaknesses, while giving the impression that they're not anything you're trying to hide, can actually increase your status. Having weaknesses yet being comfortable with them signals that you believe you have strength that compensates for those weaknesses, plus having flaws makes you more relatable. Could that also work for groups? I guess the biggest problem would be that with groups, it's harder to present a unified front: even when a single person smoothly and honestly admits the flaw, another gets all defensive.
Princess_Stargirl (9y, 4 points)
I don't think this strategy works well for individuals, though maybe we are thinking of different reference sets. To me, the way to understand social interactions is to look at what politicians do - or, if one only cares about a more intelligent set of humans, executives at companies. People may hate politicians/executives, but they are provably good at succeeding socially. Are politicians/executives big on admitting weakness? I don't think so. They seem much more fond of either blatantly lying (and betting that their supporters will defend them) or making only the weakest possible admissions of weakness/guilt ("mistakes were made"). Of course, acting like a politician is usually pretty terrible for all sorts of reasons. But it's probably the "playing to win" action socially.
Lumifer (9y, 4 points)
In real life these choices are neither exclusive nor binary. A group might well admit the weakness in internal meetings and PR-manage the exposure of that weakness to the outside without either fully denying it or doing the whole sackcloth-and-ashes bit.
gothgirl420666 (9y, 2 points)
Great point, I didn't think of it that way.
Viliam_Bur (9y, 2 points)
Thanks! On the other hand, lest I prove too much, each of these ways can work:

1) Irrationality does not have to be fatal. Dilbert makes a living complaining about the irrationality of companies, and yet those companies make billions in profit.

2) Open-source software exposes all its bugs, and still many open-source projects are respected. (Although this may be because the exposed weakness is incomprehensible to most people, so on the social level it is as if they exposed nothing.)

3) Most organizations have people with privileged access to information, and don't expose everything to the public. Most organizations have a clear boundary between a non-member and a member, between a non-manager and a manager. People don't question this, because it's business as usual.

So probably the problem here is that LessWrong is not an organization, and that LessWrong is somehow not sufficiently separated from MIRI. Which feels ironic, because I am on LessWrong every day and I mostly don't know what people in MIRI are working on now, so the separation clearly exists from my view; but it may not exist from an outsider's view, for whom simply LessWrong = Eliezer and MIRI = Eliezer (so if Eliezer said something low status on LessWrong, it automatically means MIRI is low status).

So my conclusion is that compartmentalization has an important role, and Eliezer failed to do it properly. In real life, we usually don't have much data about the leaders of high-status organizations. From the outside they seem like boring people who only do their work and that's all they ever do. (Think about what it did for Bill Clinton's career when the details of his sex life became public.) I understand the desire to be influential and to be free to expose whatever you want about yourself, but it probably doesn't work this way. By exposing too much, you limit your status. Powerful people do not enjoy freedom of speech in the same way popular bloggers do. Eliezer went the popular blogger way. Now we ne…
Lumifer (9y, 2 points)
It certainly doesn't work that way, but I think it's not just about status. If you want to be influential (aka have power, which is different from just being high-status), you should be instrumentally rational about it, that is, evaluate whether the consequences of your actions serve your goals. In this particular case, you need to carefully manage your public persona, the image you present to the outside. This careful management is not very compatible with exposing "whatever you want about yourself". This is actually a problem, in that it's a serious disincentive for good people to get involved in high-level politics. Would you want a team of smart lawyers and investigators to go over your visible life with a fine-toothed comb, looking with malice for any kind of dirt they can fling at you?
hawkice (9y, 0 points)
So, obviously that list isn't exhaustive, because there are more ways to split interactions than public/private, but in an attempt to add a meaningful new outlook: 4) Speak about your weaknesses openly when in public, and deny them in private. Many high-status individuals are much harsher, more demanding, arrogant, and certain in private than in public. I think this is a result of -- when you don't know the target well -- not knowing whom you will have to impress, whom you have to suck up to, and who is only useful when they get you the thing you want.
Sarunas (9y, 0 points)
That sounds similar to the standard job interview question "What is your greatest weakness?". In that situation, perhaps the standard advice on how to answer - emphasize how you intend to overcome that weakness and what weaknesses you have conquered in the past - is applicable here as well? Edit: Although perhaps you meant that the very act of letting outsiders define what is and what is not a weakness leads to low status.
Princess_Stargirl (9y, 2 points)
It is suicidal to admit an actual serious weakness, for multiple reasons. One is that admitting a serious weakness leaves a very bad impression that is hard to overcome. See the research showing that people will frequently pay more for a single intact set of objects than for two sets of the same objects where one set is damaged. The other problem is that admitting an actual error is going off the social script. It either paints you as clueless or as a "weirdo". This is also a very serious problem.
jsteinhardt (9y, 0 points)
I don't think this is right. I talk pretty publicly about whatever problems/insecurities I have, but I do so in a pretty self-confident manner. It may help that I'm visibly competent at what I do, and I don't claim that it is a universally good strategy, but it works for me and helps me stay in a fairly constant state of growth mindset, which I've found to be beneficial.
Viliam_Bur (9y, 0 points)
In the job interview, you are explicitly given the task of describing your weakness. And you probably choose one that is relatively harmless - something like "I am very rational, but sometimes I am underconfident". So that's different.

I remember reading an article on Overcoming Bias long ago which predicted exactly this - in general, not just about AGI. In many areas, the first people to go there are those who ignore social conventions (otherwise they wouldn't be first). But when the area becomes successful, a second wave of people arrives, following a safe path to success. The people from the second wave usually don't credit the people from the first wave, so the public perceives the second wave as the founders.

Eliezer did say and write many things. Some of them are now perceived as low status, some as high status. The safe road to success is to repeat only the high status things, and to never mention Eliezer. (Plus do some other high status things unrelated to Eliezer.)

"Even When Contrarians Win, They Lose" http://www.overcomingbias.com/2007/09/even-when-contr.html

This is not just a plausible story – I have personally known people where similar stories have played out, and have read about others. It has happened to varying degrees with Ted Nelson, Eric Drexler, Douglas Engelbart, Doug Lenat, David Deutsch, Alfred Russel Wallace, Hugh Everett, and, yes, me.

seez (9y, 0 points)
Can you provide a link to the article, if you remember it?

I once asked for advice here, and the responses felt overly demeaning and presumptuous; they largely ignored trying to help in favor of lambasting me for being in the situation at all. It was not a response I had been expecting, and it made me feel bad and less likely to ask somewhat personal questions in the future. I don't think anyone replying intended to cause me any harm, and it wasn't a big deal in any sense of the word. But I felt disappointed with the outcome and the community.

I'm sure anyone sufficiently interested could find this in my post history, but the details aren't particularly interesting. To a third party it probably won't seem like much at all, but at the time it wasn't a good feeling.

I've seen at least one or two occurrences like this on LW. There is a very cold and, well, rational tone to the responses here. Overall, I think it's good, since there are plenty of other forums for people to go to for encouragement. But if this is your go-to forum for life advice, and you are going through something difficult and personal that you decide to share, the responses might not give you a warm fuzzy feeling.

[anonymous] (9y, 13 points)

Rationality is about winning. If your goal is to give people advice that they will accept, imbuing your message with hopefulness and cheer will assist you in that goal. If your goal is to get people to continue asking for advice, slapping a "Working as intended" on their complaints about your advice-giving technique is an abysmal failure.

Using terms that I picked up here which are not well known, or which mean different things in different contexts.

Also, I sometimes over-pattern-match arguments and concepts I've picked up on LessWrong to other situations, which can result in trying to condescendingly explain something irrelevant.

Smaug123 (9y, 5 points)
I do something similar. I consistently massively underestimate the inferential gaps when I'm talking about these things, and end up spending half an hour talking about tangential stuff the Sequences explain better and faster.
MathiasZaman (9y, 5 points)
Since I mostly communicate in Dutch when in meatspace, I find myself rarely using terms directly from Less Wrong (because good translations don't always come to mind). Of course, this isn't exactly a lifehack, since you wouldn't expect most people to move to a different language zone for a minor benefit.
Gunnar_Zarncke (9y, 0 points)
Same with me. Except pointing to LW or the sequences doesn't help.

Well, I've found that advice about time management, of which this site has tons, is not really helpful. It is not the lack of a system to organize my efforts but a lack of persistence that has always been the bottleneck for me.

Gunnar_Zarncke (9y, 4 points)
I think motivation is a hard problem and there is no simple solution. As with much self-help advice, trying out multiple approaches may ultimately lead to one that works for you.
passive_fist (9y, 3 points)
I tried the Pomodoro system for a bit (which I understand is somewhat popular here) but I found it to be largely useless, for myself at least. Instead, just removing various distractions was far more powerful. This is corroborated by the literature; Gloria Mark's research is worth a look: http://www.ics.uci.edu/~gmark/Home_page/Welcome.html
Cyan (9y, 4 points)
Someone who really cared about time management wouldn't be reading this site in the first place.
passive_fist (9y, 2 points)
As far as internet distractions go, LessWrong is hardly the worst offender. Although more than 10 minutes a day on LW is probably too much.
hamnox (9y, 2 points)
Same, actually. Pomodoros have never stuck as a solution for me.

I probably spend more time on LW than is warranted, but otherwise I don't have a particular story of backfired advice that comes to mind.

Agreed. The biggest way LW has backfired is eating up free time that provides a very questionable ROI. I've spent quite a bit of time procrastinating on here, and the amount of actionable advice I've put into practice is quite low.

The advice I have put into practice usually works pretty well, but that's mostly a function of me realizing that it slots in well to my existing habits/ways of thinking.

Same, though in my case it is likely that that time was mostly funging against other internet time-wasters, not my work time.

Describing myself as a "rationalist" pretty much automatically makes a bad impression, no matter how much I explain afterwards that I value emotion and passion and humanity and am totally not a Straw Vulcan or an Objectivist.

[anonymous] (9y, 0 points)
"Aspiring rationalist" or "" could be a less negative alternative.

I can recall one instance of bad advice on a particular subject (I don't want to be specific). In retrospect it should have been obvious that the person giving the advice lacked the experience to give it, but it's hard to judge someone's credentials over the internet.

Some of the media recommendations have been bad; of course no recommendation is perfect, but in my limited experience LW's strike rate is worse than e.g. TV Tropes' (which may just be a function of the latter containing a lot more detail and having more contributors).

Nornagest (9y, 9 points)
Back when I browsed TV Tropes regularly, my algorithm for using it to find media that I liked centered around skimming a lot of media pages that looked vaguely interesting and using them to get a better idea of themes and target audience, while throwing out anything that was full of creepy fanservice tropes or obviously written by a single very enthusiastic viewer. When I tried mining it for actual recommendations, they were usually bad. LW doesn't have anything that lends itself to that sort of exploratory search, but recommendations from the media threads have been somewhat reliable for me, probably thanks to a closer demographic match. Better coverage in certain topic areas, too: we seem to have a greater proportion of literary SF readers posting here, for example.
Richard_Kennaway (9y, 3 points)
Of the anime recommendations I've followed up on account of their claimed rationality content, I've yet to find one that repaid the effort.
Jayson_Virissimo (9y, 0 points)
Had better luck anywhere else?
Richard_Kennaway (9y, 2 points)
The rationality content was my only interest, so I haven't particularly looked for any other source of anime recommendations. However, I have seen Princess Mononoke, Howl's Moving Castle, and Spirited Away, and all I can say of them is that they were pleasant enough.
Vulture (9y, 1 point)
Is it possible that some of the reported "rationality content" was more like genre-savviness which is more visible to people who are very familiar with the genre in question?
Richard_Kennaway (9y, 3 points)
I think it was more a case of people looking at the works with the hammer of rationality in their hand and seeing lots of nails for the characters to knock in. For example, The Melancholy of Haruhi Suzumiya sets up a problem (Unehuv vf Tbq naq perngrq gur jbeyq 3 lrnef ntb ohg qbrfa'g ernyvfr vg, naq vs fur rire qbrf gura fur zvtug haperngr vg whfg nf rnfvyl), but I found that setup fading into the background as the series of DVDs that I watched went on. By the fourth in the series (the murder mystery on the island isolated by storms), it was completely absent. With Fate/Stay Night, one problem is that I was looking at ripped videos on Youtube, while the original material is a "visual novel" with branching paths, so it's possible (but unlikely) that the people who put up the videos missed all the rationality-relevant bits. I've not tried Death Note, but I suspect I'd find the same dynamic as in Haruhi Suzumiya. A hard problem is set up (how does a detective track down someone who can remotely kill anyone in the world just by knowing their name?), which makes it possible to read it as a rationality story, but unless the characters are actually being conspicuously rational beyond the usual standards of fiction, that won't be enough. I'm also not part of the anime/manga community: I watched these works without any context beyond the mentions on LessWrong and a general awareness of what anime and manga are. It's weird how the girls all look like cosplay characters. :)
Desrtopa (9y, 3 points)
I haven't watched the anime, but I have read the visual novel, and the anime does not have a reputation for being a very faithful adaptation. The visual novel at least does share themes that often feature in Eliezer's work, but I wouldn't call them "rationality content" as such. More in the manner of Heroic Responsibility and related concepts.
Vulture (9y, 1 point)
In terms of Death Note, I've read the first several volumes and can vouch that it's a fun, "cerebral" mystery/thriller, especially if you like people being ludicrously competent at each other, having conversations with multiple levels of hidden meaning, etc. Can't say there's anything super rational about it, but the aesthetic is certainly there.
Desrtopa (9y, 3 points)
Actually, I for one gave up on Death Note in frustration very early on, because I couldn't help focusing on how much of the real inferential work was being done by the authors feeding the correct answers to the characters. Like when L concludes that Kira must know the victim's real name to kill him... there were so many reasons that just didn't work. Kira's apparent modus operandi was to kill criminals; there was no particular reason to suppose he would respond to a challenge to kill anyone else, so the fact that he didn't was already weak evidence regarding whether he could at all, let alone what the restrictions might be. Whether Kira knew his real name or not was just one variable switched between him and Lind L. Taylor. L could just as easily have been immune because he eats too many sweets. While smart, knowledgeable people can often extract a greater yield of inference from a limited amount of data than others, I find that far too many writers take this idea and run with it, while forgetting that intelligence very often means recognizing how much you can't get out of a limited amount of data.

I can't think of anything on which LW advice backfired, but there are some points on which LW is about neutral. I think it is valuable to list these neutral data points too.

I find myself arguing over the quality and content of LessWrong posts with friends, one close friend in particular. He questions the aspirations and qualifications of LW in general, and of some posts/authors we were discussing in particular. And I find myself at least partly agreeing with his assessments - not because of his rhetoric or my wish for consensus, but because his case is convincing.

For conte…

Arkanj3l (9y, 0 points)
Any LW-concept-specific critiques applicable to everyone else?

According to my parents, certain behaviors are immoral if you can explain why you're doing them.

Overreacting to a parent listening in on my phone call, or using physical coercion (not hitting me, just grabbing me and blocking my movements), when they claim good intentions? Teenage hormones.

Stating that I have a precommitment to react negatively to people who wiretap me or use force on me, even when it's costly for me to do so? Morally wrong.

[Yes, I realize that the actual moral here is "Don't tell people you understand the concept of precommitments, just pretend to be an irrational actor". This isn't an example of advice being wrong, just an example of advice needing to be clarified.]

Lumifer (9y, 6 points)
Well, I would read the actual moral as "Parents are likely to phrase their arguments in terms of morality if it suits their purpose, even if it isn't actually their morality".
ilzolende (9y, 0 points)
I think we're using definitions differently here: I was using "moral" to mean "lesson for the reader based on what the main character wishes she had done". Also, parents in this instance react to events based on their stated moral system, not on their actual moral system. However, that is the sort of assumption I already make about my parents' statements about morality whenever those statements are suspiciously specific and applicable to a current argument that they would like to win.
Lumifer (9y, 0 points)
So was I :-)
Arran_Stirton (9y, 5 points)
Are you sure precommitment is a useful strategy here? Generally, the use of precommitments is only worthwhile when the other actors behave in a rational manner (in the strictly economic sense), consider your precommitment credible, and are not willing to pay the cost of you following through on it. While I'm in no position to comment on how rational your parents are, it's likely that the cost of you being upset with them is a price they're willing to pay for what they may conceptualize as "keeping you safe", "good parenting", or whatever their claimed good intentions were. As a result, no amount of precommitment will let you win that situation, and we all know that rationalists should win. The optimal solution is probably the one where your parents no longer feel that they should listen to your phone calls or use physical coercion in the first place. I couldn't say exactly how you go about achieving this without knowing more about your parents' intentions. However, you should be able to figure out what their goal was and explain to them how they can achieve it without using force or eavesdropping on you.
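A minimal sketch of that decision logic in Python (the function, parameter names, and numbers below are hypothetical, purely for illustration; this is one way to model the comment's three conditions, not anyone's actual method):

```python
# Sketch: when does making a precommitment (a threat) leave you better off?
# Assumptions: a single opponent and invented illustrative costs/gains.

def precommitment_pays_off(credible: bool,
                           opponent_rational: bool,
                           opponent_cost_if_triggered: float,
                           opponent_gain_from_defecting: float,
                           my_cost_of_following_through: float) -> bool:
    """True if the precommitment is worth making.

    It deters only if the opponent (1) behaves rationally, (2) believes
    the commitment, and (3) loses more from triggering it than they gain
    by defecting anyway.
    """
    deterred = (opponent_rational and credible
                and opponent_cost_if_triggered > opponent_gain_from_defecting)
    if deterred:
        return True  # the threat never has to be executed
    # Not deterred: the opponent defects anyway and you pay to follow through.
    return my_cost_of_following_through <= 0

# The parent/teenager case as described above: the parents' perceived gain
# ("keeping you safe") outweighs the cost of your negative reaction,
# so the precommitment fails to deter and you still pay to carry it out.
print(precommitment_pays_off(credible=True,
                             opponent_rational=True,
                             opponent_cost_if_triggered=1.0,
                             opponent_gain_from_defecting=5.0,
                             my_cost_of_following_through=2.0))  # False
```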
Viliam_Bur (9y, 0 points)
I think this is a usual intuition. It seems wrong to me, but I don't know exactly how to fix it. I am similarly frustrated by moral intuitions which follow this pattern:

(1) Imagine that you see a drowning person, and you are a good swimmer. Is it your moral duty to save them? Yes, it is.

(2) Now imagine that you see a drowning person, but you absolutely can't swim. Is it your moral duty to try to save them? No, it isn't; you would probably just kill yourself and achieve nothing.

(3) There is no urgent situation. You just have a choice between learning to swim and e.g. spending your time watching anime. Is it your moral duty to learn to swim? Uhm... no, it isn't. Why would it be?

So, in other words, there are obstacles which can absolve you from a moral duty, but you don't have a moral duty to remove those obstacles.

Actually, your situation seems a bit similar to this pattern. Being irrational and making "precommitments" by instinct absolves you morally. If you become rational and good at introspection, learn game theory, and understand your motives, then you supposedly have a moral duty (to avoid acting on these instinctive "precommitments" without replacing them with conscious ones). However, no one supposedly has a moral duty to become more rational and introspective.

It seems like one part of the problem is skills which are not under your control in the short term, but are under your control in the long term (being good at swimming, being rational and introspective). Our intuition is too quick to classify them as immutable, because in the short-term scenario, they are. So these skills give you moral duties, but you get no moral reward for developing them.
[anonymous] (9y, 4 points)

Not really advice, but I started talking about feminism here and immediately dropped in karma. The people arguing against me produced unbacked assertions contrary to my points, without doing a modicum of research. My responses took one to two hours of research each.

If you care about the answer to a question, and not just about feeling happy because you think you're right, you should do the research on your own. I spend a lot of time arguing against atheists on /r/DebateReligion, and I have to do the research for them. (Guess what: 2,000 years of people practicing a religi…