
Comment author: Vaniver 16 January 2017 10:43:57PM *  3 points

I'm the person who moved Flinter's post to drafts, suggesting that he resubmit it as a linkpost to Nash's talk and put his commentary in a comment instead of in the primary post.

It's not Nash's most significant work, and it is not the most important topic in the world; those sorts of claims are a major contributor to why I thought the post was bad.

(In case people are wondering whether I'm politically motivated: Hayek, whom Nash describes as thinking parallel thoughts, is my favorite political thinker. This is policing post quality, not content.)

Comment author: niceguyanon 17 January 2017 10:04:45AM 0 points

Is it possible to use moderation tools to hide the parent comment or move it? It doesn't even belong here, and others have been nice enough to offer good feedback regardless. This is a welcome thread, and it's being derailed by bizarre behavior.

Comment author: Flinter 16 January 2017 06:15:50PM 0 points

I don't think I should have done what I did to get my first two karma points. I suspect it degrades the quality of the site at a rate at which rationality can't inflate it. But I'll save my reasoning and the discussion of it for the moment. It seems I am now able to post my discussion on its own, so I did.

2x cheers.

Comment author: niceguyanon 16 January 2017 06:39:52PM 2 points

I suspect it degrades the quality of the site...

Your first paragraph venting your frustration at the 2-karma rule was unnecessary, but it's cool that you realized that.

I think this post is fine as an Open Thread comment or as an introduction post; I don't see why it needs its own discussion thread. Plus, it seems like you are writing an article stating that you will write an article. You don't need to do that. Just come right out and say what you have to say.

Comment author: niceguyanon 13 January 2017 09:50:33PM 0 points

I have the same question as the OP, and I didn't think any of the answers were helpful enough. Basically everything I could find regarding Assange's asylum with Ecuador stems from the threat of Sweden extraditing him to the U.S.; however, the threat of politically motivated extradition remains regardless of what happens in Sweden, since the U.K. can just as well do it.

Comment author: niceguyanon 13 January 2017 05:59:47PM 1 point

I don't know what to think about ego depletion. When I first read about it, it felt quite intuitive and the research on it seemed robust; it came up everywhere I read. Then the replication crisis happened and serious doubts were cast on it, and I updated towards a weaker effect.

I hadn't given it much thought since, until I was recently reminded of the study of mental fatigue in parole board judges, which found that the chances of granting parole were greatest at the beginning of the work day and right after a food break (when mental resources are replenished).

If ego depletion is weak at best, then what is going on with the parole study? My current epistemic status is that the effect is real and not debunked, but it may not be as universal as thought (good for predicting parole decisions, not so good for contrived cognitive experiments).

Comment author: waveman 12 January 2017 10:46:49PM *  8 points

And

Generally the better educated are more prone to irrational political opinions and political hysteria than the worse educated far from power. Why? In the field of political opinion they are more driven by fashion, a gang mentality, and the desire to pose about moral and political questions all of which exacerbate cognitive biases, encourage groupthink, and reduce accuracy. Those on average incomes are less likely to express political views to send signals; political views are much less important for signalling to one’s immediate in-group when you are on 20k a year. The former tend to see such questions in more general and abstract terms, and are more insulated from immediate worries about money. The latter tend to see such questions in more concrete and specific terms and ask ‘how does this affect me?’. The former live amid the emotional waves that ripple around powerful and tightly linked self-reinforcing networks. These waves rarely permeate the barrier around insiders and touch others.

Something for LWers to think about. Being smart can make you more susceptible to some biases.

Comment author: niceguyanon 13 January 2017 02:37:28PM *  8 points

Being smart can make you more susceptible to some biases.

Agreed, but Dominic is making a much stronger claim in this excerpt, and I wish he would provide more evidence. It is a big claim that

  • the better educated are more prone to irrational political opinions, and
  • those on average incomes are less likely to express political views to send signals.

These are great anecdotes, but have there been any studies indicating a link between social status and willingness to express political views?

Comment author: Lumifer 20 December 2016 08:19:27PM *  0 points

First, the question isn't whether nitpicking is good or bad. It is bad by definition, since the word carries negative connotations (the same activity with positive connotations would be called something like "careful and thorough detail-oriented assessment"). The question is whether nitpicking is important, and I haven't seen data or convincing arguments that it is.

Second, when you write "largely composed of annoyances" and "we should not be happy with an environment that rewards writing with serious flaws, but only annoys the best writers" you implicitly assume that most comments are nitpicks. There is no reason to make such an assumption (and where does "rewarding" come from, anyway?).

You seem to be ignoring important social norms

Which important social norms are those? And of which society?

Comment author: niceguyanon 11 January 2017 09:17:29PM 0 points

You have noticeably not been commenting. Care to comment why?

Comment author: gwern 09 January 2017 08:50:35PM *  21 points

So apparently the fundamental attribution error may not really exist: "The actor-observer asymmetry in attribution: a (surprising) meta-analysis", Malle 2006. Nor has Thinking, Fast and Slow held up too well under replication or evaluation (maybe half).

I am really discouraged about how the heuristics & biases literature has held up since ~2008. At this point, it seems like if it was written about in Cialdini's Influence, you can safely assume it's not real.

Comment author: niceguyanon 10 January 2017 04:44:30PM 7 points

At this point, it seems like if it was written about in Cialdini's Influence, you can safely assume it's not real.

How well have the ideas presented in Cialdini's book held up? I thought the scarcity heuristic, the physical attractiveness stereotype, and reciprocity were pretty solid and haven't come under scrutiny, at least not yet.

Comment author: chaosmage 10 January 2017 02:39:01PM 2 points

That's exactly my point. The information posted here is a reformulation of precisely the type of material found on Christian apologetics sites. It does not deserve to be in a place where you would encourage people to go to find truth.

Comment author: niceguyanon 10 January 2017 03:54:21PM 0 points

I understand your criticism much better now.

Comment author: NatashaRostova 10 January 2017 03:56:48AM 3 points

I think there are some serious issues with the methodology and instruments used to measure heuristics and biases, issues that researchers didn't fully understand even ten years ago.

Some cognitive biases are robust and well established, like the endowment effect. Then there are the weirder ones, like ego depletion. I think a fundamental challenge with biases is that clever researchers first notice them by observing other humans, as well as the way they themselves think, and then have to try to measure them formally. The endowment effect, or priming, maps pretty well onto a lab. On the other hand, ego depletion is hard to measure in a lab (in any sufficiently extendable way).

I think a lot of people experience, or think they experience, something like ego depletion. Maybe it's insufficiently described, or too broad a classification, or too hard to pin down. So the original researcher noticed it in their own experience and designed a contrived experiment to 'prove' it. Everyone agreed with it, not because the statistics were compelling or it was a great research design, but because they all experience, or think they experience, ego depletion.

Then someone tries to replicate it, and it doesn't replicate, because it's really hard to measure robustly. I think ego depletion doesn't work well in a lab, or without some sort of control or intervention, but those are hard things to set up for such a broad and expansive claim. And I guess you could build a survey, but that sucks too.

As for the fundamental attribution error, I think that meta-analysis is great, in that it shows that these studies suck statistically. They only work if you come to them with the strong prior that "Hey, this seems like something I do to other people, and in the fake examples of attribution error I can think of lots of scenarios where I have done that." Of course, our memory sucks, so that is a questionable prior, but how questionable is it?

In the end I don't know if it's real, or only real for some people, or too generalized to be meaningful, or true in some situations but not others, or how other people's brains work. Probably the original thesis was too nice and tidy: here is a bias, here is the effect size. Maybe the reality is: here is a name for a ton of strange correlated tiny biases, which together we classify as 'fundamental attribution', but which is incredibly challenging to measure statistically over a sample population in a contrived setting, since the best evidence for it seems inextricably tangled up in the recesses of our brains.

(also most heuristics and biases probably do suck, and lack of replication shows the authors were charlatans)

Comment author: niceguyanon 10 January 2017 03:42:27PM 1 point

The endowment effect, or priming, maps pretty well to a lab.

Are you saying that cognitive biases like the endowment effect and priming map better onto lab settings, and therefore are less reliant on contrived experiments to prove them than something like ego depletion is?

I don't know whether these map well onto a lab, but priming research is one of the major areas undergoing a replication crisis; I'm not sure about the endowment effect.

Comment author: chaosmage 08 January 2017 11:01:16PM 4 points

Since downvoting is disabled, I'll criticize you instead.

You're presenting the classic anti-cult narrative that has been repeated since the eighties and is available on the web in thousands of places. In fact, I would not be surprised if it turned out you copied and pasted much of this. It has no obvious relevance to LessWrong, and your attempt to restate this outdated narrative in LW lingo does not change that.

A few more substantial criticisms: Jonestown, your only actual example, has always been the extreme exception (in modern times), the 9/11 of cults. There are a few other, much smaller examples of cult violence, but most cults are very different from that and much less extreme than you describe. They are really mostly a waste of time that people stay in because of the sunk cost fallacy. Since the narrative you copied was created, the number of cults has gone down noticeably and their members' average age has gone up. The ones that remain perpetuate themselves mostly by having children rather than by "brainwashing" new members, much like other religions do. And leaving is generally easy, except if you have other family members inside.

Comment author: niceguyanon 09 January 2017 03:58:05PM 4 points

Is your objection that the topic has no relevance to LW, or that it has no relevance because the information is found in so many other places?

I appreciate summaries on LW even if the material is found elsewhere, because they invite comments and discussion from a very particular group whose input I prioritize (over that of other internet strangers). I often do a quick search on LW for new ideas I am exposed to, to get the LW spin. Say you just discovered this forum and decided you like how everyone aspires to be a rationalist, but you have gaps in your knowledge about cults; this article might be far more informative than what you can find with a Google search. A Google search on cults leads to lots of Christian apologetics websites, not exactly the places I would encourage people to go to find truth. The information can be found in thousands of places, but the place matters: a rationality-oriented forum versus a website whose motives you are not quite sure of.
