
Comment author: Luke_A_Somers 20 March 2017 09:49:08PM 2 points

A) The audit notion ties into keeping our feedback cycles nice and tight, which we all like here.

B) This would be a little more interesting if he had linked to his advance predictions on the war, so we could compare how he did. And of course if he had posted a bunch of other predictions, so we could see how he did on those (to avoid cherry-picking). That would rule out rear-view-mirror effects.

Comment author: Benquo 23 March 2017 05:48:00PM 0 points

Strong agreement on (B).

Comment author: Benquo 20 March 2017 08:39:38PM 0 points

Good ideas do not need lots of lies told about them in order to gain public acceptance.

There's one construction of this which is obviously false: lies being told in support of X doesn't inherently discredit X, because often there are also lies being told supporting not-X, and X and not-X can't both be false. But in the stock options example, Davies is pointing to something more specific: a principled argument for lies, on the grounds that they are necessary to support the desirable policy. He generalizes this somewhat in applying it to the Iraq war: when you find people explaining away the misleading statements of the principal advocates for an action or proposition as just part of a sales pitch, you should suspect that the lies are in fact central to the case, and not just accidental.

Fibbers’ forecasts are worthless.

This is a pretty radical claim. It makes the most sense in conjunction with the last point, about audits. In the absence of any force holding people to account, once they've shown themselves willing to mislead at all, we should expect them to lie quite a lot. But in practice there are varying levels of audit, and I'm not sure what cognitive simplifications to use.
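One way to make "worthless" precise, as a toy illustration (my gloss, not anything from the post): if a forecaster picks their report for persuasive effect regardless of the truth, the report is equally likely under X and under not-X, so the likelihood ratio is 1 and Bayes' rule leaves the prior untouched. A minimal sketch in Python:

```python
def posterior(prior, p_report_if_true, p_report_if_false):
    """Posterior probability of X after hearing the report, by Bayes' rule."""
    joint_true = p_report_if_true * prior
    return joint_true / (joint_true + p_report_if_false * (1 - prior))

# An honest signal moves the estimate; a pure sales pitch does not.
print(posterior(0.3, 0.9, 0.2))  # informative report: ~0.66
print(posterior(0.3, 0.9, 0.9))  # report made regardless of truth: 0.30
```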

Comment author: Lumifer 20 March 2017 03:20:41PM 1 point

What interesting ideas do you find here? This looks like ranty of-course-it's-clear-in-the-rearview-mirror "wisdom" to me.

Comment author: Benquo 20 March 2017 07:44:53PM 0 points

Putting zero weight on the estimates of people or institutions with a track record of misrepresentations seems obvious but also really hard to do, so it's interesting to see what sort of person can do it anyway, despite substantial social momentum on the other side. Overall, this seems like an extension of the recent Slate Star Codex post about lying on the internet. If lying is cheap and effective, then this level of caution is entirely appropriate.

To give a decision-relevant example, I think this sort of attitude would have long since given up on something like EA as mostly worthless. Is that excessively skeptical?

Comment author: Benquo 19 March 2017 06:54:31PM 2 points

Good ideas do not need lots of lies told about them in order to gain public acceptance. I was first made aware of this during an accounting class. We were discussing the subject of accounting for stock options at technology companies. There was a live debate on this subject at the time. One side (mainly technology companies and their lobbyists) held that stock option grants should not be treated as an expense on public policy grounds; treating them as an expense would discourage companies from granting them, and stock options were a vital compensation tool that incentivised performance, rewarded dynamism and innovation and created vast amounts of value for America and the world. The other side (mainly people like Warren Buffett) held that stock options looked awfully like a massive blag carried out by management at the expense of shareholders, and that the proper place to record such blags was the P&L account.

Our lecturer, in summing up the debate, made the not unreasonable point that if stock options really were a fantastic tool which unleashed the creative power in every employee, everyone would want to expense as many of them as possible, the better to boast about how innovative, empowered and fantastic they were. Since the tech companies’ point of view appeared to be that if they were ever forced to account honestly for their option grants, they would quickly stop making them, this offered decent prima facie evidence that they weren’t, really, all that fantastic. [...]

Fibbers’ forecasts are worthless. Case after miserable case after bloody case we went through, I tell you, all of which had this moral. Not only that people who want a project will tend to make inaccurate projections about the possible outcomes of that project, but that attempts to “shade” downward a fundamentally dishonest set of predictions are futile. If you have doubts about the integrity of a forecaster, you can’t use their forecasts at all. Not even as a “starting point”. By the way, I would just love to get hold of a few of the quantitative numbers from documents prepared to support the war and give them a quick run through Benford’s Law.

Application to Iraq. This was how I decided that it was worth staking a bit of credibility on the strong claim that absolutely no material WMD capacity would be found, rather than “some” or “some but not enough to justify a war” or even “some derisory but not immaterial capacity, like a few mobile biological weapons labs”. My reasoning was that Powell, Bush, Straw, etc, were clearly making false claims and therefore ought to be discounted completely, and that there were actually very few people who knew a bit about Iraq but were not fatally compromised in this manner who were making the WMD claim. [...]

The Vital Importance of Audit. Emphasised over and over again. Brealey and Myers has a section on this, in which they remind callow students that, like backing up one’s computer files, this is a lesson that everyone seems to have to learn the hard way. Basically, it’s been shown time and again that companies which do not audit completed projects in order to see how accurate the original projections were, tend to get exactly the forecasts and projects that they deserve. Companies which have a culture where there are no consequences for making dishonest forecasts, get the projects they deserve. Companies which allocate blank cheques to management teams with a proven record of failure and mendacity, get what they deserve.

[...] The raspberry road that led to Abu Ghraib was paved with bland assumptions that people who had repeatedly proved their untrustworthiness, could be trusted. There is much made by people who long for the days of their fourth form debating society about the fallacy of “argumentum ad hominem”. There is, as I have mentioned in the past, no fancy Latin term for the fallacy of “giving known liars the benefit of the doubt”, but it is in my view a much greater source of avoidable error in the world. Audit is meant to protect us from this, which is why audit is so important.
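The Benford's Law aside a few paragraphs up is concrete enough to sketch. Here is a minimal, hypothetical Python version of that kind of check, comparing the leading digits of a batch of reported figures against the Benford frequencies P(d) = log10(1 + 1/d); the figures list below is invented for illustration, not drawn from any actual document.

```python
import math
from collections import Counter

# Benford's Law: the leading digit d of many naturally occurring
# collections of figures appears with frequency log10(1 + 1/d).
BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def leading_digit(x):
    """Return the first nonzero digit of a positive number."""
    return int(str(abs(x)).lstrip("0.")[0])

def benford_distance(figures):
    """Chi-squared-style distance between observed leading-digit
    frequencies and the Benford distribution; larger is more suspicious."""
    counts = Counter(leading_digit(x) for x in figures)
    n = len(figures)
    return sum((counts.get(d, 0) / n - p) ** 2 / p for d, p in BENFORD.items())

# Hypothetical usage: honestly measured figures tend to track Benford's
# Law; invented ones often do not.
figures = [1234, 2400, 187, 9120, 1050, 3300, 145, 2780]
print(benford_distance(figures))
```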

[Link] The D-Squared Digest One Minute MBA – Avoiding Projects Pursued By Morons 101

1 Benquo 19 March 2017 06:48PM
In response to Threat erosion
Comment author: Benquo 16 March 2017 04:31:53AM 1 point

For instance, there is a line in the sandwich shop. From a perspective so naive to our ubiquitous norms that it is hard to imagine, you might wonder why the person standing at the back does so, when the shopkeeper is much more likely to get sandwiches for people at the front. The reason of course is that if he were to position himself in the ample physical space between the person at the front and the shopkeeper, there would be some kind of uproar. Not only would the person at the front be angry, but everyone in the line would back them up, and the shopkeeper probably wouldn’t even grant a sandwich to the line-jumper.

[...]

If our norms work well enough, we might go for years all peacefully standing in line, without anyone ever trying to push in at the front, because why would they?

An upshot is that if serious norm violations are rare, people might become pragmatically ill-equipped to respond to them. They might forget how to, or they might stop having the right resources to do so, physical or institutional. Or if generations are passing with no violations, the new generation might just fail to ever learn that they are meant to respond to violations, or learn what that would look like, since they never observe it. And maybe nobody notices any of this until norms are being violated and they find they have no response.

For instance, suppose that occasionally people sort of wander toward the front of the line in ambiguous circumstances, hoping to evade punishment by feigning innocent confusion. And those in the line always loudly point out the ‘error’ and the room scowls and the person is virtually always scared into getting in line. But one day someone just blatantly walks up to the front of the line. People point out the ‘error’ but the person says it is not an error: they are skipping the line.

The people in the line have never seen this. They only have experience quietly mentioning that they observe a possible norm violation, because that has always been plenty threatening. Everyone has become so used to believing that there is terrifying weaponry ready to be pulled out if there really were a real norm violation, that nobody has any experience pulling it out.

And perhaps it has been so long since anyone did pull it out that the specific weapons they stashed away for this wouldn’t even work any more. Maybe the threat used to be that everyone watching would gossip to others in the town about how bad you were. But now in a modern sandwich shop in a large city, that isn’t even a threat.

The world is full of sufficiently different people that in the real world, maybe someone would just punch you in the face. But it seems easy to imagine a case where nobody does anything. Where they haven't been in this situation for so long, they can't remember whether there is another clause in their shared behavior pattern saying that if you punch someone because they got in line in front of you at the sandwich shop, you should be punished too.

[Link] Threat erosion

1 Benquo 16 March 2017 04:30AM
Comment author: lifelonglearner 16 March 2017 12:57:39AM 1 point

I'm unsure I have a good internal picture of what sincerity is pointing at. Does being sincere differ much from "truly, actually, super-duper, very much so" believing in something?

Comment author: Benquo 16 March 2017 02:22:11AM 4 points

I think I mean the same thing you mean by "real beliefs, rather than, say, belief-in-belief". So, I'm saying, it's not confirmation bias that causes the good thing, it's sincerity that makes the confirmation bias comparatively harmless.

Comment author: Lumifer 15 March 2017 09:04:42PM 1 point

Maybe psychology research would be better science if it were a field of entertainment, then.

Oh, but it already is.

Just read RealPeerReview on Twitter.

Comment author: Benquo 15 March 2017 09:54:45PM 0 points

Most of that's not experimental psychology. Nor is it stuff originally marketed overtly as entertainment content.

Comment author: lifelonglearner 15 March 2017 02:24:13PM 0 points

The title says that sufficiently sincere confirmation bias is indistinguishable from real science. But I don't see how this differs much from real science (the attitude of the NYU people versus that of scientists).

You say:

What made this work? I think what happened is that they took their own beliefs literally. They actually believed that people hated Hillary because she was a woman, and so their idea of something that they were confident would show this clearly was a fair test.

I'm a little confused. Isn't this just saying that these people held real beliefs, rather than, say, belief-in-belief? So when contrary evidence appeared, they were able to change their mind?

I dunno; I'm not super convinced that it's confirmation bias which causes this sort of good epistemic behavior? (As in, I wouldn't expect this sort of thing in this sort of situation to happen much, and this is maybe unique?)

Comment author: Benquo 15 March 2017 08:14:36PM 1 point

It's sincerity that causes this sort of behavior.
