Comment author: enfascination 23 February 2016 05:52:30AM 11 points [-]

I think a lot about signal detection theory, and it's still the best answer I can come up with for this question. There are false positives and there are false negatives; both are important to keep in mind; the cost of reducing one is an increase in the other; and humans and human systems will always have both.

So, for example, even the most over-generous public welfare system will keep some deserving people off the dole, and even the stingiest system will have some undeserving recipients (by whatever definition). So the question (for a welfare system, say) isn't "how do we prevent abuse?" but "how many abusers are we willing to tolerate for every 100 deserving recipients we reject?" The same framing is useful in lots of medical discussions, legal discussions, pop science discussions, etc.
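To make the tradeoff concrete, here's a minimal sketch with made-up numbers (two overlapping Normal score distributions, nothing to do with any real welfare data): as the admission threshold rises, false positives fall and false negatives rise, and there is no threshold that zeroes out both.

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2)))

# Toy model: "undeserving" applicants have a need score ~ N(0, 1),
# "deserving" applicants ~ N(1, 1); the system admits anyone whose
# score exceeds a threshold t.
def false_positive_rate(t):
    # an undeserving applicant is admitted
    return 1.0 - phi(t)

def false_negative_rate(t):
    # a deserving applicant is rejected
    return phi(t - 1.0)

for t in (-0.5, 0.0, 0.5, 1.0, 1.5):
    print(f"threshold {t:+.1f}: "
          f"FP {false_positive_rate(t):.2f}, FN {false_negative_rate(t):.2f}")
```

Sweep the threshold in either direction and one error rate always pays for the other; the only real choice is the exchange rate.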

If there was one element of statistical literacy that you could magically implant in every head, what would it be?

3 enfascination 22 February 2016 07:53PM

Alternatively, what single concept from statistics would most improve people's interpretations of popular news and daily life events?

Comment author: Terdragon 23 March 2015 04:07:25PM 0 points [-]

Discussing "humility" and "arrogance" is difficult without careful definitions. I was thinking about this recently; this is how I would like to define them.

Whenever I end up feeling like I have been arrogant, it is because I underestimated someone else's abilities, and I ended up surprised by what they were capable of. If humility is the opposite of arrogance, then humility is the ability to accept, as your prior, that somebody you meet just might end up being more wise or more accomplished than you. To be arrogant is to fail to realize that you might have something to learn from other people.

(These definitions of arrogance and humility thus only relate to mental habits, not to social behaviors.)

Note how this makes humility valuable -- if you expect everyone around you to be dumb and inferior and not worth learning from, if you don't give others the chance to prove you otherwise, you're going to miss out on everything that you could be learning from them. I wouldn't expect your putative arrogant academic to have very many fruitful collaborations.

So yes, I would say that arrogance is bad intellectual hygiene -- it's having the wrong priors about the people around you.

Note also that it's also possible to be unfair to oneself in this way. Impostor syndrome should not be confused for humility. High self-esteem should not be confused for arrogance.

... I realize only after writing all of this that there's also intellectual arrogance and intellectual humility; it seems that they can be modeled the same way, but with ideas instead of people.

Comment author: enfascination 28 March 2015 09:37:17AM 1 point [-]

Maybe this says more about me than about the world, but if this was StackOverflow, this comment would get the star. Thanks.

Comment author: FrameBenignly 22 March 2015 04:54:08PM 1 point [-]

I believe you're confusing arrogance and closed-mindedness. There is a correlation between open-mindedness and intelligence. I do believe open-mindedness is a form of rationality. I'm not sure how good rationality training is at increasing it. Personality psychology indicates Openness to Experience is a fairly stable trait, not one that can be taught.

Comment author: enfascination 28 March 2015 09:27:20AM 0 points [-]

"I believe you're confusing arrogance and closed-mindedness."

Well, maybe that's the question. They're different, and you can have one without the other, but do they co-occur more often than chance would predict? Maybe arrogance reduces your exposure to the occasional clever idea that will inevitably come from people you've dismissed. That isn't closed-mindedness exactly; it's something more like as-if-closed-mindedness, but it would come to the same thing.

Clean real-world example of the file-drawer effect

2 enfascination 28 March 2015 09:06AM

I've only ever seen publication bias taught with made-up or near-miss examples.  Has anyone got a really well-documented case in which:

* (About) nine people independently get the idea for the same experiment because it seems like it should be there, and they all see that nothing has been published on it, so they all work on it, and all get a (true) null result.

* The tenth experiment is eventually published, reporting an NHST effect of about p = 0.10.

* The slow (g)rumbling of science surfaces the nine previous, unpublished versions of that experiment and someone catches it and gets it all down, with citations and dates and the specifics of whichever effect these ten people found themselves rooting around for.

 

The most representative real-world example I've seen lately has been Bem/psi, but, as a pedagogical example, I find it too distracting.  The ideal example would report on an effect that's more sympathetic, one where a sharp student or outsider would say, "Yeah, I'd also have expected that effect to come through."
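For what it's worth, the arithmetic behind the ten-labs setup above is easy to check: if the effect is truly null, each lab's p-value is uniform on [0, 1], so the chance that at least one of ten independent labs sees p ≤ 0.10 is about 65%. A quick stdlib-only sketch (toy simulation, my own numbers):

```python
import random

N_LABS = 10
ALPHA = 0.10  # the p = 0.10 threshold from the example above

# Under a true null hypothesis, a p-value is uniform on [0, 1], so:
analytic = 1.0 - (1.0 - ALPHA) ** N_LABS
print(f"P(at least one lab 'finds' p <= {ALPHA}): {analytic:.3f}")

# Monte Carlo check: many rounds of ten null experiments each.
random.seed(0)
rounds = 100_000
hits = sum(
    any(random.random() <= ALPHA for _ in range(N_LABS))
    for _ in range(rounds)
)
print(f"simulated: {hits / rounds:.3f}")
```

So even with everyone doing honest work, the file drawer plus a modest threshold makes one "positive" result among ten attempts the expected outcome, not a surprise.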

 

Thanks.

Comment author: DanielLC 21 March 2015 08:41:02PM 3 points [-]

By "arrogance" do you mean overestimating your abilities, or just being annoying about it? The first one is problematic, but so is underestimating. You should calibrate yourself and get an accurate view of your abilities. The second one seems to be more about some kind of signalling. It has an obvious disadvantage of alienating people, but it must have some kind of advantage or people wouldn't do it.

Comment author: enfascination 21 March 2015 09:00:14PM 0 points [-]

I don't buy arguments of the form "it must be good, otherwise we wouldn't do it," but that's just a quibble. I'd buy a signaling argument, and you're right that I'm not clear on my terms. This is a stab, but the way I think I'm using "arrogance" is: using your high abilities to justify an inflated sense of self-worth. OK, applying that back to the question: I don't see how an inflated sense of self-worth could make you a worse critical thinker. Maybe? I have to think about it more.

Is arrogance a symptom of bad intellectual hygiene?

12 enfascination 21 March 2015 07:59PM

I have this belief that humility is a part of good critical thinking, and that egoism undermines it.  I imagine arrogance as a kind of mind-death.  But I have no evidence, and no good mechanism by which it might be true.  In fact, I know the belief is suspect because I know that I want it to be true: I want to be able to assure myself that this or that intolerable academic will be magically punished with a decreased capacity to do good work. The truth could be the opposite: maybe hubris breeds confidence, and confidence gets results? After all, some of the most important thinkers in history were insufferable.

Is any link, positive or negative, between arrogance and reasoning too tenuous to be worth entertaining? Is humility a pretty word or a valuable habit? I don't know what I think yet. Do you?

Comment author: enfascination 19 November 2013 01:40:25PM 0 points [-]

This paper articulates an attack on the use of conjugate priors in Bayesian analysis: http://projecteuclid.org/DPubS?service=UI&version=1.0&verb=Display&handle=euclid.ba/1340369826 In their words, "conjugate priors may lead to a dogmatic analysis."
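A toy illustration of one thing "dogmatic" can mean here (my own example, not taken from the paper): in the conjugate Normal-Normal model, the posterior mean is a fixed weighted average of the prior mean and the observation, so the prior keeps the same share of influence no matter how wildly the data conflicts with it.

```python
# Conjugate Normal-Normal model: prior theta ~ N(0, tau^2), one
# observation x ~ N(theta, sigma^2). The posterior mean is always
#   E[theta | x] = (sigma^2 * mu0 + tau^2 * x) / (tau^2 + sigma^2),
# a fixed-weight compromise regardless of how extreme x is.
PRIOR_MEAN, PRIOR_VAR = 0.0, 1.0  # mu0, tau^2
NOISE_VAR = 1.0                   # sigma^2

def posterior_mean(x):
    w = PRIOR_VAR / (PRIOR_VAR + NOISE_VAR)  # weight on the data
    return (1 - w) * PRIOR_MEAN + w * x

print(posterior_mean(1.0))    # 0.5 -- a reasonable compromise
print(posterior_mean(100.0))  # 50.0 -- the prior still keeps half
# the weight even though x = 100 is absurd under it; a heavy-tailed
# prior would instead let the data dominate in this kind of conflict.
```

That inability to ever be "overruled" by surprising data is the flavor of dogmatism the quoted line is pointing at, as I read it.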

Sorry for necro.

Comment author: Mitchell_Porter 14 January 2013 01:08:15AM -2 points [-]

Why do you care which scientific theories are right?

Comment author: enfascination 14 January 2013 01:42:26PM 1 point [-]

This post is less about The Truth and more about science as a personal endeavor, as something you do on yourself to become a better thinker, or not.

Comment author: DanArmak 14 January 2013 12:37:22PM 1 point [-]

"I want to be good at being wrong"

Not 'less' wrong? I'm not even sure from reading the rest of your comment whether this is a typo or whether you intended something I didn't understand.

Comment author: enfascination 14 January 2013 01:38:29PM 2 points [-]

Sorry if I wasn't clear. I want to be good at admitting that I was in error, and to collect cases of great thinkers who failed to do that. So yes, I want to be less wrong. Admitting one's errors is a tool in the less wrong toolbox. We like to think we're good at it, that it's easy, that we're detached, but I've seen that being content with my level of self-criticality creates complacency and fosters lapses. These examples demonstrate it.
