gwern comments on [LINK] Nick Szabo: Beware Pascal's Scams - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
The Dunning–Kruger effect is likely the product of some general deficiency in the meta-reasoning faculty, leading both to failures of reasoning itself and to failures in evaluating that reasoning. It is extremely relevant to people who proclaim themselves more rational, more moral, and so on than anyone else, yet do not seem to accomplish anything above mediocre performance at fairly trivial but quantifiable tasks.
Hmm. He said the first people to take money, not the first people to tackle the problem.
The first people to explain the universe (and take some contributions for doing so) produced something of negative value; nearly all medicine until the last couple hundred years was not only ineffective but actively harmful, and so on.
If you look at very narrow definitions, of course, the first to tackle nuclear bomb creation did succeed - but the first to tackle the general problem of weapons of mass destruction were various shamans sending curses. If saving people from AI is an easy problem, then we'll survive without SI; if it's a hard problem, then at any rate SI doesn't begin with a letter from Einstein to the government - it begins with a person with no quantifiable accomplishments cleverly employing himself. As far as I'm concerned, there is literally no case for donations here; the donations happen via a sort of decision noise, similar to how NASA spent millions on various antigravity devices, power companies spent millions on putting electrons in hydrogen orbitals below the ground state (see Mills's hydrinos), and millions were invested in Steorn's magnetic engine.
That seems unlikely. Leading both?
Mediocrity is sufficient to push them entirely out of the DK gap; your thinking that DK applies is just another example of what I mean by these being fragile, easily over-interpreted results.
(Besides blatant misapplication, please keep in mind that even if DK had been verified by meta-analysis of dozens of laboratory studies, which it has not, that still only gives a roughly 75% chance that the effect applies outside the lab.)
Without specifics, one cannot argue against that.
So you're just engaged in reference class tennis. ('No, you're wrong because the right reference class is magicians!')
Seems straightforward to me: Eliezer's unwarranted self-importance resulted in his not pursuing an education, or for that matter proper self-education, and simultaneously led him to believe he's awesome and to sell an existential risk reduction that nobody else would sell. edit: The alternative explanation is a level of resistance to self-deception so high that the process of self-education transcended the need to seek objective feedback on one's progress (which one gets if one, e.g., tries to prove mathematical theorems, since there an unintelligent process of checking a proof can validate one's powers of reasoning).
Did it ever occur to you that one has to actually do something incompatible with the broad reference class to get into a much smaller reference class? E.g., you are in the reference class 'people', not the reference class 'people with IQ>=150', unless you take an IQ test or some other test with a very low false positive rate. Likewise, the reference class is 'people with grand promises' until you actually do something that moves you into the microscopic subclass of 'people with grand promises who deliver'.
Suppose one were to grant that for Eliezer. Out of curiosity, I would be interested in hearing how Nick Bostrom & FHI are similarly deluded and in the reference class of magicians.
What is the reasonable probability you think I should assign to the proposition, advanced by some bunch of guys (with at most some accomplishments in the highly non-gradable field of philosophy), led by a person with no formal education, no prior job experience, and no quantifiable accomplishments, that they should be given money to hire more people to develop their ideas on how to save the world from a danger they are uniquely adept at seeing? The prior here is so laughably low that you can hardly find a study so flawed it wouldn't be a vastly better explanation for SI's behavior than its mission statement taken at face value, even before we take into account SI's prior record.
The reference class is not up for grabs. If you want a narrower reference class, you need to substantiate why it should be so narrow.
edit: Actually, sorry, that came across as unnecessarily harsh. But do you recognize that SI genuinely has a huge credibility problem?
Donations to SI only make sense if we assume SI has an extremely rare ability to protect against technological risks. A low prior for anything extremely rare is a tautology, not an opinion. The lack of other alternatives is evidence against SI's cause.
What is this, the second coming of C.S. Lewis and his trilemma? SI must either be completely right and demi-gods who will save us all or they must be deluded fools who suffer from some psychological bias - can you really think of no intermediates between 'saviors of humanity' and 'deluded fools who cannot possibly do any good', which might apply?
I just wanted to point out that invoking DK is an incredible abuse of psychological research and does not reflect well on either you or Dmytry, and now you want me to justify SI entirely...
The existence of alternatives would be evidence against donating too, since what would make you think SI is the best out of all the alternatives? Curious how, either way, one should not donate!