multifoliaterose comments on Other Existential Risks - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
EY argues: "... your smart friends and favorite SF writers are not remotely close to the rationality standards of Less Wrong, and you will no longer think it anywhere near as plausible that their differing opinion is because they know some incredible secret knowledge you don't."
and you respond by saying that there have been people smarter than Eliezer who have suffered rationality failures when working outside their domain? Isn't that kinda the point?
EY wasn't arguing "My IQ is so damn high that I just have to be right. Look at my ability to generate novel hypotheses! It clearly shows high IQ!", which would indeed be foolish. It is understood here that high innate intelligence is not the same as real-world effectiveness, which requires being intelligent about how one uses one's intelligence.
The object of the game here is to evaluate hypotheses that have already been generated (i.e., SIAI's claims). EY was showing that there are many very smart people who can't even evaluate the MWI hypothesis when it is handed to them along with slam-dunk evidence.
If you can't even get the right answer on simple questions, how the heck are you supposed to do better on tough problems than those who see the simple problems as, well... simple?
EDIT: It seems my point did not come across clearly. I am not arguing that it is not an appeal to authority.
I am arguing that high IQ is different from "has lots of knowledge" which is different from "knows the fundamental rules of how to weigh evidence and evaluate claims", and that Eliezer was talking about the last one.
If there had been detailed critical analysis of claims (1) and (2) on Less Wrong or the SIAI website, I would find your comment compelling. But given that no such detailed critical analysis of these significant claims has taken place, I believe that Eliezer's remarks are in fact properly conceptualized as an appeal to authority.
I totally agree that it's an appeal to authority. My point was that it's an appeal to a different and more relevant kind of authority.
Do you disagree with
?
If so, why?
Yes, I mostly disagree.
The first part is giving an example of high IQ not leading to a good existential risk plan, and the second part is saying that you expect that high ability to weigh evidence won't lead to a good plan either.
The counterexample proves that high IQ isn't everything one needs, but overall, I'd still expect it to help. I think "no bearing" is too strong even for an IQ->IQ comparison of that sort.
If you assume you've already been exposed to all the plans people have come up with, then picking the right plan is more a job of claim evaluation than of novel hypothesis generation. For that, you want someone who can evaluate claims like MWI easily. I think this is sufficiently close to the actual situation to make your comparison a poor one.
If I were going to make a comparison to support your point (to the degree that I agree with it), I'd look at more than one person, with more than one strength of intellect, and instead ask "do we really think EY has shown enough to succeed where most talented people fail?". I'd also make it clear whether I'm arguing against him having a 'majority' of the probability mass in his favor or merely a 'plurality' of it. It's a lot easier to argue against the former, but it's the latter that matters more if you have to pick someone to give money to.
But how well does the ability to evaluate evidence connected with quantum mechanics correlate with ability to evaluate evidence connected with existential risk?
See also the thread here