multifoliaterose comments on Best career models for doing research? - Less Wrong

27 Post author: Kaj_Sotala 07 December 2010 04:25PM


Comment author: multifoliaterose 11 December 2010 06:38:39AM *  6 points [-]

It seems to me that the natural effect of a group leader persistently arguing from his own authority is Evaporative Cooling of Group Beliefs. This is of course conducive to confirmation bias and corresponding epistemological skewing for the leader; things which seem undesirable for somebody in Eliezer's position. I really wish that Eliezer was receptive to taking this consideration seriously.

Comment author: wedrifid 11 December 2010 06:44:00AM *  4 points [-]

It seems to me that the natural effect of a group leader persistently arguing from his own authority is Evaporative Cooling of Group Beliefs. This is of course conducive to confirmation bias and corresponding epistemological skewing for the leader; things which seem undesirable for somebody in Eliezer's position. I really wish that Eliezer was receptive to taking this consideration seriously.

The thing is he usually does. That is one thing that has in the past set Eliezer apart from Robin and impressed me about Eliezer. Now it is almost as though he has embraced the evaporative cooling concept as an opportunity instead of a risk and gone and bought himself a blowtorch to force the issue!

Comment author: JGWeissman 11 December 2010 07:29:40AM 2 points [-]

Maybe, given the credibility he has accumulated on all these other topics, you should be willing to trust him on the one issue on which he is asserting this authority and on which it is clear that if he is right, it would be bad to discuss his reasoning.

Comment author: wedrifid 11 December 2010 08:17:51AM 10 points [-]

Maybe, given the credibility he has accumulated on all these other topics, you should be willing to trust him on the one issue on which he is asserting this authority and on which it is clear that if he is right, it would be bad to discuss his reasoning.

The well-known (and empirically verified) weakness in experts of the human variety is that they tend to be systematically overconfident when it comes to judgements that fall outside their area of exceptional performance - particularly when the topic is one just outside the fringes.

When it comes to blogging about theoretical issues of rationality Eliezer is undeniably brilliant. Yet his credibility specifically when it comes to responding to risks is rather less outstanding. In my observation he reacts emotionally and starts making rookie mistakes of rational thought and action. To the point where I've very nearly responded 'Go read the sequences!' before remembering that he was the flipping author and so should already know better.

Also important is the fact that elements of the decision are about people, not game theory. Eliezer hopefully doesn't claim to be an expert when it comes to predicting or eliciting optimal reactions in others.

Comment author: JGWeissman 11 December 2010 05:45:00PM -2 points [-]

Yet his credibility specifically when it comes to responding to risks is rather less outstanding.

We were talking about his credibility in judging whether this idea is a risk, and that is within his area of expertise.

Comment author: wedrifid 11 December 2010 06:19:28PM *  5 points [-]

Was it not clear that I do not assign particular credence to Eliezer when it comes to judging risks? I thought I expressed that with considerable emphasis.

I'm aware that you disagree with my conclusions - and perhaps even my premises - but I can assure you that I'm speaking directly to the topic.

Comment author: David_Gerard 11 December 2010 11:27:54AM *  3 points [-]

I took wedrifid's point as being that whether EY is right or not, the bad effect described happens. This is part of the lose-lose nature of the original problem (what to do about a post that hurt people).

Comment author: XiXiDu 11 December 2010 11:28:41AM *  5 points [-]

Maybe, given the credibility he has accumulated on all these other topics, you should be willing to trust him on the one issue on which he is asserting this authority and on which it is clear that if he is right, it would be bad to discuss his reasoning.

I do not consider this strong evidence, as there are many highly intelligent and productive people who hold crazy beliefs:

  • Francisco J. Ayala, who “has been called the ‘Renaissance Man of Evolutionary Biology’”, is a geneticist ordained as a Dominican priest. His “discoveries have opened up new approaches to the prevention and treatment of diseases that affect hundreds of millions of individuals worldwide…”
  • Francis Collins (geneticist), noted for his landmark discoveries of disease genes and his leadership of the Human Genome Project (HGP), and described by the Endocrine Society as “one of the most accomplished scientists of our time”, is an evangelical Christian.
  • Peter Duesberg (a professor of molecular and cell biology at the University of California, Berkeley) claimed that AIDS is not caused by HIV, which made him so unpopular that his colleagues and others have — until recently — been ignoring his potentially breakthrough work on the causes of cancer.
  • Georges Lemaître (a Belgian Roman Catholic priest) proposed what became known as the Big Bang theory of the origin of the Universe.
  • Kurt Gödel (logician, mathematician and philosopher) suffered from paranoia and believed in ghosts: “Gödel, by contrast, had a tendency toward paranoia. He believed in ghosts; he had a morbid dread of being poisoned by refrigerator gases; he refused to go out when certain distinguished mathematicians were in town, apparently out of concern that they might try to kill him.”
  • Mark Chu-Carroll (PhD computer scientist, works for Google as a software engineer): “If you’re religious like me, you might believe that there is some deity that created the Universe.” He runs one of my favorite blogs, Good Math, Bad Math, and writes a lot on debunking creationism and other crackpottery.
  • Nassim Taleb (author of The Black Swan, 2007; second edition 2010) believes that we can’t track reality with science and equations, that religion is not about belief, and that we were wiser before the Enlightenment because we knew how to take knowledge from incomplete information, whereas now we live in a world of epistemic arrogance. Religious people have a way of dealing with ignorance, by saying “God knows”.
  • Kevin Kelly (editor) is a devout Christian and writes pro-science and pro-technology essays.
  • William D. Phillips (Nobel Prize in Physics 1997) is a Methodist.

I could continue this list with people like Ted Kaczynski or Roger Penrose. I just wanted to show that intelligence and rational conduct do not rule out the possibility of being wrong about some belief.

Comment author: JGWeissman 11 December 2010 05:31:01PM -2 points [-]

How extensively have you searched for experts who made correct predictions outside their fields of expertise? What would you expect to see if you just searched for experts making predictions outside their fields of expertise and then determined whether those predictions were correct? What if you limited your search to experts who had expressed the attitude Eliezer expressed in Outside the Laboratory?

I just wanted to show that intelligence and rational conduct do not rule out the possibility of being wrong about some belief.

"Rule out"? Seriously? What kind of evidence is it?

Comment author: wedrifid 11 December 2010 06:39:00PM *  8 points [-]

"Rule out"? Seriously? What kind of evidence is it?

You extracted the "rule out" phrase from the sentence:

I just wanted to show that intelligence and rational conduct do not rule out the possibility of being wrong about some belief.

From within the common phrase 'do not rule out the possibility' no less!

You then make a reference to '0 and 1 are not probabilities' with exaggerated incredulity.

To put it mildly this struck me as logically rude and in general poor form. XiXiDu deserves more courtesy.

Comment author: JGWeissman 11 December 2010 07:19:07PM 1 point [-]

You extracted the "rule out" phrase from the sentence:

I just wanted to show that intelligence and rational conduct do not rule out the possibility of being wrong about some belief.

From within the common phrase 'do not rule out the possibility' no less!

None of this affects my point that ruling out the possibility is the wrong (in fact impossible) standard.

You then make a reference to '0 and 1 are not probabilities' with exaggerated incredulity.

Not exaggerated. XiXiDu's post did seem to be saying: here are these examples of experts being wrong, so it is possible that an expert is wrong in this case, without saying anything useful about how probable it is for this particular expert to be wrong on this particular issue.

To put it mildly this struck me as logically rude and in general poor form.

You have made an argument accusing me of logical rudeness that, quite frankly, does not stand up to scrutiny.

Comment author: XiXiDu 11 December 2010 06:31:09PM *  3 points [-]

What kind of evidence is it?

Better evidence than I've ever seen in support of the censored idea. I have these well-founded principles, free speech and transparency, and weigh them against the evidence I have in favor of censoring the idea. That evidence is merely 1.) Yudkowsky's past achievements, 2.) his output, and 3.) his intelligence. That intelligent people have been and are wrong about certain ideas while still being productive and right about many other ideas is evidence that weakens #3. That people lie and deceive to get what they want is evidence against #1 and #2, and in favor of transparency and free speech, which are both already more likely to have a positive impact than the forbidden topic is to have a negative impact.

And what are you trying to tell me with this link? I haven't seen anyone stating numeric probability estimates regarding the forbidden topic. And I won't state one either; I'll just say that it is subjectively improbable enough to ignore, because there are possibly too many very-very-low-probability events to take into account (for every being that will harm me if I don't do X there is another being that will harm me if I do X, and the two cancel each other out). But if you'd like to pull some number out of thin air, go ahead. I won't, because I don't have enough data to even calculate the probability of AI going FOOM versus a slow development.
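(A minimal formalization of the cancellation point above; the symbols are assumptions of this sketch rather than anything stated in the thread: p is the probability assigned to each hypothetical threat, H is the harm it would impose, taken to be equal on both sides by symmetry, and U_0 denotes the utility of the ordinary consequences.)

\[ E[U(\text{do } X)] = U_0(\text{do } X) - pH, \qquad E[U(\text{not do } X)] = U_0(\text{not do } X) - pH \]

\[ E[U(\text{do } X)] - E[U(\text{not do } X)] = U_0(\text{do } X) - U_0(\text{not do } X) \]

The speculative-threat terms drop out of the comparison, so under this symmetry assumption the decision turns only on the ordinary consequences, which is the sense in which such threats "cancel each other out".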

Comment author: JGWeissman 11 December 2010 07:05:00PM 0 points [-]

You have failed to address my criticisms of your points: that you are seeking out only examples that support your desired conclusion, and that you are ignoring details that would allow you to construct a narrower, more relevant reference class for your outside view argument.

And what are you trying to tell me with this link?

I was telling you that "ruling out the possibility" is the wrong (in fact impossible) standard.

Comment author: XiXiDu 11 December 2010 07:35:34PM 1 point [-]

You have failed to address my criticisms of your points: that you are seeking out only examples that support your desired conclusion.

Only now do I understand your criticism. I do not seek out examples to support my conclusion but to weaken your argument that one should trust Yudkowsky because of his previous output. I'm aware that Yudkowsky may very well be right about the idea, but I do in fact believe that the risk is worth taking. Have I done extensive research on how often people in similar situations have been wrong? Nope. No excuses here, but do you think there are comparable cases of predictions that proved to be reliable? And how much research have you done in this case and about the idea in general?

I was telling you that "ruling out the possibility" is the wrong (in fact impossible) standard.

I don't; I actually stated a few times that I do not think that the idea is wrong.

Comment author: JGWeissman 11 December 2010 07:47:26PM 0 points [-]

I do not seek out examples to support my conclusion but to weaken your argument that one should trust Yudkowsky because of his previous output.

Seeking out just examples that weaken my argument, when I never predicted that no such examples would exist, is the problem I am talking about.

What made you think that supporting your conclusion and weakening my argument are different things?

Comment author: XiXiDu 11 December 2010 08:09:06PM *  0 points [-]

Seeking out just examples that weaken my argument, when I never predicted that no such examples would exist, is the problem I am talking about.

My reason for weakening your argument is not that I want to be right but that I want feedback about my doubts. I said 1.) that people can be wrong regardless of their previous reputation, 2.) that people can lie about their objectives and deceive by how they act in public (especially when the stakes are high), and 3.) that Yudkowsky's previous output and achievements are not remarkable enough to trust him about some extraordinary claim. You haven't addressed why you tell people to believe Yudkowsky in this case, regardless of my objections.

What made you think that supporting your conclusion and weakening my argument are different things?

I'm sorry if I made it appear as if I hold some particular belief. My epistemic state simply doesn't allow me to arrive at your conclusion. To highlight this I argued in favor of what it would mean not to accept your argument, namely to stand by previously well-established principles like free speech and transparency. Yes, you could say that there is no difference here, except that I do not care about who is right but about what is the right thing to do.

Comment author: Vladimir_Nesov 11 December 2010 10:37:04PM 0 points [-]

I do not seek out examples to support my conclusion but to weaken your argument that one should trust Yudkowsky because of his previous output.

You shouldn't seek to "weaken an argument"; you should seek the actual truth, and then maybe ways of communicating your understanding. (I believe that's what you intended anyway, but think it's better not to say it this way, as a protective measure against motivated cognition.)

Comment author: jsalvatier 11 December 2010 11:09:40PM *  0 points [-]

I like your parenthetical; I often want to say something like this, and you've put it well.

Comment author: shokwave 11 December 2010 05:35:15PM 0 points [-]

Thank you for pointing this out.

Comment author: Vladimir_Nesov 11 December 2010 12:00:04PM 0 points [-]

Taleb quote doesn't qualify. (I won't comment on others.)

Comment author: XiXiDu 11 December 2010 01:05:52PM *  2 points [-]

Taleb quote doesn't qualify. (I won't comment on others.)

I should have made it clearer that it is not my intention to indicate that I believe that those people, or crazy ideas in general, are wrong. But there are a lot of smart people out there who'll advocate opposing ideas. Using their reputation for being highly intelligent as a reason to follow them on their ideas is in my opinion not a very good idea in itself. I could just believe Freeman Dyson that existing simulation models of climate contain too much error to reliably predict future trends. I could believe Peter Duesberg that HIV does not cause AIDS; after all, he is a brilliant molecular biologist. But I just do not think that any amount of reputation is enough evidence to believe extraordinary claims uttered by such people. And in Yudkowsky's case there isn't even much reputation, nor are there great achievements, that would justify a strong belief in his infallibility. What does exist in Yudkowsky's case seems to be strong emotional commitment. I just can't tell if he is honest. If he really believes that he's working on a policy for some future superhuman intelligence that will rule the universe, then I'm going to be very careful. Not because it is wrong, but because such beliefs imply huge payoffs. Not that I believe he is the disguised Dr. Evil, but can we be sure enough to just trust him with it? Censorship of certain ideas bears more evidence against him than in favor of his honesty.

Comment author: multifoliaterose 11 December 2010 06:57:22AM 1 point [-]

Huh, so there was a change? Curious. Certainly, looking over some of Eliezer's past writings, there are some that I identify with a great deal.

Comment author: wedrifid 11 December 2010 07:08:56AM 4 points [-]

Huh, so there was a change?

Far be it from me to be anything but an optimist. I'm going with 'exceptions'. :)