James_Miller comments on Less Wrong: Open Thread, September 2010 - Less Wrong

Post author: matt 01 September 2010 01:40AM




Comment author: James_Miller 01 September 2010 04:36:53AM 0 points

Eliezer has been accused of delusions of grandeur for his belief in his own importance. But if Eliezer is guilty of such delusions then so am I and, I suspect, are many of you.

Consider two beliefs:

  1. The next millennium will be the most critical in mankind’s existence because in most of the Everett branches arising out of today mankind will go extinct or start spreading through the stars.

  2. Eliezer’s work on friendly AI makes him the most significant determinant of our fate in (1).

Let 10^N represent the average, across our future Everett branches, of the total number of sentient beings whose ancestors arose on Earth. If Eliezer holds beliefs (1) and (2), then he considers himself the most important of these beings, and the probability of this happening by chance is 1 in 10^N. But if (1) holds, then the rest of us are extremely important as well, through how our voting, buying, contributing, writing… influences mankind’s fate. Let’s say that makes most of us one of the trillion most important beings who will ever exist. The probability of this happening by chance is 1 in 10^(N-12).

If N is at least 18, it’s hard to think of a rational criterion under which believing you are 1 in 10^N is delusional whereas thinking you are 1 in 10^(N-12) is not.
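The arithmetic behind this comparison can be sketched explicitly. This is only an illustration of the comment's reasoning; the value N = 18 is the illustrative threshold from the comment, not a measured quantity:

```python
from fractions import Fraction

# Illustrative value from the comment: 10^N future sentient beings.
N = 18

# Prior probability of being the single most important such being, by chance.
p_most_important = Fraction(1, 10 ** N)

# Prior probability of being among the 10^12 (one trillion) most important.
p_top_trillion = Fraction(1, 10 ** (N - 12))

# The two priors differ by exactly a factor of a trillion, independent of N.
ratio = p_top_trillion / p_most_important
print(ratio)  # 1000000000000
```

Exact rational arithmetic (rather than floats) keeps the trillion-fold ratio exact for any N.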

Comment author: JamesAndrix 01 September 2010 05:41:48AM 4 points

(2) is ambiguous. Getting to the stars requires a number of things to go right. Eliezer is of relatively little use in preventing a major nuclear exchange in the next 10 years, or bad nanotech, or garage-made bioweapons, or even UFAI development.

FAI is just the final thing that needs to go right; everything else needs to go mostly right until then.

Comment author: Snowyowl 01 September 2010 11:19:59AM 2 points

And I can think of a few ways humanity can get to the stars even if FAI never happens.

Comment author: KevinC 01 September 2010 05:02:39AM 3 points

Can you provide a cite for the notion that Eliezer believes (2)? Since he's not likely to build the world's first FAI in his garage all by himself, without incorporating the work of any of the thousands of other people working on FAI and its necessary component technologies, I think it would be a bit delusional of him to believe (2) as stated. Which is not to suggest that his work is not important, or even among the most significant work done in the history of humankind (even if he fails, others can build on it and find the way that works). But that's different from the idea that he, alone, is The Most Significant Human Who Will Ever Live. I don't get the impression that he's that cocky.

Comment author: James_Miller 01 September 2010 05:19:27AM 2 points

Eliezer has been accused on LW of having or possibly having delusions of grandeur for essentially believing in (2). See here:

http://lesswrong.com/lw/2lr/the_importance_of_selfdoubt/

My main point is that even if Eliezer believes in (2) we can't conclude that he has such delusions unless we also accept that many LW readers also have such delusions.

Comment author: wedrifid 01 September 2010 04:58:45AM 3 points

If N is at least 18, it’s hard to think of a rational criterion under which believing you are 1 in 10^N is delusional whereas thinking you are 1 in 10^(N-12) is not.

Really? How about "when you are, in fact, 1 in 10^(N-12) and have good reason to believe it"? Throwing in a large N doesn't change the fact that 10^N is still 1,000,000,000,000 times larger than 10^(N-12), nor does it mean we could not draw conclusions about belief (2).

(Not commenting on Eliezer here, just suggesting the argument is not all that persuasive to me.)

Comment author: James_Miller 01 September 2010 05:05:31AM 1 point

To an extremely good approximation one in a million events don't ever happen.

Comment author: wedrifid 01 September 2010 05:11:32AM 2 points

To an extremely good approximation this Everett Branch doesn't even exist. Well, it wouldn't if I used your definition of 'extremely good'.

Comment author: James_Miller 01 September 2010 05:28:29AM 1 point

Your argument seems analogous to the false claim that it's remarkable that a golf ball landed exactly where it did (regardless of where it did land), because the odds of that happening were extremely small.

I don't think my argument is analogous because there is reason to think that being one of the most important people to ever live is a special happening clearly distinguishable from many, many others.

Comment author: gwern 01 September 2010 01:44:17PM 1 point

Yet they are quite easy to generate: flip a coin a few times.

Comment author: Snowyowl 01 September 2010 12:03:21PM 1 point

I agree. Somebody has to be the most important person ever. If Eliezer really has made significant contributions to the future of humanity, he's much more likely to be that most important person than a random person out of 10^N candidates would be.

Comment author: James_Miller 01 September 2010 02:25:24PM 1 point

The argument would be that Eliezer should doubt his own ability to reason if his reasoning appears to lead him to think he is 1 in 10^N. My claim is that if this argument holds, then everyone who believes in (1) and thinks N is large should, to an extremely close approximation, have just as much doubt in their own ability to reason as Eliezer should have in his.

Comment author: Snowyowl 01 September 2010 03:12:45PM 1 point

Agreed. Not sure if Eliezer actually believes that, but I take your point.

Comment author: timtyler 08 September 2010 07:34:51AM 0 points

10^N is still 1,000,000,000,000 times larger than 10^(N-12)

Hear, hear. That is a trillion times more probable!

Comment author: rwallace 01 September 2010 04:54:57PM 2 points

It's not about the numbers, and it's not about Eliezer in particular. Think of it this way:

Clearly, the development of interstellar travel (if we successfully accomplish this) will be one of the most important events in the history of the universe.

If I believe our civilization has a chance of achieving this, then in a sense that makes me, as a member of said civilization, important. This is a rational conclusion.

If I believe I'm going to build a starship in my garage, that makes me delusional. The problem isn't the odds against me being the one person who does this. The problem is that nobody is going to do this, because building a starship in your garage is simply impossible; it's just too hard a job to be done that way.

Comment author: Houshalter 03 September 2010 01:30:04AM 0 points

If I believe I'm going to build a starship in my garage, that makes me delusional. The problem isn't the odds against me being the one person who does this. The problem is that nobody is going to do this, because building a starship in your garage is simply impossible; it's just too hard a job to be done that way.

You assume it is. But maybe you will invent AI and then use it to design a plan for how to build a starship in your garage. So it's not simply impossible; it's just unknown, and even if you could, there's no reason to believe that it would be a good decision. But hey, in a hundred years, who knows what people will build in their garages, or the equivalent thereof. I imagine people a hundred years ago would have found our projects pretty strange.

Comment author: prase 01 September 2010 02:15:03PM 1 point

I think I don't understand (1) and its implications. How does the fact that in most of the branches we go extinct imply that we are the most important couple of generations (this is how I interpret the trillion)? Our importance lies in our decisions. These decisions influence the number of branches in which people die out. If we take (1) as given, it means we weren't successful in mitigating the existential risk, leaving no place to exercise our decisions, and thus no importance.