JamesAndrix comments on Less Wrong: Open Thread, September 2010 - Less Wrong
Eliezer has been accused of delusions of grandeur for his belief in his own importance. But if Eliezer is guilty of such delusions then so am I and, I suspect, are many of you.
Consider two beliefs:
(1) The next millennium will be the most critical in mankind's existence, because in most of the Everett branches arising out of today mankind will either go extinct or start spreading through the stars.
(2) Eliezer's work on friendly AI makes him the most significant determinant of our fate in (1).
Let 10^N be the average, across our future Everett branches, of the total number of sentient beings whose ancestors arose on Earth. If Eliezer holds beliefs (1) and (2), then he considers himself the most important of these beings, and the probability of this being true by chance is 1 in 10^N. But if (1) holds, then the rest of us are extremely important as well, through how our voting, buying, contributing, writing… influence mankind's fate. Say that makes most of us one of the trillion most important beings who will ever exist. The probability of that being true by chance is 1 in 10^(N-12).
If N is at least 18, it's hard to think of a rational criterion under which believing you are the 1-in-10^N most important being is delusional while believing you are among the 1-in-10^(N-12) most important is not.
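Spelling out the arithmetic behind those odds (this just restates the numbers above in symbols):

\[
P(\text{single most important of } 10^{N} \text{ beings}) = 10^{-N},
\qquad
P(\text{among the top } 10^{12}) = \frac{10^{12}}{10^{N}} = 10^{-(N-12)}.
\]

For N = 18 these come to 10^{-18} and 10^{-6}: the second is a trillion times more probable, but by the same standard both are minuscule priors for a person to hold about themselves.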
(2) is ambiguous. Getting to the stars requires a number of things to go right, and Eliezer is of relatively little use in preventing a major nuclear exchange in the next 10 years, or bad nanotech, or garage-made bioweapons, or even UFAI development.
FAI is just the final thing that needs to go right; everything else needs to go mostly right until then.
And I can think of a few ways humanity could get to the stars even if FAI never happens.