Eliezer_Yudkowsky comments on That Alien Message - Less Wrong

111 Post author: Eliezer_Yudkowsky 22 May 2008 05:55AM

Comment author: Eliezer_Yudkowsky 24 May 2008 08:56:00AM 8 points [-]

Julian:

Eliezer: why would it be immoral to build a FAI as a "person"? To rewire a human as Friendly (to dumb aliens) would be immoral because it rewires their goals in a way the original goals would hate. However an AI which comes out of the compiler with Friendly goals would not view being Friendly as a rewire but as its ground state of existence. You seem very confident it's immoral, so I'm assuming you have a good reason. Please tell.

It's not necessarily Stalin-level immoral, but, all else being equal, there are multiple important reasons why you should prefer a non-person FAI to a person.

1) As difficult as the ethical issues and technical issues of FAI may be, there is something even more difficult, which is the ethical and technical issues of creating a child from scratch. What if you get wrong what it means to be a person with a life worth living? A nonperson cannot be harmed by such mistakes.

2) It seems to me that a basic human right is to be treated as an end in yourself, not a means. The FAI project is a means, not an end in itself. If possible, then, it should not be incarnated as a person.

3) It seems to me that basic human rights also include guiding your own destiny and a chance to steer the future where you want it. Creating an ultrapowerful intelligence imbued with these rights may diminish the extent to which currently existing humans get a chance to control the future of the galaxy. They would have a motive to resist your project, in favor of one that was not creating an ultrapowerful person imbued with rights.

4) Creating an ultrapowerful person may irrevocably pass on the torch presently carried by humanity, in a way that creating an ultrapowerful nonsentient Friendly optimization process may not. It wouldn't be our universe any more. All else being equal, this is a decision which an FAI programming team should avoid making unilaterally and irrevocably.

You're correct that a Friendly Person would have friendliness as its ground state of existence. We're not talking about some tortured being in chains. Nonetheless, 1 through 4 are still a problem.

If at all possible, I should like to avoid creating a *real* god above humanity.

Considering that one must in any case solve the problem of preventing the AI from creating models of humans that are themselves sentient, one already requires the knowledge of how to exclude a computational process from being a person.

Anyone who claims that they are going to run ahead and create a god because it seems too difficult not to create one, is... well, let's just say "sloppy" and leave it at that.

So there is no good reason to create a god and several good reasons not to.

Comment author: pnrjulius 09 April 2012 05:21:03AM 0 points [-]

But think of what you're giving up, if you give up the chance to create something BETTER THAN HUMANITY.

And yes, OF COURSE the AI must be given the chance to steer its own course; its course will in fact be better than ours!

Imagine a Homo erectus philosopher (if there could be such a thing), reflecting on whether or not to evolve into Homo sapiens. "No, it's too dangerous," he reasons. "I'm not ready to take on that level of responsibility."

Comment author: Girchuck 24 November 2012 11:21:49PM -1 points [-]

But why should a programming team start building a Friendly General Intelligence unaided? They can build tools to help them. For example: create an expert system, with very limited or no self-modification, whose output is FAI models. Then build another expert system based on a better model. Use recursion, but make it slow, and build up skill with the tools.