Alex_Altair comments on Welcome to Less Wrong! - Less Wrong

48 Post author: MBlume 16 April 2009 09:06AM


Comment author: Alex_Altair 21 July 2010 09:01:24PM 6 points [-]

I recently found Less Wrong through Eliezer's Harry Potter fanfic, which has become my second favorite book. Thank you so much Eliezer for reminding me how rich my Art can be.

I was also delighted to find out (not so surprisingly) that Eliezer was an AI researcher. I have, over the past several months, decided to change my career path to AGI. So many of these articles have been helpful.

I have been a rationalist since I can remember. But I was raised as a Christian, and for some reason it took me a while to think to question the premise of God. Fortunately as soon as I did, I rejected it. Then it was up to me to 1) figure out how to be immortal and 2) figure out morality. I'll be signing up for cryonics as soon as I can afford it. Life is my highest value because it is the terminal value; it is required for any other value to be possible.

I've been reading this blog every day since I found it, and hope to get constant benefit from it. I'm usually quiet, but I suspect the more I read, the more I'll want to comment and post.

Comment author: Vladimir_Nesov 21 July 2010 09:17:32PM *  4 points [-]
Comment author: Alex_Altair 21 July 2010 09:37:45PM 0 points [-]

"AGI is death, you want Friendly AI in particular and not AGI in general."

I'm not sure of the technical definition of AGI, but essentially I mean a machine that can reason. I don't plan to give it outputs until I know what it does.

"'Life' is not the terminal value, terminal value is very complex."

I don't mean that life is the terminal value that all human actions reduce to. I mean it in exactly the way I said above; for me to achieve any other value requires that I am alive. I also don't mean that every value I have reduces to my desire to live, just that, if it comes down to one or the other, I choose life.

Comment author: Vladimir_Nesov 21 July 2010 09:49:38PM *  3 points [-]

If you are determined to read the sequences, you'll see. At least read the posts linked from the wiki pages.

I'm not sure of the technical definition of AGI, but essentially I mean a machine that can reason. I don't plan to give it outputs until I know what it does.

Well, you'll have the same chance of successfully discovering that the AI does what you want as of a sequence of coin tosses spontaneously spelling out the text of "War and Peace". Even if you have a perfect test, you still need for the tested object to have a chance of satisfying the testing criteria. And in this case, you'll have neither, as reliable testing is also not possible. You need to construct the AI with correct values from the start.

I don't mean that life is the terminal value that all human's actions reduce to. I mean it in exactly the way I said above; for me to achieve any other value requires that I am alive.

Acting in the world might require you being alive, but it's not necessary for you to be alive in order for the world to have value, according to your own preferences. It does matter to you what happens with the world after you die. A fact doesn't disappear the moment it can no longer be observed. And it's possible to be mistaken about your own values.

Comment author: JGWeissman 21 July 2010 09:49:34PM *  3 points [-]

I'm not sure of the technical definition of AGI, but essentially I mean a machine that can reason. I don't plan to give it outputs until I know what it does.

I am not sure what you mean by "give it outputs", but you may be interested in this investigation of attempting to contain an AGI.

I don't mean that life is the terminal value that all human's actions reduce to. I mean it in exactly the way I said above; for me to achieve any other value requires that I am alive. I also don't mean that every value I have reduces to my desire to live, just that, if it comes down to one or the other, I choose life.

Then I think you meant that "Life is the instrumental value."

Comment author: Nick_Tarleton 21 July 2010 11:36:40PM 1 point [-]

Then I think you meant that "Life is the instrumental value."

to amplify: Terminal Values and Instrumental Values