gwern comments on Hedging our Bets: The Case for Pursuing Whole Brain Emulation to Safeguard Humanity's Future - Less Wrong

11 Post author: inklesspen 01 March 2010 02:32AM




Comment author: AngryParsley 01 March 2010 03:32:40AM *  -1 points [-]

I agree with a lot of your points about the advantages of WBE vs friendly AI. That said, look at the margins. Quite a few people are already working on WBE. Not very many people are working on friendly AI. Taking this into consideration, I think an extra dollar is better spent on FAI research than WBE research.

Also, a world of uploads without FAI would probably not preserve human values for long. The uploads that changed themselves in ways that let them grow faster (convert the most resources or make the most copies of themselves) would replace uploads that preserved human values. For example, an upload could probably make more copies of itself if it deleted its capacities for humor and empathy.

We already have a great many relatively stable and sane intelligences.

I don't think any human being is stable or sane in the way FAI would be stable and sane.

Comment author: pjeby 01 March 2010 04:02:03AM 1 point [-]

For example, an upload could probably make more copies of itself if it deleted its capacities for humor and empathy.

If you were an upload, would you make copies of yourself? Where's the fun in that? The only reason I could see doing it is if I wanted to amass knowledge or do a lot of tasks... and if I did that, I'd want the copies to get merged back into a single "me" so I would have the knowledge and experiences. (Okay, and maybe some backups would be good to have around). But why worry about how many copies you could make? That sounds suspiciously Clippy-like to me.

In any case, I think we'd be more likely to be screwed over by uploads' human qualities and biases, than by a hypothetical desire to become less human.

Comment author: gwern 01 March 2010 02:41:38PM *  1 point [-]

But why worry about how many copies you could make? That sounds suspiciously Clippy-like to me.

This is, I think, an echo of Robin Hanson's 'crack of a future dawn' scenario, where hyper-Darwinian pressures to multiply cause the discarding of unuseful mental modules, like humor or empathy, which take up space.

Comment author: RobinHanson 02 March 2010 02:58:03AM 2 points [-]

Where do you get the idea that humor or empathy are not useful mental abilities?!

Comment author: gwern 02 March 2010 01:56:16PM *  1 point [-]

From AngryParsley...