Second Life creators to attempt to create AI

0 Post author: nick012000 09 January 2011 01:50PM

http://nwn.blogs.com/nwn/2010/02/philip-rosedale-ai.html

http://www.lovemachineinc.com/

Should I feel bad for hoping they'll fail? I do not want to see the sort of unFriendly AI that would be created after being raised on social interactions with pedophiles, Goreans, and furries. Seriously, those are some of the more prominent of the groups still on Second Life, and an AI that spends its formative period interacting with them (and the first two, especially) could develop a very twisted morality.

Comments (13)

Comment author: Oscar_Cunningham 09 January 2011 06:40:52PM *  5 points

If it were to FOOM, any social norms it absorbed at all would probably make it better, not worse, in a kind of "∞ minus 1" way.

Comment author: Normal_Anomaly 09 January 2011 07:11:41PM 0 points

Good point, but I'm not entirely sure. Being turned into a Gorean could be worse than being turned into paperclips.

Comment author: khafra 09 January 2011 09:45:44PM 3 points

"Being turned into a Gorean" is within the range of enough human desires to make it a significant subculture, although it's repugnant to a much larger section of the culture. I have never heard of anyone with a fantasy of being turned into paperclips. So which is better seems to depend on how you sum utility over all the involved actors.

Comment author: Vladimir_Nesov 09 January 2011 05:16:46PM 13 points

I do not want to see the sort of unFriendly AI that would be created after being raised on social interactions with pedophiles, Goreans, and furries.

Bad parenting is not even on the list of reasons you don't get a FAI.

Comment author: nick012000 09 January 2011 05:21:21PM 0 points

Oh, they'd almost certainly get an unFriendly AI regardless of how they parented it, but bad parenting could very easily make an unFriendly AI worse. Especially if it interacts a lot with the Goreans, and comes to the conclusion that women want to be enslaved, or something similar.

Comment author: benelliott 09 January 2011 06:32:13PM *  3 points

That probably won't make much of a difference, since there's no reason it should care what anyone wants.

Comment author: katydee 10 January 2011 05:56:49PM *  2 points

If an AI imposed Goreanism on mankind, that would constitute a Friendly AI Critical Failure (specifically, failure 13), not a UFAI.

Comment author: khafra 09 January 2011 03:33:12PM 7 points

At a cursory examination, this attempt qualifies as Not Even Wrong; I wouldn't worry about it.

Comment author: PhilGoetz 11 January 2011 03:06:51AM *  3 points

Upvoted, because I hate to see somebody get downvoted for providing information that is on-topic and short.

Comment author: Normal_Anomaly 09 January 2011 06:59:06PM *  2 points

Should I feel bad for hoping they'll fail?

Not at all. I certainly hope so, and this (from their site) makes it sound very likely that they will:

The Brain. Can 10,000 computers become a person?

Comment author: JoshuaZ 10 January 2011 04:02:15AM 2 points

Downvoting for gratuitous attack on irrelevant subcultures.

Comment author: ewang 11 January 2011 01:52:19AM 0 points

Considering that the goal of this project is synonymous with Strong AI, I don't think that 10,000 computers CAN become a person.