Kevin comments on That Magical Click - Less Wrong

58 Post author: Eliezer_Yudkowsky 20 January 2010 04:35PM


Comment author: Kutta 20 January 2010 08:55:16PM *  37 points [-]

Is that really just it? Is there no special sanity to add, but only ordinary madness to take away?

I think this is the primary factor. I've got a pretty amusing story about this.

Last week I met a relatively distant relative, a 15-year-old guy who's in a sports-oriented high school. He plays football, has little scientific, literary or intellectual background, and is quite average and normal in most conceivable ways. A TV program on Discovery was about "robots", and in the spontaneous 15-minute conversation that unfolded I managed to explain to him the core problems of FAI without him getting stuck at any point in my arguments. I'm fairly sure that he had no previous knowledge of the subject.

First I made a remark in connection with the TV program's poetic question about what would happen if robots became able to do most human work; I said that if robots took the low-wage jobs, humans would eventually get paid more on average, and that a problem only arises when robots can do everything humans can and somehow end up actually doing all those things.

Then he asked if I thought they'd get that smart, and I answered that it's quite possible in this century. I explained recursive self-improvement in two sentences, to illustrate why they could potentially get very, very smart in a short amount of time. I talked about the technology that would probably allow AIs to act upon the world with great efficiency and power. Next, he said something like "that's good, wouldn't AIs be a big help, like, inventing new medicine?" At this point I was pretty amused. I assured him that AIs indeed have great potential. I then talked very briefly about the most basic AI topics, providing the usual illustrations like Hollywood AIs, smiley-tiled solar systems and foolish programmers overlooking the complexity of value. I delineated CEV in a simplified "redux" manner, focusing on the idea that we should optimally just extract all relevant information from human brains by scanning them, to make sure nothing we care about is left out. "That would be a huge technical problem, scanning that many brains," he said.

And now:

"But if the AI gets so potent, wouldn't it be a problem anyway, even if it's perfectly friendly, that it can do everything much better than humans, and we'll get bored?"

"Hahh, not at all. If you think that getting all bored and unneeded is bad, then that is a real preference inside your head. It'll be taken into account by the AI, and it will make sure not to pamper you excessively."

"Ah, that sounds pretty reasonable".

Now, all of this happened in the course of roughly 15 minutes. No absurdity heuristic, no getting lost, no objections; he just took everything I said at face value, assuming that I was more knowledgeable on these matters, and in general I was convinced that nothing I explained was particularly hard to grasp. He asked relevant questions and was very interested in what I said.

Some thoughts on why this was possible:

  • The guy belongs to a certain social stratum in Hungary, namely those who newly entered the middle class through the free entrepreneurship that became possible after the country switched to capitalism. At first the socialist regime repressed religion and just about every human right, then eased up, softened, and became what's known as the "happiest barrack". People became unconcerned with politics (which they could not influence) and religion (which was thought of as a highly personal matter that should not be taken into public); they just focused on their own wealth and well-being. I'm convinced that the guy's parents care not at all about religion, the absence of religion, doctrine, ideology or whatever. They just work to make a living and don't think about lofty matters, leaving their son ideologically perfectly intact. Just like my own parents.

  • Actually, AI is not intrinsically abstract or hard to digest; my interlocutor knew what an AI is, even if only from movies, and had probably watched just enough Discovery to have a sketchy picture of future technologies. The mind design space argument is not that hard (he knew about evolution because it's taught in school; he immediately agreed that AIs can be much smarter than humans because if we wait a million years, maybe humans can also become much smarter, so it's technically possible), and the smiley-tiled solar system is an entertaining and effective illustration of morality. I think Eliezer has put an extreme amount of effort into maximizing the chance that his AI ideas will get transmitted even to people who are primed or biased against AI, or at risk of motivated skepticism. So far, I've had great success using his parables, analogies and ways of explanation.

  • My perceived status as an "intellectual" made him accept my explanations at face value. He's a football player in a smallish countryside city and I'm a serious college student in the capital (it's good he doesn't know how lousy a student I am). Still, I do not think this was a significant factor. He probably does not talk about AI among football players, but, being male, he has some basic interest in futuristic or gadgety subjects.

In the end, it probably all comes down to lacking some specific kinds of craziness. Cryonics seemed normal at that convention Eliezer attended, and I'm sure every idea that is epistemically and morally correct can in principle be a so-called normal thing. Besides this guy, I've also had full success lecturing a 17-year-old metal drummer on AI and SIAI; he was situated socioeconomically very similarly to the first guy, and he likewise had no previous knowledge.

Comment author: Kevin 22 January 2010 01:30:13PM *  5 points [-]

This is a great post, and I'd be interested in seeing you write out a fuller version of what you said to your relative as a top-level post, something like "Friendly AI and the Singularity explained for adolescents."

Also, do you speak English as a second language? If so, I am especially impressed with your writing ability.

On a tangent, am I the only one who doesn't like the use of "boy", "girl", or "child" to describe adolescents? It seems demeaning, because adolescents are not biologically children; they've just been defined to be children by the state. I suppose I'm never going to overturn that usage, but I'd like to know if there is some reason why I shouldn't be bothered by this common usage of words for children.

Comment author: Kutta 22 January 2010 03:41:40PM *  7 points [-]

Yes, English is a second language for me, and I mostly learned it by reading things on the Internet.

Excuse me for the boy/guy confusion; I did not have any particular intent behind the wording. It was an unconscious application of my native language's tendency to refer to under-18 males with the equivalent of "boy". As I'm mostly a lurker, I have much less writing than reading experience; currently I usually make dozens of spelling and phrasing corrections on longer posts, but some weirdly used words or mistakes are guaranteed to remain in the text.

Comment author: Kevin 22 January 2010 07:55:51PM 2 points [-]

The "boy" usage is correct in English as well; I just don't like it, but I'm out of the mainstream.

Comment author: AdeleneDawner 22 January 2010 02:16:26PM 1 point [-]

On a tangent, am I the only one that doesn't like the usage of boy, girl, or child to describe adolescents?

You're not. I find it demeaning and more than a little confusing.

Comment author: pdf23ds 22 January 2010 02:29:30PM *  2 points [-]

"Child" is probably never OK for people older than 12-13, but "girl", "guy", and occasionally "boy" are usually used by teens, and often by 20-somethings, to describe themselves or each other. ("Boy" is usually used by females, with a sexual connotation.)

Comment author: AdeleneDawner 22 January 2010 02:34:28PM *  3 points [-]

I'm aware of it, and am actually still getting into the habit of referring to women about my age or younger as women rather than girls. I still trip over it when other people use the words that way, though - I automatically think of 8-year-olds if it's not very clear who's being referred to.

Comment author: pdf23ds 22 January 2010 03:10:29PM 3 points [-]

I automatically think of 8-year-olds if it's not very clear who's being referred to.

Right. "Girl" really has at least two distinct senses, one for children and one for peers/juniors of many ages. "Guy" isn't used in the first sense, and the second sense of "boy" is more restricted. The first sense of "boy"/"girl" is the most salient one, and thus the default absent further context. I don't think the first sense needs to poison the second one. But its use in the parent comment of this discussion wasn't all that innocent. (I've been attacked before, by a rather extreme feminist, for using it innocently.)