
Here's an article on Engadget about the AMA: http://www.engadget.com/2015/10/09/stephen-hawking-ai-reddit-ama/

A 5K+ karma AMA on Reddit, and an article on a mainstream gadget website, discussing AI safety and even citing Steve Omohundro right in the article. This is a huge success: serious discussion of AI risk is now officially mainstream. It makes me proud that I was a part of this success as an SIAI donor.

I like this quote by Stephen Hawking from one of his answers:

The real risk with AI isn’t malice but competence.

I think he got the point across well. For some of the common responses, I kind of wish there were a much, much shorter and easier-to-digest version of The Hidden Complexity of Wishes that could fit into a reddit post.

I know that people don't like to read long texts; the longer the text, the more likely people are to skip it. So I've tried to edit it down to a cliff-notes version that doesn't go into probability pumps and the like.

The Hidden Complexity of Wishes also uses too many Less Wrong-specific terms.

http://lesswrong.com/lw/ld/the_hidden_complexity_of_wishes/

There are three kinds of genies: Genies to whom you can safely say "I wish for you to do what I should wish for"; genies for which no wish is safe; and genies that aren't very powerful or intelligent.

Suppose your aged mother is trapped in a burning building, and it so happens that you're in a wheelchair; you can't rush in yourself. You could cry, "Get my mother out of that building!" but there would be no one to hear. Luckily you have, in your pocket, a magic lamp with a genie. Unfortunately it does not have Robin Williams's voice, doesn't have a human mind or human morality, and is constrained to fulfilling wishes through reasonably physically plausible means.

BOOM! With a thundering roar, the gas main under the building explodes. As the structure comes apart, in what seems like slow motion, you glimpse your mother's shattered body being hurled high into the air.

This is a genie of the second class. No wish is safe. It's smart, but smart is not the same thing as sharing all your values: you get what you wish for, not what you want.

You are human. If someone asked you to get their poor aged mother out of a burning building, you might help, or you might pretend not to hear. But it wouldn't even occur to you to explode the building. "Get my mother out of the building" sounds like a much safer wish than it really is, because you don't even consider the plans to which you assign extremely negative value.

Perhaps you just have to cover all the bases....

"I wish to move my mother (defined as the woman who shares half my genes and gave birth to me) to outside the boundaries of the building currently closest to me which is on fire; but not by exploding the building; nor by causing the walls to crumble so that the building no longer has boundaries; nor by waiting until after the building finishes burning down for a rescue worker to take out the body..."

All these special cases, this seemingly unlimited number of required patches, should hint to you that this is not a good approach. Miss even one of the thousands of special cases and you're likely to end up with a horrible outcome.

If your mother's foot is crushed by a burning beam, is it worthwhile to extract the rest of her? What if her head is crushed, leaving her body? What if her body is crushed, leaving only her head? Is Terri Schiavo a person? How much is a chimpanzee worth?

We value many things, and no, they are not reducible to valuing happiness or valuing reproductive fitness.

The only safe genie is a genie that shares all your judgment criteria, and at that point, you can just say "I wish for you to do what I should wish for."

To be a safe fulfiller of a wish, a genie must share the values that led you to make the wish; otherwise it may fail to exclude the kinds of horrible side effects that would have led you never to consider such a plan in the first place.

Giving a goal to an advanced AI that can potentially improve itself is like making a wish, and we don't yet know a safe way of giving an AI the instruction "I wish for you to do what I should wish for."

I never liked that article. It says "there are three types of genies", and then, rather than attempting to prove the claim or argue for it, it just provides an example of a genie for which no wish is safe. I mean, fine, I'm convinced that specific genie sucks. But there may well be other genies that don't know what you want but have the ability to give it to you if you ask (when I was 5 years old, my mom was such a genie).

when I was 5 years old, my mom was such a genie

Your mother was human, and her goals were likely very tightly aligned with yours. She was probably an extremely safe genie, likely more concerned with what you actually needed than with what you asked for.

If a 5-year-old asked for a metric ton of candy and a bag of huge fireworks, she's likely such a safe genie that she'd simply not grant the wish, even if it was what you really wanted, even if she fully understood why you wanted it, and even if she was fully capable of granting it.

At age 5 you could safely wish for "I wish for you to do what I should wish for" and at worst you'd be a little disappointed if what she came up with wasn't as fun as you'd have liked.

At age 5 you could safely wish for "I wish for you to do what I should wish for" and at worst you'd be a little disappointed if what she came up with wasn't as fun as you'd have liked.

I would have gotten the wrong flavor of ice cream. It was strictly better to specify the flavor of ice cream I preferred. Therefore, the statement about the three types of genies is simply false. It might be approximately true in some sense, but even if it is, the article never gives any arguments in favor of that thesis; it simply gives one example.

Wait, to be clear, you're calling getting the wrong flavor of ice cream a "safety" issue?

Do you have any examples that actually fall outside the 3 types? Your mother is likely not very powerful, nor is she a superintelligence. So far the only example you've given falls squarely in the third category, but even scaled up it would probably fit quite well in the first.

I'd also note that the claim you're taking issue with is a metaphor for explaining things; he's not claiming that magical genies of any category actually exist.

I'm making 2 points:

  1. His metaphor completely fails conceptually, because I'm perfectly capable of imagining genies that fall outside the three categories.

  2. Perhaps the classification works in some other setting, such as AIs. However, the article never provided any arguments for this (or any arguments at all, really). Instead, there was one single example (seriously, just one example!) which was then extrapolated to all genies.

Ok, so do you actually have any examples that fall outside the 3 categories?

1. Powerful + safe

2. Powerful + unsafe

3. Not very powerful, such that it doesn't matter much whether they're safe or unsafe

Examples of what? Of hypothetical intelligent minds? I feel like there are examples all over fiction; consider genies themselves, which often grant wishes in a dangerous way (but you can sometimes get around it by speaking carefully enough). Again, I agree that some genies are never safe and some are always safe, but it's easy to imagine a genie which is safe if and only if you specify your wish carefully.

Anyway, do you concede the point that EY's article contains no arguments?

If you have to speak "carefully enough", then you're taking a big risk: though you may luck out and get what you want, they're not safe.

EY's article contains arguments; you just seem to have picked up on something that wasn't what he was arguing about.

It's like someone starting a speech with "Good evening, ladies and gentlemen" and your criticism being that he failed to prove that it was evening, failed to prove that there was a mix of genders in the audience, and that the entirety of the rest of the speech failed to contain any arguments about whether the men in the audience were in fact gentlemen.

It contained a very clear and well-made argument for why simply trying to word your wish carefully was a fool's errand.

You may notice how it starts with an overly complex wish from the "Open-Source Wish Project". It then gives examples of how simply adding clauses to the wish to get your mother out doesn't help much: as a human you value so many things that you'd have to add thousands of disclaimers, clauses, and rules, and missing even one could mean disaster (from your point of view), which is extremely unsafe.

If you have to speak "carefully enough", then you're taking a big risk: though you may luck out and get what you want, they're not safe.

If your argument is that unless a powerful being is extremely safe, then they're not extremely safe, this is true by definition. Obviously, if a genie sometimes doesn't give you what you want, there is some risk that the genie won't give you what you want. I thought a more substantial argument was being made, though: it sounded like EY was claiming that saying "I wish for whatever I should wish for" is always better than every other wish. This claim is certainly false, due to the "mom" example. So I guess I'm left unsure what the point is.

I see you've not bothered reading any of my replies and instead just made up your own version in your head.

Your mom example falls quite cleanly into the third category, if it doesn't fall cleanly into the first.

Unless a powerful being understands your values well enough to take them into account, and actually wants to take them into account, it's not extremely safe. Yes.

Believe it or not, there are a lot of people who'll insist that that's not the case, or insist that you just have to wish carefully enough; hence the need for the article.

I see you've not bothered reading any of my replies and instead just made up your own version in your head.

I read all of your replies. What are you referring to? Also, this is uncharitable/insulting.

Believe it or not, there are a lot of people who'll insist that that's not the case, or insist that you just have to wish carefully enough; hence the need for the article.

To be honest, I'm not sure what we're even disagreeing about. Like, sure, some genies are unsafe no matter how you phrase your wish. For other genies, you can just wish for "whatever I ought to wish for". For still other genies, giving some information about your wish helps.

If EY's point was that the first type of genies exist, then yes, he's made it convincingly. If his point is that you never need to specify a wish other than "whatever I ought to wish for" (assuming a genie is powerful enough), then he failed to provide arguments for this claim (and the claim is probably false).