V_V comments on Andrew Ng dismisses UFAI concerns - Less Wrong

3 Post author: OphilaDros 06 March 2015 05:26AM


Comments (22)


Comment author: MrMind 06 March 2015 02:07:26PM 0 points [-]

maybe there will be some AI that turn evil

That's the critical mistake. AIs don't turn evil. If they could, we would have FAI half-solved.

AIs deviate from their intended programming in ways that are dangerous for humans. And it's not thousands of years away; it's as close as a self-driving car crashing into a group of people to avoid a dog crossing the street.

Comment author: V_V 06 March 2015 04:17:09PM *  2 points [-]

AIs deviate from their intended programming in ways that are dangerous for humans. And it's not thousands of years away; it's as close as a self-driving car crashing into a group of people to avoid a dog crossing the street.

But that's a very different kind of issue than AI taking over the world and killing or enslaving all humans.

EDIT:

To expand: all technologies introduce safety issues.
Once we got fire, some people got burnt. This doesn't imply that UFFire (Unfriendly Fire) is the most pressing existential risk for humanity, that we must devote huge amounts of resources to preventing it, or that we should never use fire until we have proved that it will not turn "unfriendly".

Comment author: robertzk 07 March 2015 06:12:05AM *  0 points [-]

However, UFFire does not uncontrollably exponentially reproduce or improve its own functioning. Certainly a conflagration on a planet covered entirely by dry forest would become an unmitigable problem rather quickly.

In fact, in such a scenario, we should dedicate a huge amount of resources to prevent it and never use fire until we have proved it will not turn "unfriendly".

Comment author: Locaha 08 March 2015 10:00:17AM -2 points [-]

However, UFFire does not uncontrollably exponentially reproduce or improve its own functioning. Certainly a conflagration on a planet covered entirely by dry forest would become an unmitigable problem rather quickly.

Do you realize this is a totally hypothetical scenario?

Comment author: MrMind 09 March 2015 08:05:08AM 0 points [-]

Well, there's a phenomenon called "flashover", which occurs in a confined environment when the temperature of a fire becomes so high that all the combustible substances within start to burn and feed the reaction.

Now, imagine that the whole world could become a confined environment for the flashover...

Comment author: V_V 09 March 2015 09:46:46PM 0 points [-]

So we should stop using fire until we prove that the world will not burst into flames?