
Jiro comments on Open thread, Dec. 22 - Dec. 28, 2014 - Less Wrong Discussion

5 Post author: Gondolinian 22 December 2014 02:34AM




Comment author: Jiro 28 December 2014 03:54:00AM 2 points

Folks will ask questions like "how do we balance the usefulness of energy against the danger to the environment from using energy". And the answer is "we should never get into a situation where we have to make that choice".

Of course, anyone who actually gave that answer to that question would be speaking nonsense. In a non-ideal world, you sometimes can't maximize or minimize two things simultaneously. It may not be possible to endanger neither the passengers nor the pedestrians, just as it may not be possible to keep using energy without ever endangering the environment. It's exactly the wrong answer.

Comment author: fubarobfusco 28 December 2014 04:54:47AM 1 point

Sure, you want to make sure the behavior in a no-win situation isn't something horrible. It would be bad if the robot realized that it couldn't avoid a crash, had an integer overflow on its danger metric, and started minimizing safety instead of maximizing it. That's a thing to test for.
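The overflow failure described above can be sketched in a few lines. This is a toy illustration, not any real planner: it assumes a hypothetical system that stores its danger score in a 16-bit signed accumulator, and all option names and numbers are made up.

```python
def to_int16(x):
    """Wrap an unbounded integer into two's-complement 16-bit range."""
    x &= 0xFFFF
    return x - 0x10000 if x >= 0x8000 else x

# Hypothetical per-option danger estimates; the planner picks the minimum.
options = {
    "brake_hard": 20_000,
    "swerve_left": 25_000,
    "plow_ahead": 35_000,  # exceeds the int16 maximum of 32_767
}

# After wrapping, 35_000 becomes -30_536, so "minimize danger" now
# selects the genuinely most dangerous option.
scored = {name: to_int16(d) for name, d in options.items()}
worst_choice = min(scored, key=scored.get)
```

The point of the sketch is only that an objective function can silently invert at the numeric boundary, which is exactly the kind of no-win behavior worth testing for.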

But consider the level of traffic fatalities we have today.

How much could we reduce that level by making drivers who are better at making moral tradeoffs in an untenable, no-win, gotta-crash-somewhere situation ... and how much could we reduce it by making drivers who are better at avoiding untenable, no-win, gotta-crash-somewhere situations in the first place?

I suggest that the latter is a much larger win — a much larger reduction in fatalities — and therefore far more morally significant.
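The comparison above can be made concrete with back-of-envelope arithmetic. All figures here are hypothetical, chosen only to show the shape of the argument: if the vast majority of fatal crashes arise in avoidable situations, a modest improvement in avoidance outweighs even a large improvement in no-win decision-making.

```python
# Hypothetical annual fatality count, split by assumption into
# avoidable situations (95%) and genuine no-win situations (5%).
TOTAL = 30_000
avoidable = TOTAL * 95 // 100        # 28_500 in avoidable situations
no_win = TOTAL - avoidable           # 1_500 in no-win situations

# Assume 20% better crash avoidance vs. 50% better no-win tradeoffs.
saved_by_avoidance = avoidable * 20 // 100
saved_by_tradeoffs = no_win * 50 // 100
```

Under these assumed numbers, better avoidance saves 5,700 lives against 750 for better tradeoff-making, which is the latter-is-larger claim in quantitative form.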