ChrisHallquist comments on Dr. Jubjub predicts a crisis - Less Wrong Discussion

Post author: Apprentice, 10 January 2014 03:52PM (50 points)

Comments (65)

Comment author: ChrisHallquist 11 January 2014 05:27:17AM 4 points

If I believed that, I would forget about AI and x-risk and just focus on third-world poverty.

Comment author: Lumifer 11 January 2014 05:53:23AM 1 point

Well, it's up to you to decide how much the uncertainty of outcomes should influence your willingness to act. It's OK to think it's worthwhile to follow a certain path even if you don't know where it would ultimately lead.

Comment author: ChrisHallquist 11 January 2014 05:59:45AM 3 points

"Uncertainty" is different from "no clue." Or maybe I'm assuming too much about what you mean by "no clue" - to my ear it sounds like saying we have no basis for action.

Comment author: Lumifer 11 January 2014 02:55:33PM 1 point

Large amounts of uncertainty, including the paradoxical possibility of black swans, == no clue.

"it sounds like saying we have no basis for action"

You have no basis for action if you are going to evaluate your actions by their consequences a hundred years from now.

Comment author: [deleted] 11 January 2014 05:52:00AM 0 points

You don't have more information about the hundred-year effects of your third-world poverty options than you do about the hundred-year effects of your AI options.

Comment author: ChrisHallquist 11 January 2014 06:00:46AM 6 points

The effects of work on AI are all about the long run. Working on third-world poverty, on the other hand, has important and measurable short-run benefits.

Comment author: [deleted] 11 January 2014 06:02:24AM 3 points

Good point!