loqi comments on How to get that Friendly Singularity: a minority view - Less Wrong

12 Post author: Mitchell_Porter 10 October 2009 10:56AM




Comment author: loqi 13 October 2009 07:44:18AM *  1 point

Can you send me a model?

No, sorry, the above comment was just my attempt to explain my objection as unambiguously as possible.

It seems obvious to me that if we embarked on a singleton strategy, even with the best of intentions, there would then be N+1 AGI projects, each increasing existential risk, and our best intentions might not outweigh that increase.

Yes, but your "N+1" hides some important detail: our effective contribution to existential risk diminishes as N grows, while our contribution to safer outcomes stays constant or even grows (if our work has a positive impact on someone else's "winning" project).

I think my objection is to the binariness of the possible strategies node, but I'm not sure how to express that best in your model. [...] They would try to decrease existential risk both directly (e.g. build tools for the AGI projects that reduce the chance of the AGI projects going wrong) and indirectly, by not contributing to the problem.

Since you were making the point that attempting to build Friendly AGI contributes to existential risk, I thought it fair to factor out other actions. The two strategies you outline above are entirely independent, so they should be evaluated separately. I read you as promoting the latter strategy on its own when you say:

By explicitly going for general-purpose, no-human-dependencies, and indefinitely self-improvable, you're building in exactly the same elements that you suspect are dangerous.

The choice under consideration is binary: Attempt a singleton or don't. Safety strategies may also be worthwhile, but I need a better reason than "they're working toward the same goal" to view them as relevant to the singleton question.