private_messaging comments on Thoughts on the Singularity Institute (SI) - Less Wrong

256 Post author: HoldenKarnofsky 11 May 2012 04:31AM




Comment author: private_messaging 18 May 2012 08:48:30AM 0 points

Yet at bottom, we're probably all (except for private_messaging) thinking the same thing: that FinalState almost certainly has no way of creating an AGI

Nah, I stated that the probability of him creating an AGI is epsilon. My probability for his project hurting me is a microscopic epsilon, while the probability of SI somehow hurting him is a larger epsilon; I only asserted the relation that the latter is larger than the former. The probability of a person going unfriendly is way, way higher than the probability of a person creating an AGI that kills us all.
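The claimed ordering can be written out explicitly. A minimal sketch, using epsilon labels of my own choosing (the commenter only asserts the inequalities, not any particular values):

```latex
% Both probabilities are tiny; the claim is only about their relative order.
% \epsilon_1 = P(\text{FinalState's project hurts me})  -- microscopic
% \epsilon_2 = P(\text{SI somehow hurts FinalState})    -- larger, still small
\[
  0 < \epsilon_1 \ll \epsilon_2 \ll 1
\]
% Separately, the final sentence asserts:
\[
  P(\text{a person goes unfriendly}) \gg P(\text{a person creates an AGI that kills us all})
\]
```

Nothing here pins down absolute magnitudes; the argument rests entirely on the direction of these inequalities.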

I think we're all here making various sarcastic or semi-sarcastic points; mine is that, given SI's stance, AGI researchers would have to (and do) try to keep away from SI, especially those who have some probability of creating an AGI, weighing the probability of a useful contribution from SI against the probability of SI going nuts.

Comment author: Normal_Anomaly 18 May 2012 05:24:59PM 0 points

I never thought you disagreed with:

that FinalState almost certainly has no way of creating an AGI

I actually meant that I thought you disagreed with:

and that no-one involved need feel threatened by anyone else.

Sorry for the ambiguous wording. If you think the probability of SI hurting FinalState is epsilon, then I misunderstood you: I thought you considered it a large enough probability to be worth worrying about and warning FinalState about.