paulfchristiano comments on Superintelligence Reading Group - Section 1: Past Developments and Present Capabilities - Less Wrong

Post author: KatjaGrace 16 September 2014 01:00AM


Comment author: paulfchristiano 16 September 2014 03:50:47AM 5 points

Meanwhile, the intelligence of a single person, even a single genius, taken in isolation and only allowed to acquire limited resources actually is not all that dangerous.

While I broadly agree with this sentiment, I would like to disagree with this point.

I would consider even the creation of a single very smart human, with all human resourcefulness but completely alien values, to be a significant net loss to the world. If they represented 0.001% of the world's aggregate productive capacity, I would expect this to make the world something like 0.001% worse (according to humane values) and 0.001% better (according to their alien values).

The situation is not quite so dire, if nothing else because of gains from trade (if our values aren't in perfect tension) and the ability of the majority to stamp out the values of a minority if it is so inclined. But it's in the right ballpark.
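The proportional reasoning above can be sketched as a toy model. Everything here is my own illustration, not the commenter's: the linearity assumption (influence on outcomes scales with share of productive capacity) and the `trade_gain` parameter (a crude stand-in for gains from trade when values are not in perfect tension) are assumptions for the sake of the sketch.

```python
def expected_value_shift(capacity_share, trade_gain=0.0):
    """Toy linear model: an agent with fully alien values controlling
    `capacity_share` of the world's productive capacity shifts outcomes
    by that fraction -- negative by humane values, positive by alien
    values.  `trade_gain` (0 to 1) crudely discounts the conflict when
    the two value systems can trade instead of fighting.

    Returns (change by humane values, change by alien values)."""
    shift = capacity_share * (1 - trade_gain)
    return -shift, shift

# A single genius-level agent at 0.001% of world capacity:
loss, gain = expected_value_shift(0.00001)
print(loss, gain)  # -> -1e-05 1e-05

# With some gains from trade, the humane loss shrinks accordingly:
loss, gain = expected_value_shift(0.00001, trade_gain=0.5)
print(loss, gain)  # -> -5e-06 5e-06
```

The model is deliberately simplistic: it ignores the majority's ability to suppress a minority's values, which the comment notes would further soften the outcome.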

So while I would agree that broadly human capabilities are not a necessary condition for concern, I do consider them a sufficient condition for concern.