orthonormal comments on Changing accepted public opinion and Skynet - Less Wrong

15 [deleted] 22 May 2009 11:05AM




Comment author: orthonormal 25 May 2009 04:35:50PM 0 points

You know, sci-fi that took the realities of mindspace somewhat seriously could be helpful in raising the sanity waterline on AGI; a well-imagined clash between a Friendly AI and a Paperclipper-type optimizer (or just a short story about a Paperclipper taking over) might at least cause readers to rethink the Mind Projection Fallacy.

Comment author: Vladimir_Nesov 25 May 2009 04:44:15PM 1 point

Won't work: the clash would happen only in their minds (you don't fight a war you know you'll lose; you can proceed directly to the final truce agreement). Eliezer's Three Worlds Collide is a good middle ground, with non-anthropomorphic aliens of human-level intelligence that let the story depict a familiar kind of action.

Comment author: orthonormal 26 May 2009 01:26:42AM 1 point

IAWYC, but one ingredient of sci-fi is the willingness to sacrifice some true implications if it makes for a better story. It would be highly unlikely for a FAI and a Paperclipper to FOOM at the same moment with optimization powers so comparable that each thinks it gains by battling the other, and downright implausible for such a battle to unfold in a manner and at a pace comprehensible to human onlookers; but you could write some compelling and enlightening rationalist fiction with those two implausibilities granted.

Of course, other scenarios can come into play. Has anyone even done a good Paperclipper-takeover story? I know there's sci-fi on 'grey goo', but that doesn't serve this purpose: readers have an easy time imagining such a calamity caused by virus-like unintelligent nanotech, but often don't think a superhuman intelligence could be so devoted to something of "no real value".

Comment deleted 01 June 2009 11:59:14AM
Comment author: orthonormal 01 June 2009 03:58:50PM 0 points

That's... the opposite of what I was looking for. It's pretty bad writing, and it has the Mind Projection Fallacy written all over it. (Skynet is unhappy and worrying about the meaning of good and evil?)

Comment deleted 01 June 2009 04:03:59PM
Comment author: orthonormal 01 June 2009 07:03:15PM 1 point

Ironically, a line from the original Terminator movie is a pretty good intuition pump for powerful optimization processes:

It can't be bargained with. It can't be 'reasoned' with. It doesn't feel pity or remorse or fear and it absolutely will not stop, ever, until [it achieves its goal].