JGWeissman comments on Ben Goertzel: The Singularity Institute's Scary Idea (and Why I Don't Buy It) - Less Wrong

32 Post author: ciphergoth 30 October 2010 09:31AM




Comment author: JGWeissman 02 November 2010 04:59:59PM

The law of comparative advantage relies on some implicit assumptions that are not likely to hold between a superintelligence and humans:

The transaction costs must be small enough not to negate the gains from trade. For a superintelligence, issuing a request to slow-thinking humans, waiting for their reply (possibly with processes sitting idle in the meantime), and integrating the result may cost more than just doing the task itself.
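A toy calculation makes the structure of this point concrete. All the numbers below are made up purely for illustration; nothing in the comment specifies them:

```python
# Hypothetical productivities, chosen only to illustrate the structure
# of the transaction-cost argument.
ai_a_per_hour = 100.0   # AI output of good A per hour
ai_b_per_hour = 50.0    # AI output of good B per hour
b_needed = 50.0         # units of good B required

# Cost of self-production, measured in foregone units of A:
# producing 50 B takes the AI one hour, which would have yielded 100 A.
self_cost_in_a = (b_needed / ai_b_per_hour) * ai_a_per_hour  # 100.0

def net_gain_from_trade(overhead_in_a):
    """Net gain (in units of A) from delegating good B to a human,
    given coordination/idle-time overhead measured in foregone A."""
    return self_cost_in_a - overhead_in_a

print(net_gain_from_trade(10.0))   # 90.0  -> trade still pays
print(net_gain_from_trade(150.0))  # -50.0 -> overhead negates the gains
```

The human's absolute productivity never appears in the final comparison: even a partner with a genuine comparative advantage is worthless to trade with once the per-trade overhead exceeds the opportunity cost of self-production.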

Your trading partner must not have the option of building a more desirable trading partner out of your component parts. A superintelligence could get more productivity out of atoms arranged as an extension of itself than out of atoms arranged as humans. (ETA: See Nesov's comment.)
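The same toy framing (numbers again entirely hypothetical) shows why this second assumption matters. Comparative advantage compares trade against autarky, but it is silent about a third option: rearranging the partner's atoms.

```python
# Hypothetical outputs per hour from the same fixed stock of atoms,
# purely illustrative.
output_as_human_partner = 2.0    # atoms arranged as a human to trade with
output_as_ai_extension = 500.0   # same atoms arranged as AI hardware

# Standard comparative-advantage reasoning only compares trading
# against going without trade; it never considers this option.
def best_use_of_atoms():
    return max(output_as_human_partner, output_as_ai_extension)

print(best_use_of_atoms())  # 500.0 -> trade is dominated by repurposing
```

Under these assumed numbers, trading with the human is still better than nothing, yet strictly dominated by converting the atoms, which is exactly the gap in the implicit assumption.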