JoshuaZ comments on Singularity Summit 2010 on Aug. 14-15 in San Francisco - Less Wrong

7 Post author: alyssavance 02 June 2010 06:01AM


Comment author: JoshuaZ 02 June 2010 04:35:21PM 2 points

One major divide on both LW and Overcoming Bias is over estimates of the probability of a Singularity in the near future. However, I'm a bit puzzled by your remark about efficiency. First of all, increases in efficiency can make otherwise impractical technologies practical. For example, the portable tape player, when it originally came out, was not an intrinsically new technology but rather a much more efficient implementation of existing technology. Similarly, computer networks have been around since the 1960s, but the internet had such a major impact because of the increasing efficiency (measured in cost and speed, for example) of computing technologies. If one believes that the advent of AI will lead to AIs that are much smarter than humans, then those AIs should be able to quickly make many technologies much more efficient. To use one example, the primary problem with a space elevator is manufacturing carbon nanotubes cheaply and reliably enough. If an AI can come up with a solution for that, then the cost of going from Earth to orbit will be reduced by a few orders of magnitude. That alone would have a lot of implications. Now, apply the same logic to hundreds of potential technologies.

If you think that friendly AI, even just moderately smart AI, will occur, one can project that it will potentially result in lots of changes.

(I should probably add a disclaimer that I don't assign a Singularity-type event a high probability. If I'm not presenting the position well, please correct me.)