olalonde comments on Irrationality Game II - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
One of the most direct methods for an agent to increase its computing power (does this translate into an increase in intelligence, even logarithmically?) is to increase the size of its brain. This has no inherent upper limit, only limits imposed by running out of matter and the like, which I consider uninteresting.
I don't think that's so obviously true. Here are some possible arguments against that theory:
1) There is a theoretical upper limit on the speed at which information can travel (the speed of light). A sufficiently large "brain" will eventually be limited by its internal communication latency.
2) Some computational problems are so hard that even an extremely powerful "brain" would take an impractically long time to solve them (http://en.wikipedia.org/wiki/Computational_complexity_theory#Intractability).
3) There are physical limits to computation (http://en.wikipedia.org/wiki/Bremermann%27s_limit). Bremermann's limit is the maximum computational speed of a self-contained system in the material universe. At this limit, a computer with the mass of the Earth would still take on the order of 10^72 years to crack a 512-bit key. In other words, even an Earth-sized AI could not break modern encryption by brute force.
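To sanity-check points 1 and 3, here is a rough back-of-the-envelope calculation. This is only a sketch: the physical constants, the Earth's mass and diameter, and the "Earth-mass computer running at Bremermann's limit" scenario are assumptions drawn from the linked Wikipedia articles, not from this thread.

```python
# Point 1: round-trip signal latency across an Earth-sized "brain".
c = 2.998e8                # speed of light, m/s
earth_diameter = 1.2742e7  # mean diameter of Earth, m
latency = 2 * earth_diameter / c  # round-trip time, seconds
# latency is roughly 0.085 s -- orders of magnitude slower than
# signal travel times inside a human-sized brain.

# Point 3: brute-forcing a 512-bit key at Bremermann's limit.
bremermann = 1.36e50       # bits processed per second per kg (c^2 / h)
earth_mass = 5.972e24      # kg
ops_per_second = bremermann * earth_mass  # ~8.1e74 operations/s
keys = 2 ** 512            # worst case: try every possible key
seconds = keys / ops_per_second
years = seconds / 3.156e7  # ~seconds per year
# years comes out on the order of 10^71 to 10^72, consistent with
# the figure quoted above.
```

Even granting one key trial per bit operation (wildly optimistic for the attacker), the 512-bit search space dwarfs the available compute by hundreds of orders of magnitude.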
More theoretical limits here: http://en.wikipedia.org/wiki/Limits_to_computation