Apteris comments on Superintelligence 7: Decisive strategic advantage - Less Wrong
If someone has money, they can invest it to get more money. What is the difference between money and intelligence that makes an abrupt intelligence explosion plausible, while financial investment returns are expected to grow at a merely steady exponential rate?
While not exactly investment, consider the case of an AI competing with a human to devise a progressively better high-frequency trading strategy. An AI would probably:
I expect the AI's superior capacity to "drink from the fire hose", together with its faster response time, to yield a higher exponent for the growth function than the one resulting from the human's iterative improvement.
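The point can be made concrete with a toy compounding model: both the human's and the AI's strategies improve exponentially, but a higher per-period improvement rate makes the gap between them widen multiplicatively. A minimal sketch (the 1%-vs-5% daily improvement rates are purely illustrative assumptions, not figures from the comment):

```python
def compound(initial, rate, periods):
    """Value after `periods` rounds of improvement at `rate` per round."""
    return initial * (1 + rate) ** periods

# Hypothetical per-day improvement rates (illustrative only).
human_edge = compound(1.0, 0.01, 100)  # ~2.7x after 100 days
ai_edge = compound(1.0, 0.05, 100)     # ~131.5x after 100 days
print(ai_edge / human_edge)            # the ratio itself grows exponentially
```

The dispute is not over whether growth is exponential but over the exponent; under compounding, even a modest difference in rate dominates quickly.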
A more realistic example would be "competing with a human teamed up with a narrow AI".
You're right, that is more realistic. Even so, I get the feeling that the human would have less and less to do as time goes on. I quote:
As another data point, a recent chess contest between a chess grandmaster (Daniel Naroditsky) working together with an older AI (Rybka, rated ~3050) and the current best chess AI (Stockfish 5, rated 3290) ended with a 3.5 - 0.5 win for Stockfish.
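As a sanity check on that result, the standard Elo expected-score formula predicts roughly 0.8 for a 240-point rating gap (3290 vs. ~3050), in the same ballpark as the observed 3.5/4 = 0.875:

```python
def elo_expected_score(rating_a, rating_b):
    """Expected score for player A under the standard Elo logistic model."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400.0))

print(elo_expected_score(3290, 3050))  # ~0.80 expected score for Stockfish
```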
I don't think an article that compares a hedge fund's returns to the Dow (a price-weighted index of only about 30 stocks!) can be considered very credible. And there are fewer quant funds, managing less money, than there were seven years ago.