timtyler comments on Hanson Debating Yudkowsky, Jun 2011 - Less Wrong

Post author: XiXiDu 03 July 2011 04:59PM


Comment author: timtyler 06 July 2011 06:32:40PM

It's not a brain in a box in a basement - and it's not one grand architectural insight - but I think the NSA shows how a secretive organisation can get ahead and stay ahead, if it is big and well-funded enough. Otherwise, public collaboration tends to get ahead and stay ahead, along similar lines to those Robin mentions.

Google, Apple, Facebook etc. are less-extreme versions of this kind of thing, in that they keep trade secrets which give them advantages - and don't contribute all of these back to the global ecosystem. As a result they gradually stack up know-how that others lack. If they can get enough of that, then they will gradually pull ahead - if they are left to their own devices.

Whether a company will eventually pull ahead has quite a bit to do with anti-trust legislation - as I discuss in One Big Organism.

Whether one government will eventually pull ahead is a bit different. There's no government-level anti-trust legislation. However, expansionist governments are globally frowned upon.

I don't think there are too many other significant players besides companies and governments.

The "silver bullet" idea doesn't seem to be worth much. As Eray says: "Every algorithm encodes a bit of intelligence". We know that advanced intelligence is necessarily highly complex: you can't predict a complex world without being that complex yourself. Of course, human intelligence might be relatively simple - in which case it might only take a few leaps to get to it. The history of machine intelligence fairly strongly suggests a long, gradual slog to me - but it is at least possible to argue that people have been doing it all wrong so far.