taw comments on The Importance of Self-Doubt - Less Wrong

Post author: multifoliaterose 19 August 2010 10:47PM




Comment author: taw 21 August 2010 02:34:24AM 2 points

reduce your 1/100000 figure, esp. if you take only the leaders of the said movement

I already did. There have been a huge number of such movements, most of them highly obscure (not unlike Eliezer). I'd expect some power-law distribution in prominence, so for every one we've heard about there'd be far more we haven't.

I think that if you accept that AGI is "near", then FAI is important to try in order to prevent it

I don't, and the link from AGI to FAI is as weak as the link from oil production statistics to the civilizational collapse the peak-oilers promised.

Comment author: xamdam 22 August 2010 01:21:36AM 0 points

Ok, how close we are to AGI is a prior I don't care to argue about, but don't you think AGI is a concern? What do you mean by a weak link?

Comment author: taw 22 August 2010 04:08:08AM 0 points

What do you mean by a weak link?

The part where the development of AGI fooms immediately into superintelligence and destroys the world. The evidence for it is not even circumstantial; it is fictional.

Comment author: xamdam 22 August 2010 02:05:23PM 1 point

Ok, of course it's fictional: it hasn't happened yet!

Still, when I imagine something that is smarter than the man who created it, it seems it would be able to improve itself. I would bet on that; I do not see a strong reason why this would not happen. What about you? Are you with Hanson on this one?