timtyler comments on Ben Goertzel: The Singularity Institute's Scary Idea (and Why I Don't Buy It) - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (432)
This comment will be downvoted, but I hope you people will actually explain yourselves and not just click 'Vote down'; any bot can do that.
Now that I've slept, I have read your comment again, and I don't see any justification for why it got upvoted even once. I never claimed that EY can't ask for money; you are creating a straw man there. You also do not know what I expect from other organisations. Further, it is not fallacious to suspect that Yudkowsky bears some responsibility if people get nightmares from ideas that he would be able to resolve. If he really believes those things, it is of course his right to proclaim them. But the gist of my comment was to inquire about the foundations of those beliefs, and to state that they do not appear to me to be based on evidence, which makes it legally permissible but ethically irresponsible to tell people to worry to such an extent, or to fail to tell them not to worry.
I just don't know how to parse this. I meant what I asked for, and I am not asking for certainty here. I am not doubting evolution or climate change. The problem is that even a randomly picked research paper likely contains more analysis, evidence and references than all of LW and the SIAI's documents combined regarding the risks posed by recursive self-improvement in artificial general intelligence.
The quotes were relevant: they showed that Yudkowsky clearly believes in his own intellectual and epistemic superiority, yet any corroborating evidence seems to be missing. Yes, there is a huge body of writing on rationality and some miscellaneous musings on artificial intelligence. But given the weight he assigns to the idea of risks from AGI, that material is just the cherry on top of marginal issues that do not support the conclusions.
I don't have any difficulty comprehending them either. I am questioning the propositions, the conclusions drawn, and the further speculations built on those premises.
This is ridiculous. I never said you are forced to explain yourself. But you are, if you want people like me to take you seriously.
Yudkowsky is definitely a clever fellow. He may not have fancy qualifications - and he is far from infallible - but he is pretty smart.
In the particular post in question, I am pretty sure he was being silly - which is a rather unfortunate time to be claiming superiority.
However, I don't really know. The stunt created intrigue, mystery, and a sense of the forbidden, and it added to the controversy. Overall, Yudkowsky is pretty good at marketing - and maybe this was a taste of it.
I wonder if his Harry Potter fan-fic is marketing - or how else he justifies it.