XiXiDu comments on Reply to Yvain on 'The Futility of Intelligence' - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (15)
I understand them well enough for the purpose of asking researchers a few questions. My karma score was over 5700 at one point. Do you think that would have been possible without a basic understanding of some of the underlying ideas?
I think this is just unfair. I do not think that my email or the questions I asked were wrong. There is also no way to ask a lot of researchers about this topic without sounding a bit wacky.
All you could suggest is 1) telling them to read the Sequences, or 2) not asking them at all and just trusting Eliezer Yudkowsky. 1) won't work, since they have no reason to suspect that Eliezer Yudkowsky knows some incredible secret they don't. 2) is not an option for me: he could tell me anything about AI, and I would have no way to tell whether he knows what he is talking about.
Yes. I attribute my 18k karma to excessive participation. If I didn't have a clue what I was talking about it would have taken longer, but I would have collected thousands of karma anyway just by writing many comments with correct grammar.
Karma - that is, a user's total karma - means very little.
I'd kinda like to see it expressed as (total karma / total posts); that might help a little bit.
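The ratio above is easy to sketch. Here is a minimal, hypothetical illustration in Python; the user names and numbers are made up, and roughly mirror the point in the thread: a prolific commenter can accumulate far more total karma while averaging less per post.

```python
def karma_per_post(total_karma, total_posts):
    """Average karma per post; returns 0.0 for users with no posts."""
    if total_posts == 0:
        return 0.0
    return total_karma / total_posts

# Made-up example users (not real account data):
users = {
    "prolific": {"karma": 18000, "posts": 9000},   # huge total, modest average
    "selective": {"karma": 5700, "posts": 950},    # smaller total, higher average
}

for name, u in users.items():
    print(name, karma_per_post(u["karma"], u["posts"]))
```

Total karma ranks the prolific user far ahead, while karma per post ranks the selective one ahead, which is exactly why the ratio "might help a little bit" as a signal.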