V_V comments on xkcd on the AI box experiment - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
C => A might also be true to some extent, although it is hard to tell, given that RationalWiki misrepresents lots of things even when good primary sources are available.
My point, however, was that even if EY might be epistemically right about A, C implies that he has no moral high ground from which to complain about people possibly misrepresenting the basilisk after learning about it from a biased secondary source.
That something has a causal influence on something else doesn't mean that doing the first eliminates the moral high ground to complain about the second.
EY bears part of the responsibility for people learning about the basilisk from RationalWiki: due to his censorship, they can't (couldn't?) learn about it from LessWrong, where the primary source would have been available.