katydee comments on The Singularity Institute's Arrogance Problem - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Thought experiment
If the SIAI were a group of self-interested/self-deceiving individuals, similar to New Age groups, who had made up all this stuff about rationality and FAI as a cover for fundraising, what different observations would we expect?
I would expect them to:
SIAI does not appear to fit 1 (I'm not sure what the standard is here), certainly does not fit 2 or 3, debatably fits 4, and certainly does not fit 5 or 6. 7 is highly debatable, but I would argue that the Sequences and other rationality material are clearly valuable, if somewhat obtuse.
That goes for self-interested individuals with high rationality, purely material goals, and very low self-deception. The self-deceived case, on the other hand, is the people whose self-interest includes 'feeling important' and 'believing oneself to be awesome' and perhaps even 'taking a shot at becoming the saviour of mankind'. In that case you should expect them to see awesomeness in anything that might possibly be awesome (various philosophy, various confused texts that might be becoming mainstream for all we know, you get the idea), combined with an absence of anything that is definitely awesome and can't be trivial (a new algorithmic solution to a long-standing, well-known problem that others have worked on, something practically important enough, etc.).