Thank You for Your Participation
I would like to thank you all for your unwitting and unwilling participation in my little social experiment. If I do say so myself, you all performed as I had hoped. I found some of the responses interesting; many of them were goofy. I was honestly hoping that a budding rationalist community like this one would have stopped this experiment midway, but I thank you all for not being that rational. I really did appreciate all the mormon2 bashing; it was quite amusing, and some of the attempts to discredit me were humorous, though unsuccessful. As for the questions I asked, I was curious about the answers, though I did not expect to get any, nor do I really need them, since I have a good idea of what the answers are from simple deductive reasoning. I really do hope EY is working on FAI and is actually able to do it, though I certainly will not stake my hopes or money on it.
Lest there be any suspicion: I am being sincere here.
Response
Because I can, I am going to make one final response to this thread I started:
Since none of you understand what I am doing, I will spell it out for you. My posts are formatted, written, and styled intentionally to produce the response I desire. The point is to give you easy ways to avoid answering my questions (things like the tone of the post, spelling, grammar, being "hostile" (not really), etc.). I just wanted to see if anyone here could actually look past that, specifically EY, and post some honest answers to the questions (real answers, again from EY, not pawns on LW). Obviously this was too much to ask, since the general responses, not completely but for the most part, were cop-outs. I am well aware that EY would probably never answer any challenge to what he thinks; people like EY typically won't (I have dealt with many people like EY). I think the responses here speak volumes about LW and the people who post here (if you can't look past the way content is posted, you are going to have a hard time in life, since not everyone is going to meet your standards for how they speak or write). You may not be trying to form a cult, but the way you respond to a post like this screams cultish, with even some circle-jerk mentality mixed in.
Post
I would like to float an argument and a series of questions. Before you vote me down, please do me the courtesy of reading the post. I am also aware that some, and maybe even many, of you think I am a troll just out to bash SIAI and Eliezer; that is in fact not my intent. This group is supposed to be about improving rationality, so let's improve our rationality.
SIAI has the goal of raising awareness of the dangers of AI as well as creating its own FAI solution to the problem. This task has fallen to Eliezer as the paid researcher working on FAI. What I would like to point out is a disconnect between what SIAI is supposed to be doing and what EY is doing.
According to EY, FAI is an extremely important problem with global implications that must be solved. It is both a hard math problem and a problem that needs to be solved first by people who take FAI seriously. To that end, SIAI was started with EY as its AI researcher.
Until about 2006, EY was working on papers like CEV and on designs for FAI, which he has now largely discarded as wrong. He then spent a long period blogging on Overcoming Bias and Less Wrong, and his stated main focus is now a book on rationality. If this is accurate, I would ask: how does this make sense coming from someone who has made such a big deal about FAI, its importance, and being the first to make AI and ensure it is FAI? If FAI is so important, where does a book on rationality fit? Does it even serve SIAI's chief goals? SIAI spends huge amounts of time talking about the risks and rewards of FAI, yet the person who is supposed to be making the FAI is writing a book on rationality instead of solving FAI. How does this square with being paid to research FAI? How can one justify EY's reasons for not publishing the math of TDT, coming from someone committed to FAI? If one is committed to solving that hard a problem, I would think that publishing one's ideas on it would be a primary goal, to advance the cause of FAI.
If this doesn't make sense, I would ask: how rational is it to spend time helping SIAI if they are not focused on FAI? Can one justify giving to an organization whose chief FAI researcher is distracted by writing a book on rationality instead of solving the myriad hard math problems that FAI requires? And if it somehow does make sense, can one then conclude that FAI is not nearly as important as it has been made out to be, since the champion of FAI feels comfortable taking a break from the problem to write a book on rationality (in other words, the world really isn't at stake)?
Am I off base? If this group is devoted to rationality, then everyone should be subject to rational analysis.
You @#$@! I want those hours of my life back. Although I must admit Batman and his Crazy Prepared tendencies are kinda cool, and I do love late-series Wesley: seriously Badass. I was also tempted to create an account and add Althalus and Talen to the Divine Date page.
What is it about that site that is such a trap? We must anticipate a serious social payoff for engaging with stories and identifying the abstractions they represent.
I had been reading for half an hour before I even noticed I was on a TV Tropes page, and the warning sprang to mind.
Anyone want to write a LW post discussing How To Be Less Unreasonably Attracted To TV Tropes And Other Such Sites? It's certainly a matter of rationality-considered-as-winning, since I've been losing somewhat as a result of wasting time on TV Tropes / Wikipedia / Everything2 / reading the archives of new-to-me webcomics.