juliawise comments on Article about LW: Faith, Hope, and Singularity: Entering the Matrix with New York’s Futurist Set - Less Wrong

Post author: malo 25 July 2012 07:28PM


Comment author: juliawise 27 July 2012 06:11:11PM 13 points

When you decide to fund research, what are your requirements for researchers' personal lives? Is the problem that his sex life is unusual, or that he talks about it?

Comment author: Bugmaster 27 July 2012 10:20:33PM 18 points

My feelings on the topic are similar to iceman's, though possibly for slightly different reasons.

What bothers me is not the fact that Eliezer's sex life is "unusual", or that he talks about it, but that he talks about it in his capacity as the chief figurehead and PR representative of his organization. This signals a certain lack of focus, stemming from an inability to separate one's personal and professional life.

Unless the precise number and configuration of Eliezer's significant others is directly applicable to AI risk reduction, there's simply no need to discuss it in his official capacity. It's unprofessional and distracting.

(In the interest of full disclosure, I should mention that I am not planning on donating to SIAI any time soon, so my points above are more or less academic.)

Comment author: iceman 27 July 2012 09:19:12PM 21 points

My biggest problem is more that he talks about it, sometimes in semi-official channels. This doesn't mean that I wouldn't be squicked out if I learned about it, but I wouldn't see it as a political problem for the SIAI.

The SIAI isn't some random research think tank: it presents itself as the charity with the highest utility per marginal dollar. Likewise, Eliezer Yudkowsky isn't some random anonymous researcher: he is the public face of the SIAI. His actions and public behavior reflect on the SIAI whether or not that's fair, and everyone involved should already have held that as a strong prior.

If people ignore Less Wrong or don't donate to the SIAI because they're filtered out by squickish feelings, then that means fewer resources for the SIAI's mission in return for inconsequential short-term gains realized mostly by SIAI insiders. Compounding this, talking about the singularity already triggers some people's absurdity bias; there need to be as few other filters as possible to maximize the usable resources the SIAI has for maximizing the chance of a positive singularity outcome.

Comment author: juliawise 27 July 2012 09:48:21PM 1 point

It seems there are two problems: you trust SIAI less, and you worry that others will trust it less. I understand the reason for the second worry, but not the first. Is it that you worry your investment will become worth less because others won't want to fund SIAI?

Comment author: private_messaging 28 July 2012 07:06:15PM * 9 points

That talk was very strong evidence that the SI is incompetent at PR and, furthermore, irrational. Edit: or that it doesn't actually hold its stated goals and beliefs. If you believe the donations are important for saving your life (along with everyone else's), then you naturally try to avoid making such statements. Though I do, in some way, admire straight-up, in-your-face honesty.