XiXiDu comments on Transparency and Accountability - Less Wrong

Post author: multifoliaterose 21 August 2010 01:01PM




Comment author: Airedale 21 August 2010 04:32:09PM 28 points [-]

Your posts on SIAI have had a veneer of evenhandedness and fairness, and that continues here. But given what you don’t say in your posts, I cannot avoid the impression that you started out with the belief that SIAI was not a credible charity and rather than investigating the evidence both for and against that belief, you have marshaled the strongest arguments against donating to SIAI and ignored any evidence in favor of donating to SIAI. I almost hesitate to link to EY lest you dismiss me as one of his acolytes, but see, for example, A Rational Argument.

In your top-level posts you have eschewed references to any of the publicly visible work that SIAI does such as the Summit and the presentation and publication of academic papers. Some of this work is described at this link to SIAI’s description of its 2009 achievements. The 2010 Summit is described here. As for Eliezer’s current project, at the 2009 achievements link, SIAI has publicized the fact that he is working on a book on rationality:

Yudkowsky is now converting his blog sequences into the planned rationality book, which he hopes will significantly assist in attracting and inspiring talented individuals to effectively work towards the aims of a beneficial Singularity and reduced existential risk.

You could have chosen to make part of your evaluation of SIAI an analysis of whether or not EY’s book will ultimately be successful in this goal or whether it’s the most valuable work that EY should be doing to reduce existential risk, but I’m not sure how his work on transforming the fully public LW sequences into a book is insufficiently transparent or not something for which he and SIAI can be held accountable when it is published.

Moreover, despite your professed interest in existential risk reduction and references in others’ comments to your posts about the Future of Humanity Institute at Oxford, you suggest donating to Givewell-endorsed charities as an alternative to SIAI donations without even a mention of FHI as a possible alternative in the field of existential risk reduction. Perhaps you find FHI equally non-credible/non-accountable as a charity, but whatever FHI’s failings, it’s hard to see how they are exactly the same ones which you have ascribed to SIAI. Perhaps you believe that if a charity has not been evaluated and endorsed by Givewell, it can’t possibly be worthwhile. I can’t avoid the thought that if you were really interested in existential risk reduction, you would spend at least some tiny percentage of the time you’ve spent writing up these posts against SIAI on investigating FHI as an alternative.

I would be happy to engage with you or others on the site in a fair and unbiased examination of the case for and against SIAI (and/or FHI, the Foresight Institute, the Lifeboat Foundation, etc.). Although I may come across as strongly biased in favor of SIAI in this comment, I have my own concerns about SIAI’s accountability and public relations, and have had numerous conversations with those within the organization about those concerns. But with limited time on my hands and faced with such a one-sided and at times even polemical presentation from you, I find myself almost forced into the role of SIAI defender, so that I can at least provide some of the positive information about SIAI that you leave out.

Comment author: XiXiDu 21 August 2010 05:47:46PM 4 points [-]

I cannot avoid the impression that you started out with the belief that SIAI was not a credible charity and rather than investigating the evidence both for and against that belief, you have marshaled the strongest arguments against donating to SIAI and ignored any evidence in favor of donating to SIAI.

"If you’re interested in being on the right side of disputes, you will refute your opponents’ arguments. But if you’re interested in producing truth, you will fix your opponents’ arguments for them. To win, you must fight not only the creature you encounter; you must fight the most horrible thing that can be constructed from its corpse." -- Black Belt Bayesian

If multifoliaterose took the position of an advocatus diaboli, what would be wrong with that?

Comment author: Airedale 21 August 2010 07:06:28PM 8 points [-]

Although I always love a good quote from Black Belt Bayesian (a/k/a steven0461 a/k/a my husband), I think he’s on board with my interpretation of multifoliaterose’s posts. (At least, he’d better be!)

Going on to the substance, it doesn’t seem that multifoliaterose is just playing devil’s advocate here rather than arguing his actual beliefs – indeed everything he’s written suggests that he’s doing the latter. Beyond that, there may be a place for devil’s advocacy (so long as it doesn’t cross the line into mere trolling, which multifoliaterose’s posts certainly do not) at LW. But I think that most aspiring rationalists (myself included) should still try to evaluate evidence for and against some position, and only tread into devil’s advocacy with extreme caution, since it is a form of argument where it is all too easy to lose sight of the ultimate goal of weighing the available evidence accurately.

Comment author: XiXiDu 21 August 2010 07:47:34PM 4 points [-]

Although I always love a good quote from Black Belt Bayesian (a/k/a steven0461 a/k/a my husband)

Wow, I managed to walk into the lion's den there!

Going on to the substance, it doesn’t seem that multifoliaterose is just playing devil’s advocate here...

Yeah, I wasn't actually thinking that to be the case either. But since nobody else seems to be following your husband's advice...at least someone is trying to argue against the SIAI. Good criticism can be a good thing.

...and only tread into devil’s advocacy with extreme caution...

I see, I'll take your word for it. I haven't thought about it too much. So far I had thought your husband's quote was universally applicable.

Comment author: wedrifid 21 August 2010 10:08:49PM *  1 point [-]

If multifoliaterose took the position of an advocatus diaboli, what would be wrong with that?

Multi has already refuted the opponent's arguments; well, at least Eliezer more or less refuted them for him. Now it is time to do just what Black Belt Bayesian suggested and try to fix the SIAI's arguments for them. Because advocacy, including devil's advocacy, is mostly bullshit.

Remind SIAI of what they are clearly doing right, and also of what a good presentation of their strengths would look like. Who knows, maybe it will spur them on and achieve in some measure just the kind of changes you desire!

Comment author: XiXiDu 22 August 2010 12:05:11PM *  4 points [-]

Interesting! Levels of epistemic accuracy:

So while telling the truth is maximally accurate relative to your epistemic state, concealment is deception by misguidance, which is worse than the purest form of deception, namely lying (falsehood). Bullshit, however, is not even wrong.

I don't see how devil's advocacy fits into this, as I perceive it to be a temporary adjustment of someone's mental angle in order to look back at one's own position from a different point of view.