SarahC comments on Existential Risk and Public Relations - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (613)
I am a relative newbie commenter here, and my interest in this site has so far been limited to using it as a fun forum where it's possible to discuss all kinds of sundry topics with exceptionally smart people. However, I have read a large part of the background sequences, and I'm familiar with the main issues of concern here, so even though it might sound impertinent coming from someone without any status in this community, I can't resist commenting on this article.
To put it bluntly, I think the main point of the article is, if anything, an understatement. Let me speak from personal experience. From the perspective of this community, I am the sort of person who should be exceptionally easy to get interested and won over to its cause, considering both my intellectual background and my extreme openness to contrarian viewpoints and skepticism towards official academic respectability as a criterion of truth and intellectual soundness. Yet, to be honest, even though I find a lot of the writing and discussion here extremely interesting, and the writings of Yudkowsky (in addition to others such as Bostrom, Hanson, etc.) have convinced me that technology-related existential risks should be taken much more seriously than they presently are, I still keep encountering things in this community that set off various red flags, which are undoubtedly taken by many people as a sign of weirdness and crackpottery, and thus alienate huge numbers of potential quality audience.
Probably the worst such example I've seen was the recent disturbance in which Roko was subjected to abuse that made him leave. When I read the subsequent discussions, it surprised me that virtually nobody here appears to be aware what an extreme PR disaster it was. Honestly, for someone unfamiliar with this website who has read about that episode, it would be irrational not to conclude that there's some loony cult thing going on here, unless he's also presented with enormous amounts of evidence to the contrary in the form of a selection of the best stuff that this site has to offer. After these events, I myself wondered whether I want to be associated with an outlet where such things happen, even just as an occasional commenter. (And not to even mention that Roko's departure is an enormous PR loss in its own right, in that he was one of the few people here who know how to write in a way that's interesting and appealing to people who aren't hard-core insiders.)
Even besides this major PR fail, I see many statements and arguments here that may be true, or at least not outright unreasonable, but should definitely be worded more cautiously and diplomatically if they're given openly for the whole world to see. I'm not going to get into details of concrete examples -- in particular, I do not concur unconditionally with any of the specific complaints from the above article -- but I really can't help but conclude that lots of people here, including some of the most prominent individuals, seem oblivious as to how broader audiences, even all kinds of very smart, knowledgeable, and open-minded people, will perceive what they write and say. If you want to have a closed inner circle where specific background knowledge and attitudes can be presumed, that's fine -- but if you set up a large website attracting lots of visitors and participants to propagate your ideas, you have to follow sound PR principles, or otherwise its effect may well end up being counter-productive.
Agreed.
One good sign here is that LW, unlike most other non-mainstream organizations, doesn't really function like a cult. Once one person starts being critical, critics start coming out of the woodwork. I have my doubts about this place sometimes too, but it has a high density of knowledgeable and open-minded people, and I think it has a better chance than anyone of actually acknowledging and benefiting from criticism.
I've tended to overlook the weirder stuff around here, like the Roko feud -- it got filed under "That's confusing and doesn't make sense" rather than "That's an outrage." But maybe it would be more constructive to change that attitude.
Singularitarianism, transhumanism, cryonics, etc. probably qualify as cults under at least some of the meanings of the term: http://en.wikipedia.org/wiki/Cult Cults do not necessarily lack critics.
The Wikipedia page on cult checklists includes seven independent sets of criteria for cult classification, provided by anti-cult activists who have strong incentives to cast as wide a net as possible. Singularitarianism, transhumanism, and cryonics fit none of those lists. In most cases, it isn't even close.
I disagree with your assessment. Let's just look at LW for starters.
Eileen Barker:
Based on that, I think Eileen Barker's list would have us believe LW is a likely cult.
Shirley Harrison:
Based on that, I think Shirley Harrison's list would have us believe LW is a likely cult.
Similar analysis using the other lists is left as an exercise for the reader.
That was... surprisingly surprising. Thank you.
For reasons like those you listed, and also out of some unverbalized frustration, in the last week I've been thinking pretty seriously whether I should leave LW and start hanging out somewhere else online. I'm not really interested in the Singularity, existential risks, cognitive biases, cryonics, un/Friendly AI, quantum physics or even decision theory. But I do like the quality of discussions here sometimes, and the mathematical interests of LW overlap a little with mine: people around here enjoy game theory and computability theory, though sadly not nearly as much as I do.
What other places on the Net are there for someone like me? Hacker News and Reddit look like dumbed-down versions of LW, so let's not talk about those. I solved a good bit of Project Euler once; the place is tremendously enjoyable but quite narrowly focused. The n-Category Cafe is, sadly, coming to a halt. Math Overflow looks wonderful, and this question by Scott Aaronson nearly convinced me to drop everything and move there permanently. The Polymath blog is another fascinating place, one so far above LW that I feel completely underqualified to join. Unfortunately, none of these are really conducive to posting new results, and moving into academia IRL is not something I'd like to do (I've been there, thanks).
Any other links? Any advice? And please, please, nobody take this comment as a denigration of LW or a foot-stomping threat. I love you all.
My new blog "Azimuth" may not be mathy enough for you, but if you like the n-Category Cafe, it's possible you may like this one too. It's more focused on technology, environmental issues, and the future. Someday soon you'll see an interview with Eliezer! And at some point we'll probably get into decision theory as applied to real-world problems. We haven't yet.
(I don't think the n-Category Cafe is "coming to a halt", just slowing down - my change in interests means I'm posting a lot less there, and Urs Schreiber is spending most of his time developing the nLab.)
Wow.
Hello.
I didn't expect that. It feels like summoning Gauss, or something.
Thank you a lot for twf!
Link to John Baez's blog
The markup syntax here is a bit unusual and annoying - click the "Help" button at the bottom right of the edit window to get guidance on how to include hyperlinks. Unlike every other hyperlinking system, the text goes first and the URL second!
Make a top level post about the kind of thing you want to talk about. It doesn't have to be an essay, it could just be a question ("Ask Less Wrong") or a suggested topic of conversation.
I love your posts, so having seen this comment I'm going to try to write up my nascent sequence on memetic colds, aka sucker shoots, just for you. (And everyone.)
Thanks!
Same for me. My interests are more similar to your interests than to classic LW themes. There are probably many others here in the same situation. But I hope that the list of classic LW themes is not set in stone. I think people like us should try to broaden the spectrum of LW. If this attempt fails, please send me the address of the new place where you hang out online. :) But I am optimistic.
"Leaving" LW is rather strong. Would that mean not posting? Not reading the posts, or the comments? Or just reading at a low enough frequency that you decouple your sense of identity from LW?
I've been trying to decide how best to pump new life into The Octagon section of the webcomic collective forum Koala Wallop. The Octagon started off when Dresden Codak was there, and became the place for intellectual discussion and debate. The density of math and computer-theory enthusiasts is an order of magnitude lower than here or the other places you mentioned, and those who know such stuff well are LW lurkers or posters too. There was an overkill of politics on The Octagon, the levels of expertise on subjects are all over the spectrum, and it's been slowing down for a while, but I think a good push will revive it. The main thing is that it lives inside a larger forum, which is a silly, fun sort of community. The subforum simply has a life of its own.
Not that I claim any ownership over it, but:
I'm going to try to more clearly brand it as "A friendly place to analytically discuss fantastic, strange or bizarre ideas."
Of course, MathOverflow isn't really a place for discussion...
At least as far as math is concerned, people not in academia can publish papers. As for the Polymath blog, I'd actually estimate that you are at about the level of most Polymath contributors, although most of the impressive work there seems to be done by a small fraction of the people there.
About Polymath: thanks! (blushes)
I have no fetish for publishing papers or having an impressive CV or whatever. The important things, for me, are these: I want to have meaningful discussions about my areas of interest, and I want my results to be useful to somebody. I have received more than a fair share of "thank yous" here on LW for clearing up mathy stuff, but it feels like I could be more useful... somewhere.
I found this amusing because by those standards, cults are everywhere. For example, I run a professional Magic: The Gathering team and am pretty sure I'm not a cult leader. Although that does sound kind of neat. Observe:
Eileen Barker: 1. When events are close we spend a lot of time socially separate from others so as to develop and protect our research. On occasion 'Magic colonies' form for a few weeks. It's not substantially less isolating than what SIAI does. Check. 2. I have imparted huge amounts of belief about a large subset of our world, albeit a smaller one than Eliezer is working on. Partial check. 3. I make reasonably important decisions for my teammates -- on the level of the cryonics decision, if cryonics isn't worthwhile -- and do what I need to do to make sure they follow them far more than they would without me. Check. 4. We identify other teams as 'them' reasonably often, and certain other groups are certainly viewed as the enemy. Check. 5. Nope, even fainter argument than Eliezer. 6. Again, yes, obviously.
Shirley Harrison: 1. I claim a special mission that I am uniquely qualified to fulfill. Not as important a one, but still. Check. 2. My writings count at least as much as the sequences. Check. 3. Not intentionally, but often new recruits have little idea what to expect. Check plus. 4. Totalitarian rules structure, and those who game too much often alienate friends and family. I've seen it many times, and it's far less of a stretch than saying you'll be alienated from them when they are all dead and you're not because you got frozen. Check. 5. I make people believe what I want with the exact same techniques we use here. If anything, I'm willing to use slightly darker arts. Check. 6. We make the lower-level people do the grunt work, sure. Check. 7. Based on some of the deals I've made, one looking to demonize could make a weak claim. Check plus. 8. Exclusivity. In spades. Check.
I'd also note that the exercise left to the reader is much harder, because the other checklists are far harder to fudge.
On Eileen Barker:
I believe that most LW posters are not signed up for cryonics (myself included), and there is substantial disagreement about whether it's a good idea. And that disagreement has been well received by the "cult", judging by the karma scores involved.
Theism has been discussed. It is wrong. But Robert Aumann's work is still considered very important; theists are hardly dismissed as "satanic," to use Barker's word.
Of Barker's criteria, 2-4 of 6 apply to the LessWrong community, and only one ("Leaders and movements who are unequivocally focused on achieving a certain goal") applies strongly.
On Shirley Harrison:
I can't speak for Eliezer, but I suspect that if there were a person who was obviously more qualified than him to tackle some aspect of FAI, he would acknowledge it and welcome their contributions.
No. The sequences are not infallible, they have never been claimed as such, and intelligent disagreement is generally well received.
What you describe is a preposterous exaggeration, not "[t]otalitarianism and alienation of members from their families and/or friends."
Any person who promotes a charity at which they work is pushing a cult, by this interpretation. Eliezer isn't "lining his own pockets"; if someone digs up the numbers, I'll donate $50 to a charity of your choice if it turns out that SIAI pays him a salary disproportionately greater (2 sigmas?) than the average for researchers at comparable non-profits.
So that's 2-6 of Harrison's checklist items for LessWrong, none of them particularly strong.
My filters would drop LessWrong in the "probably not a cult" category, based off of those two standards.
Eliezer was compensated $88,610 in 2008 according to the Form 990 filed with the IRS and which I downloaded from GuideStar.
Wikipedia tells me that the median 2009 income in Redwood where Eliezer lives is $69,000.
(If you are curious, Tyler Emerson in Sunnyvale (median income 88.2k) makes 60k; Susan Fonseca-Klein, also in Redwood, was paid 37k. Total employee expenses are 200k, but the three salaries are 185k; I don't know what accounts for the difference. The form doesn't seem to say.)
In particular, there seems to be a lot of disagreement about the metaethics sequence, and to a lesser extent about timeless physics.
What exactly are Eliezer's qualifications supposed to be?
You mean, "What are Eliezer's qualifications?" Phrasing it that way makes it sound like a rhetorical attack rather than a question.
To answer the question itself: lots of time spent thinking and writing about it, and some influential publications on the subject.
I'm definitely not trying to attack anyone (and you're right that my comment could be read that way). But I'm also not just curious. I figured this was the answer. Lots of time spent thinking, writing, and producing influential publications on FAI is about all the qualification one can reasonably expect (producing a provable mathematical formalization of friendliness is the kind of thing no one is qualified to do before they do it, and the AI field in general is relatively new and small). And Eliezer is obviously a really smart guy. He's probably even the most likely person to solve it. But the effort to address the friendliness issue seems way too focused on him and the people around him. You shouldn't expect any one person to solve a Hard problem. Insight isn't that predictable, especially when no one in the field has solved comparable problems before. Maybe Einstein was the best bet to formulate a unified field theory, but a) he never did and b) he had actually had comparable insights in the past. Part of the focus on Eliezer is just an institutional and financial thing, but he and a lot of people here seem to encourage this state of affairs.
No one looks at open problems in other fields this way.
Yes, the situation isn't normal or good. But this isn't a balanced comparison, since we don't currently have a field; too few people understand the problem and have seriously thought about it. This is gradually changing, and I expect it will be visibly less of a problem in another 10 years.
I haven't seen any proof of his math skills that would justify this statement. By what evidence have you arrived at the conclusion that he can do it at all, or even approach it? The sequences and the SIAI publications certainly show that he was able to compile a bunch of existing ideas into a coherent framework of rationality, yet there is not much novelty to be found anywhere.
Great comment.
How influential are his publications if they could not convince Ben Goertzel (SIAI/AGI researcher), someone who has read Yudkowsky's publications and all of the LW sequences? You could argue that he and other people don't have the smarts to grasp Yudkowsky's arguments, but then who does? Either Yudkowsky is so smart that some academics are unable to appreciate his work, or there is another problem. How are we, who are far below his level, supposed to evaluate whether we should believe what Yudkowsky says, if we are neither smart enough to do so nor able to subject his work to empirical criticism?
The problem here is that telling someone that Yudkowsky spent a lot of time thinking and writing about something is not a qualification. Furthermore, it does not guarantee that he would acknowledge and welcome the contributions of others who disagree.
The motivated cognition here is pretty thick. Writing is influential when many people are influenced by it. It doesn't have to be free of people who disagree with it to be influential, and it doesn't even have to be correct.
Level up first. I can't evaluate physics research, so I just accept that I can't tell which of it is correct; I don't try to figure it out from the politics of physicists arguing with each other, because that doesn't work.
Ben Goertzel believes in psychic phenomena (see here for details), so his failure to be convinced by Eliezer is not strong evidence against the correctness of Eliezer's stance.
For what it's worth, Eliezer has been influential/persuasive enough to get the SIAI created and funded despite having absolutely no academic qualifications. He's also responsible for coining "Seed AI".
I have to disagree that this "smugness" even remotely reaches the level that is characteristic of a cult.
As someone who has frequently expressed disagreement with the "doctrine" here, I have occasionally encountered both reactions that you mention. But those sporadic reactions are not much of a barrier to criticism -- any critic who persists here will eventually be engaged intelligently and respectfully, assuming that the critic brings a modicum of respect and intelligence on his own part. Furthermore, if the critic really engages with what his interlocutors here are saying, he will receive enough upvotes to more than repair the initial damage to his karma.
Yes. LessWrong is not in fact hidebound by groupthink. I have lots of disagreement with the standard LessWrong belief cluster, but I get upvotes if I bother to write well, explain my objections clearly and show with my reference links that I have some understanding of what I'm objecting to. So the moderation system - "vote up things you want more of" - works really well, and I like the comments here.
This has also helped me control my unfortunate case of asshole personality disorder whenever I see someone being wrong on the Internet. It's amazing what you can get away with if you show your references.
This would be easier to parse if you quoted each individual criterion you are evaluating right before the evaluation, e.g.:
I've not seen this happening -- examples?
I think it would be more accurate to say that anyone who after reading the sequences still disagrees, but is unable to explain where they believe the sequences have gone wrong, is not worth arguing with.
With this qualification, it no longer seems like evidence of being a cult.
That's the pejorative usage. There is also:
"Cult also commonly refers to highly devoted groups, as in: Cult, a cohesive group of people devoted to beliefs or practices that the surrounding culture or society considers to be outside the mainstream."
http://en.wikipedia.org/wiki/Cults_of_personality
http://en.wikipedia.org/wiki/Cult_following
http://en.wikipedia.org/wiki/Cult_%28religious_practice%29