
John_Maxwell_IV comments on Downvote stalkers: Driving members away from the LessWrong community? - Less Wrong Discussion

Post author: Ander, 02 July 2014 12:40AM (39 points)




Comment author: John_Maxwell_IV 02 July 2014 12:56:37AM 10 points

I have been told by multiple other people at LessWrong meetings things such as “I used to post a lot on LessWrong, but then I posted X, and got mass downvoted, so now I only comment on Yvain’s blog”.

That's interesting, and is causing me to update in the direction of thinking that this is a real problem that resources should be devoted to solving. I think I know of one other person besides you who has left LW because of downvoting. It's interesting how seriously we take the arbitrary numbers associated with our profiles & contributions. (I do it too.)

And it looks as though many people have reported similar experiences in this thread. Maybe talk to Kaj Sotala? Perhaps he is privately reprimanding mass downvoters?

I do think this comment of yours was a reasonable downvote candidate:

Then why do I see reddit links to NOAA articles, every single month, with titles like: "May 2014 the hottest May since 1880. Four of the five warmest Mays on record have occurred in the past five years. May 2014 marked the 39th consecutive May and 351st consecutive month (more than 29 years) with a global temperature above the 20th century average."

Not because I think you are wrong about global warming, but because frequency of newspaper headlines seems like a bad way to infer statistical trends. Newspapers report on what's interesting, what their readers will read, what's unusual, etc. So news stories are not all that representative of what's actually going on in the world.

Comment author: shminux 02 July 2014 03:01:33AM 8 points

is causing me to update in the direction of thinking that this is a real problem that resources should be devoted to solving

I don't believe that it's more than a day or two of work for a developer. The SQL queries one would run are pretty simple, as we previously discussed and as Jack from Trike confirmed. The reason that nothing has been done about it is that Eliezer doesn't care. And he may well have good reasons not to, but he has never commented on the issue, except maybe once, when he mentioned something about not having the technical capability to identify the culprits (a statement that is no longer valid).
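For illustration, a query of the kind shminux describes really is short. Here is a minimal sketch in Python with an in-memory SQLite database; the `votes` table and its columns are hypothetical stand-ins (the real LessWrong schema, inherited from Reddit's codebase, names things differently), but the shape of the query would be the same: count downvotes per (voter, author) pair and flag the outliers.

```python
import sqlite3

# Hypothetical schema -- a stand-in for the real vote table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE votes (voter TEXT, comment_author TEXT, direction INTEGER);
INSERT INTO votes VALUES
  ('stalker', 'ander', -1), ('stalker', 'ander', -1),
  ('stalker', 'ander', -1), ('stalker', 'ander', -1),
  ('reader1', 'ander', -1),
  ('reader2', 'ander',  1);
""")

# Flag voters who downvoted one author's comments many times;
# an ordinary disagreeing reader falls below the threshold.
query = """
SELECT voter, COUNT(*) AS downvotes
FROM votes
WHERE comment_author = ? AND direction = -1
GROUP BY voter
HAVING downvotes >= 3
ORDER BY downvotes DESC;
"""
suspects = conn.execute(query, ("ander",)).fetchall()
print(suspects)  # [('stalker', 4)]
```

A real implementation would also want a time window and a ratio against the voter's overall voting pattern, but the core is a single GROUP BY.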

My guess is that he cares not nearly as much about LW in general now as he used to, as most of the real work is done at MIRI behind the scenes, and this forum is mostly noise for him these days. He drops by occasionally as a distraction from important stuff, but that's it.

Comment author: John_Maxwell_IV 02 July 2014 06:34:20AM 10 points

The reason that nothing has been done about it is that Eliezer doesn't care.

This sounds like moralizing to me. Of the following two scenarios, which do you have in mind?

  • Someone had an idea for a solution to the problem and ran it by Eliezer. Eliezer vetoed it (because he was feeling spiteful?).

  • Eliezer is a busy person trying to do lots of things. Because Eliezer has historically been LW's head honcho, no one feels comfortable taking decisive action without his approval. But folks are hesitant to bother him because they know he's got lots to do, or else they do send him email but he doesn't respond to them because he's behind on his email, or he skips reading those emails in favor of higher-priority emails.

I think the second scenario is far more likely. If the second scenario is the case, I don't see any reason to bother Eliezer. We just have to stop acting as though all important forum decisions must go through him. Personally I don't see any reason why Eliezer would know best how to run LW. Expertise at blogging is not the same as expertise at online community management. And empirically, there have been lots of complaints about the way LW is moderated, which is evidence that Eliezer is bad at it (I know there are other moderators, but I'm assuming he sets the tone and has the final word). My guess is that to the extent he's given deference, it's due to his high status or some kind of halo effect. (Speaking of which, the halo effect seems like a bias that LWers fall prey to really often regarding high-status LW figures like Eliezer, Lukeprog, and Matt Fallshaw. But I digress.)

I don't know if this one particular issue is worth a revolt. But if we can brainstorm enough issues that would benefit from an overhaul of the moderation/LW leadership team, perhaps it would be worthwhile to start another thread devoted to that topic.

Comment author: ChristianKl 02 July 2014 09:13:19AM 10 points

Eliezer is a busy person trying to do lots of things. Because Eliezer has historically been LW's head honcho, no one feels comfortable taking decisive action without his approval.

If you are a busy person wanting to get a lot of things done, you delegate tasks and give someone else the authority to solve them. To the extent that he doesn't want to solve tasks like this himself, he should clearly delegate the authority to someone else.

Comment author: shminux 02 July 2014 07:04:23AM 8 points

Of course it's the second scenario. My point is that this forum has dropped in priority for Eliezer and MIRI in general in the last year or so. And, as I said, probably for a good reason.

Comment author: ChrisHallquist 02 July 2014 05:31:11AM 7 points

The reason that nothing has been done about it is that Eliezer doesn't care. And he may well have good reasons not to, but he never commented on the issue, except maybe once when he mentioned something about not having technical capabilities to identify the culprits (which is no longer a valid statement).

My guess is that he cares not nearly as much about LW in general now as he used to...

This. Eliezer clearly doesn't care about LessWrong anymore, to the point that these days he seems to post more on Facebook than on LessWrong. Realizing this is a major reason why this comment is the first thing I've posted on LessWrong in well over a month.

I know a number of people have been working on launching a LessWrong-like forum dedicated to Effective Altruism, which is supposedly going to launch very soon. Here's hoping it takes off—because honestly, I don't have much hope for LessWrong at this point.

Comment author: XiXiDu 02 July 2014 09:41:08AM 5 points

Eliezer clearly doesn't care about LessWrong anymore, to the point that these days he seems to post more on Facebook than on LessWrong.

He receives a massive number of likes there, no matter what he writes. My guess is that he needs that kind of feedback, and he doesn't get it here anymore. Recently he requested that a certain topic not be mentioned on the HPMOR subreddit, or else he would go elsewhere. On Facebook he can easily ban people who mention something he doesn't like.

Comment author: [deleted] 02 July 2014 04:22:10PM 1 point

Given that you directly caused a fair portion of the thing that is causing him pain (i.e., spreading FUD about him, his orgs, etc.), this is like a win for you, right?

Why don't you leave armchair Internet psychoanalysis to experts?

Comment author: ChrisHallquist 03 July 2014 04:13:33AM 8 points

I'm not sure how to respond to this comment, given that it contains no actual statements, just rhetorical questions, but the intended message seems to be "F you for daring to cause Eliezer pain, by criticizing him and the organization he founded."

If that's the intended message, I submit that when someone is a public figure, who writes and speaks about controversial subjects and is the founder of an org that's fairly aggressive about asking people for money, they really shouldn't be insulated from criticism on the basis of their feelings.

Comment author: [deleted] 03 July 2014 12:01:09PM -2 points

I'm not sure how to respond to this comment

You could have simply not responded.

If that's the intended message

It wasn't, no. It was a reminder to everyone else of XiXi's general MO, and the benefit he gets from convincing others that EY is a megalomaniac, using any means necessary.

Comment author: XiXiDu 03 July 2014 12:51:35PM 2 points

It wasn't, no. It was a reminder to everyone else of XiXi's general MO...

Circa 2005 I had a link to MIRI (then called the Singularity Institute) on my homepage. Circa 2009 I was even advertising LessWrong.

I am on record as saying that I believe most of the sequences to consist of true and sane material. I am on record as saying that I believe LessWrong to be the most rational community.

But in 2010, due to an incident that may not be mentioned here, I noticed that there are some extreme tendencies and beliefs that might easily outweigh all the positive qualities. I also noticed that a certain subset of people seems to have a very weird attitude toward criticism pertaining to Yudkowsky, MIRI, or LW.

I've posted a lot of arguments that were never meant to decisively refute Yudkowsky or MIRI, but to show that many of the extraordinary claims can be weakened. The important point here is that I did not even have to do this, as the burden of evidence is not on me to disprove those claims, but on the people who make them. They need to show that their claims are robust and not just speculations about possible bad outcomes.

Comment author: David_Gerard 03 July 2014 12:28:48PM 2 points

It wasn't, no. It was a reminder to everyone else of XiXi's general MO, and the benefit he gets from convincing others that EY is a megalomaniac, using any means necessary.

You keep saying this and things like it, and not providing any evidence whatsoever when asked, directly or indirectly.

Comment author: XiXiDu 03 July 2014 11:58:33AM 3 points

Given that you directly caused a fair portion of the thing that is causing him pain (i.e., spreading FUD about him, his orgs, etc.), this is like a win for you, right?

A win would be if certain people became a little less confident about the extraordinary claims he makes, and more skeptical of the mindset that CFAR spreads.

A win would be if he became more focused on exploration rather than exploitation, on increasing the robustness of his claims, rather than on taking actions in accordance with his claims.

A world in which I don't criticize MIRI is a world where they ask for money in order to research whether artificial intelligence is an existential risk, rather than asking for money to research a specific solution in order to save an intergalactic civilization.

A world in which I don't criticize Yudkowsky is a world in which he does not make claims such as that if you don’t sign up your kids for cryonics then you are a lousy parent.

A world in which I don't criticize CFAR/LW is a world in which they teach people to be extremely skeptical of back-of-the-envelope calculations, a world in which they tell people to strongly discount claims that cannot be readily tested.

Why don't you leave armchair Internet psychoanalysis to experts?

I speculate that Yudkowsky has narcissistic tendencies. Call it armchair psychoanalysis if you like, but I think there is enough evidence to warrant such speculations.

Comment author: Squark 03 July 2014 02:03:48PM -1 points

I speculate that Yudkowsky has narcissistic tendencies. Call it armchair psychoanalysis if you like, but I think there is enough evidence to warrant such speculations.

I call it an ignoble personal attack which has no place on this forum.

Comment author: XiXiDu 03 July 2014 03:28:52PM 4 points

I call it an ignoble personal attack which has no place on this forum.

Sorry. It wasn't meant as an attack, just something that came to my mind reading the comment by Chris Hallquist.

My initial reply was based on the following comment by Yudkowsky:

I'm really impressed by Facebook's lovely user experience - when I get a troll comment I just click the x, block the user and it's gone without a trace and never recurs.

And regarding narcissism, the definition is: "an inflated sense of one's own importance and a deep need for admiration."

See e.g. this conversation between Ben Goertzel and Eliezer Yudkowsky (note that MIRI was formerly known as SIAI):

Striving toward total rationality and total altruism comes easily to me. […] I’ll try not to be an arrogant bastard, but I’m definitely arrogant. I’m incredibly brilliant and yes, I’m proud of it, and what’s more, I enjoy showing off and bragging about it. I don’t know if that’s who I aspire to be, but it’s surely who I am. I don’t demand that everyone acknowledge my incredible brilliance, but I’m not going to cut against the grain of my nature, either. The next time someone incredulously asks, “You think you’re so smart, huh?” I’m going to answer, “Hell yes, and I am pursuing a task appropriate to my talents.” If anyone thinks that a Friendly AI can be created by a moderately bright researcher, they have rocks in their head. This is a job for what I can only call Eliezer-class intelligence.

Also see e.g. this comment by Yudkowsky:

Unfortunately for my peace of mind and ego, people who say to me "You're the brightest person I know" are noticeably more common than people who say to me "You're the brightest person I know, and I know John Conway". Maybe someday I'll hit that level. Maybe not.

Until then... I do thank you, because when people tell me that sort of thing, it gives me the courage to keep going and keep trying to reach that higher level.

...and from his post...

When Marcello Herreshoff had known me for long enough, I asked him if he knew of anyone who struck him as substantially more natively intelligent than myself. Marcello thought for a moment and said "John Conway—I met him at a summer math camp." Darn, I thought, he thought of someone, and worse, it's some ultra-famous old guy I can't grab. I inquired how Marcello had arrived at the judgment. Marcello said, "He just struck me as having a tremendous amount of mental horsepower," and started to explain a math problem he'd had a chance to work on with Conway.

Not what I wanted to hear.

And this kind of attitude started early. See for example what he wrote in his early "biography":

I think my efforts could spell the difference between life and death for most of humanity, or even the difference between a Singularity and a lifeless, sterilized planet [...] I think that I can save the world, not just because I’m the one who happens to be making the effort, but because I’m the only one who can make the effort.

Also see this video:

So if I got hit by a meteor right now, what would happen is that Michael Vassar would take over responsibility for seeing the planet through to safety, and say ‘Yeah I’m personally just going to get this done, not going to rely on anyone else to do it for me, this is my problem, I have to handle it.’ And Marcello Herreshoff would be the one who would be tasked with recognizing another Eliezer Yudkowsky if one showed up and could take over the project, but at present I don’t know of any other person who could do that, or I’d be working with them.

Comment author: Nornagest 03 July 2014 10:32:09PM 5 points

regarding narcissism, the definition is: "an inflated sense of one's own importance and a deep need for admiration."

That's the dictionary definition. When throwing around accusations of mental pathology, though, it behooves one not to rely on pattern-matching to one-sentence definitions; it overestimates the prevalence of problems, suggests the wrong approaches to them, and tends to be considered rude.

Having a lot of ambition and an overly optimistic view of intelligence in general and one's own intelligence in particular doesn't make you a narcissist, or every fifteen-year-old nerd in the world would be a narcissist.

(That said, I'm not too impressed with Eliezer's reasons for moving to Facebook.)

Comment author: Viliam_Bur 03 July 2014 04:17:54PM 3 points

I feel that a similar accusation could be used against anyone who feels that more is possible and, instead of whining, tries to win.

I am not an expert on narcissism (though I could be an expert at it, heh), but it seems to me that a typical narcissistic person would feel they deserve admiration without doing anything awesome. They probably wouldn't be able to work hard for years. (But as I said, I am not an expert; there could be multiple types of narcissism.)

Comment author: buybuydandavis 03 July 2014 09:10:37PM 2 points

I think that I can save the world, not just because I’m the one who happens to be making the effort, but because I’m the only one who can make the effort.

Thinking that one person is going to save the world, and you're him, qualifies as "an inflated sense of one's own importance", IMO.

First mistake: believing that one person will save the world. Second mistake: believing that only one person can do it, and that he's that person.

“You think that you are potentially the greatest who has yet lived, the strongest servant of the Light, that no other is likely to take up your wand if you lay it down.”

Comment author: Squark 07 July 2014 06:47:08PM 0 points

Sorry. It wasn't meant as an attack, just something that came to my mind reading the comment by Chris Hallquist.

Well, I'm sorry, but when you dig up quotes of your opponent to demonstrate purported flaws in his character, it is a personal attack. I didn't expect to encounter this sort of thing on LessWrong. Given the number of upvotes your comment received, I can understand why Eliezer prefers Facebook.

Comment author: XiXiDu 08 July 2014 08:30:48AM 1 point

Yudkowsky tells other people to get laid. He is asking the community to downvote certain people. He is calling people permanent idiots.

He is a forum moderator. He asks people for money. He wants to create the core of the future machine dictator that is supposed to rule the universe.

Given the above, I believe that remarks about his personality are warranted, and not attacks, if they are backed up by evidence (which I provided in other comments above).

But note that in my initial comment, which got this discussion started, I merely ventured a guess as to why Yudkowsky might now prefer Facebook over LessWrong. Then a comment forced me to escalate this by providing further justification for uttering this guess. Your comments further forced me to explain myself. Which resulted in a whole thread about Yudkowsky's personality.

Comment author: Viliam_Bur 04 July 2014 07:21:31AM 0 points

more skeptical of the mindset that CFAR spreads

Just curious: what else do you consider the big problems of CFAR (other than being associated with MIRI)?

Comment author: Ander 02 July 2014 01:26:16AM 8 points

Indeed, it is perfectly fine if someone downvoted that post. I probably deserved a -3 there. However, rather than be given the opportunity to learn from that feedback in the way karma is supposed to work, I instead received one downvote to every post I ever made on the site.

Comment author: buybuydandavis 02 July 2014 03:29:32AM 1 point

I don't think people are entirely on the same page about how karma is "supposed to work". For some, it may be feedback to get people to post better. For others, it may be a way of stifling the posts of those they perceive as low-quality posters.

Karma bombing seems rather jerk-faced to me, but do you really need to care? You've got enough karma to post articles. You have good evidence that the karma drop was due to one lone jerk.

Therefore, what does he matter? Why is this a problem for you?

Comment author: shminux 02 July 2014 03:53:28AM 15 points

Why is this a problem for you?

I suppose if you use comment karma to evaluate how people like what you write, blank downvoting masks the useful signal.

Comment author: bbleeker 02 July 2014 11:23:20AM 9 points

Yes. A while ago I suddenly lost like 50 points (which is a lot for me). The signal that gives isn't 'don't write stuff like this', but 'we don't want you here, go away', and I almost did.

Comment author: buybuydandavis 03 July 2014 12:00:06AM 2 points

But he knows the source of the karma drop, therefore the useful signal has been unmasked.

Comment author: NancyLebovitz 02 July 2014 03:37:28PM 11 points

Therefore, what does he matter? Why is this a problem for you?

I don't see the point in telling people that they shouldn't have the emotional reactions that they keep having. It may be possible to fade those reactions out in the long haul, but if caring about karma is a typical reaction (and it seems to be at least common), then it's better to take it into account.

Comment author: ThisSpaceAvailable 03 July 2014 02:37:28AM 1 point

If you were mugged, but the cops caught the mugger and you got all your money back, would you not care about the mugging? You seem to be putting results over process.

Comment author: buybuydandavis 03 July 2014 08:36:41PM 0 points

I would want to see the guy strung up. But I wouldn't refrain from going out of my house because I had once been mugged. I consider that a dysfunctional response. If I knew someone who was "living" that way, I'd encourage them to change.

See previous comment (downvoted into oblivion) on people refraining from posting because people downvoted them. I walk the talk. It just isn't that hard.

http://lesswrong.com/lw/kfj/downvote_stalkers_driving_members_away_from_the/b255

Comment author: polymathwannabe 02 July 2014 03:32:44PM 3 points

I do think this comment of yours was a reasonable downvote candidate [...] Not because I think you are wrong about global warming, but because frequency of newspaper headlines seems like a bad way to infer statistical trends.

Then taking the trouble to explain why the comment is problematic is much more helpful to the discussion than simply clicking the thumbs-down.