Gleb_Tsipursky comments on “Be A Superdonor!”: Promoting Effective Altruism by Appealing to the Heart - Less Wrong Discussion

8 Post author: Gleb_Tsipursky 09 November 2015 06:20PM

Comment author: Gleb_Tsipursky 11 November 2015 06:16:19PM 0 points [-]

I agree that charisma is important, but within the EA movement in particular, you won't get far without intelligence. Intelligence is a necessary qualifier for leadership in EA, in other words.

I have a pretty strong confidence level on that one. I'm ready to make a $100 bet that if you ask EA people whether intelligence is a necessary qualifier for leadership in the EA movement, 9 out of 10 will say yes. Want to take me up on this?

Comment author: OrphanWilde 11 November 2015 06:40:00PM 4 points [-]

Within the current EA movement, or within the EA movement you propose to create by filling your ranks with people who don't share your culture or values?

Comment author: Gleb_Tsipursky 12 November 2015 01:05:32AM -1 points [-]

Within the EA movement currently, but I disagree with the second presumption. So are you taking that bet?

Comment author: OrphanWilde 12 November 2015 12:38:22PM 1 point [-]

I don't think you understand how bets work in staking out certainty: $100 against $100 implies your certainty that you won't destroy the EA movement is only ~50%.

Comment author: ChristianKl 12 November 2015 01:03:08PM 2 points [-]

The bet is not on the question of whether he destroys the EA movement but about whether people say intelligence is important.

Comment author: OrphanWilde 12 November 2015 01:36:02PM 1 point [-]

And he'll find little disagreement from me over what the current group of EA members will say, which says nothing at all about what the group of EA members -after- a successful advertising campaign aimed at increasing membership substantially will say, and at which point the EA movement is, by my view, effectively destroyed, even if it doesn't know it yet.

Comment author: Gleb_Tsipursky 12 November 2015 09:11:45PM *  -1 points [-]

My certainty is against your claim of emotionally oriented people becoming EA leaders, not the destruction of the EA movement. Please avoid shifting goalposts :-) So taking the bet, or taking back your claim?

Comment author: OrphanWilde 12 November 2015 09:40:02PM 2 points [-]

My unwillingness to accept a strawman in place of the positions I have actually stated does not constitute shifting goalposts. But that's irrelevant, compared to the larger mistake you're making, in trying to utilize this technique.

A lesson in Dark Arts: I am Nobody. I could delete this account right now, start over from scratch, and lose nothing but some karma points I don't care about. You, however, are a Somebody. Your account is linked to your identity. Anybody who cares to know who you are, can know who you are. Anybody who already knows who you are can find out what you say here.

As a Somebody, you have credibility. As a Nobody, I have none. So in a war of discrediting - you discredit me, I discredit you - I lose nothing. What do you lose?

Your identity gives you credibility. But it also gives you something to lose. My lack of identity means any credit I gain or lose here is totally meaningless, but it also means I have nothing to lose. That means our credibility disparity is precisely mirrored by a power disparity; one in your favor, the other in mine. And the credibility disparity lasts only until you let yourself be mired in credibility-destroying tactics.

You really shouldn't engage anybody, much less me, in a Dark Arts competition. Indeed, it's vaguely foolish of you to have admitted to the practice of Dark Arts in the first place.

Comment author: Gleb_Tsipursky 12 November 2015 09:58:56PM -1 points [-]

I agree that I have something significant to lose, as my account is tied to my public identity.

However, I do not share your belief that my having acknowledged engaging in Dark Arts is foolish. I am comfortable with being publicly identified as someone willing to use light forms of Dark Arts, stuff that Less Wrongers generally do not perceive as crossing into real Dark Arts, to promote rationality and Effective Altruism. In fact, I explored this question in a Less Wrong discussion post earlier. I want to be open and transparent, to help myself and Intentional Insights make good decisions and update beliefs.

Comment author: ChristianKl 12 November 2015 01:16:30PM 2 points [-]

I think you go wrong if you assume that "emotionally-oriented" people are automatically stupid.

Comment author: Gleb_Tsipursky 12 November 2015 09:10:21PM 0 points [-]

I agree that emotionally-oriented people are not automatically stupid; my point was about what EAs value. If an emotionally-oriented person happens to be intelligent as well, then that has certain benefits for the EA movement, of course.

Comment author: ChristianKl 13 November 2015 02:24:54PM 3 points [-]

A person who cares about playing status games might be intelligent but still harmful to the EA movement.

Comment author: Gleb_Tsipursky 13 November 2015 04:13:21PM -1 points [-]

I have a strong probabilistic estimate that there are currently a substantial number of people in the EA movement who care about status games. I'm willing to take a bet on that, too.