18239018038528017428 comments on Dragon Army: Theory & Charter (30min read) - Less Wrong

Post author: Duncan_Sabien 25 May 2017 09:07PM (40 points)


Comment author: [deleted] 26 May 2017 08:43:41PM *  26 points [-]

a

Comment author: Valentine 27 May 2017 09:12:21PM *  35 points [-]

PSA:

Do not feed trolls.

In ages past, vitriol like this would be downvoted into oblivion. This was out of recognition that norms of good discourse are more important than the content of arguments. Failure to abide by this spreads rot and makes good communal epistemic hygiene even more difficult.

I notice downvoting is disabled now. Which, sadly, means that people will be tempted to engage with this. Which reinforces a norm of having one's dissent noticed by acting like an unapologetic asshole. Which burns the future of this garden.

So as a close second, I advise just thoroughly ignoring 18239018038528017428 unless and until they step up to meet more noble conversational norms. If there are good points to be made here, they should be converted into the truth-seeking style Less Wrong aspires to so that we can all engage with them in a more hygienic way.

I appreciate Duncan's attempts to do that conversion and speak to the converted form of the argument.

But unless and until I see enough evidence to convince me otherwise, I assume 18239018038528017428's intentions are not truth-seeking. I assume they are inflammatory and will not change via civil discourse.

Ergo, request to all:

Do not feed trolls.

PS: I will follow my own advice here and have no intention of replying to 18239018038528017428 unless and until they transpose their discourse into the key of decency. I expect them to reply to me here, probably with more vitriol and some kind of personal attack and/or attempt to discredit me personally. My ignoring them should be taken as my following my own policy. Note that if 18239018038528017428 does reply with vitriol, it will probably be in some way fashioned as an attempt to make my very refusal to engage look like confirmation of their narrative. Please filter your reading of any replies to my message here accordingly.

Comment author: John_Maxwell_IV 27 May 2017 10:20:48PM *  18 points [-]

I'm the person who advocated most strongly for getting the downvote disabled, and I share some of 18239018038528017428's skepticism about the community in the Bay Area, but I strongly agree with Val's comment. There are already a ton of case studies on the internet in how fragile good conversational norms are. I'm going to email Vaniver and encourage him to delete or edit the vitriol out of comments from 18239018038528017428.

(Also ditto everything Val said about not replying to 18239018038528017428)

Comment author: Vaniver 28 May 2017 12:19:57AM 7 points [-]

I'm going to email Vaniver and encourage him to delete or edit the vitriol out of comments from 18239018038528017428.

Thanks for that; I had already noticed this thread but a policy of reporting things is often helpful. It seemed like Duncan was handling himself well, and that leaving this up was better than censoring it. It seems easier for people to judge the screed fairly with the author's original tone, and so just editing out the vitriol seems problematic.

With the new site, we expect to have mod tools that will be helpful here, ranging from downvoting making comments like this invisible by default, to IP bans and other measures that make creating a new throwaway account difficult.

Comment author: komponisto 28 May 2017 07:11:52AM 21 points [-]

For the record: at the risk of being a lonely dissenter, I strongly disagree with any notion that any of this discussion should have been censored in any way. (I was even grateful for the current impossibility of downvoting.)

Five years ago, or even two, my opinion would have been quite different. By this point, however, I have undergone a fairly massive update in the direction of thinking people are far, far too sensitive about matters of "tone" and the like. These norms of sensitivity are used to subtly restrict information flow. Ultimately Duncan and everyone else are better off knowing about the numerically-pseudonymous commenter's opinion in all of its gory detail. In fact, I would go so far as to say that the more they engage with this individual, the better; especially since the natural tendency will be to go in the opposite direction, circle the wagons, and dismiss the critic as a low-status outsider -- a behavior pattern that doesn't need more practice, IMHO.

(At any rate, the individual seems contemptuous enough of their targets that I would expect them to disengage on their own before the full value of discussion with them has been extracted.)

Comment author: John_Maxwell_IV 29 May 2017 03:30:34AM *  6 points [-]

I'm also curious to hear what made you update.

It's true that sensitivity norms can have subtle effects on a conversation, but nastiness norms can too. If you look at the study cited in the "hold off on proposing solutions" essay, you can see a case where politicizing a topic restricts the space of ideas that are explored. (I think this is actually a more natural takeaway from the study than "hold off on proposing solutions".) Nasty conversations also often see evaporative cooling effects where you are eventually just left with hardliners on each side. In general, I think nasty conversations tend to leave any line of reasoning that doesn't clearly support the position of one side or the other under-explored. (This is a pretty big flaw in my opinion, because I think divided opinions are usually an indicator of genuinely mixed evidence. If the evidence is mixed, the correct hypothesis is probably one that finds a way to reconcile almost all of it.) Furthermore I would predict that arguments in nasty conversations are less creative and generally just less well thought through.

Here's another argument. Imagine 18239018038528017428 showed you their draft comment minus the very last sentence. Then they showed you the last sentence "The world would be concretely better off if the author, and anyone like him, killed themselves." Would you tell them to add it in or not? If not, I suspect there's status quo bias, or something like it, in operation here.

Anyway, I think there are better ways to address the issue you describe than going full vitriol. For example, I once worked at a company that had a culture of employees ribbing each other, and sometimes we would rib each other about things other employees were doing wrong that would be awkward if they were brought up in a serious manner. I think that worked pretty well.

In fact, I would go so far as to say that the more they engage with this individual, the better; especially since the natural tendency will be to go in the opposite direction, circle the wagons, and dismiss the critic as a low-status outsider -- a behavior pattern that doesn't need more practice, IMHO.

I just want to point out that Duncan did in fact put a tremendous amount of time in to engaging with this critic (more time than he put in to engaging with any other commenter in this thread, by my estimate).

Comment author: komponisto 29 May 2017 05:53:06AM 7 points [-]

My other comment should hopefully clarify things, as least with regard to politicization in particular.

To spell out the implications a bit more: the problem with political discourse, the reason it kills minds, is not that it gets heated; rather, it freezes people's mental categories in ways that prevent them from making ontological updates or paradigm shifts of any kind. In effect, people switch from using physical cognition to think about arguments (modus ponens, etc.), to using social cognition instead (who wins, who loses, etc.). (Most people, of course, never use anything but social cognition in arguments; politics makes even "nerds" or "intellectuals" behave like typical humans.)

It is in fact possible for "heated" or even "nasty" discourse to be very information-rich; this makes sense if you realize that what counts as "nasty" depends on social norms. If you encounter discourse from a different social context (even, for example, simply because the speaker has misunderstood the social context and its norms!) you may read it as "nasty", despite the fact that the author was specifically intending to communicate content.

Now, of course I don't consider 18239018038528017428's comment to be optimally worded -- but then, I wouldn't, because I didn't write it. This is the important thing to understand: there is value to be had in getting detailed input on the mental states of people unlike oneself.

I agree that Duncan deserves positive reinforcement for engaging with this critic to the extent he did. But I think it was actually good for him epistemically to do so, not just as a demonstration of his willingness-to-bend-over-backwards, and thus, good social nature.

Comment author: Evan_Gaensbauer 02 June 2017 06:32:47AM 5 points [-]

I don't live in the Bay Area, I have no intention of moving there in the near future, and I resent the idea that anyone who wants to be part of what ought to be a worldwide rationality community must eventually move to the Bay Area to do so. I'm part of the rationality and effective altruism communities, and I too have taken community members in the Bay Area to task for acting as though they can solve community coordination problems with new projects when acknowledgement of the underwhelming success or failure of prior projects never seems to take place. I do that on Facebook, though, where my civilian identity and a track record of my behaviour are on display. There are closed groups or chats where things are less open, so it's not as damaging, and even if I make a post on my own Facebook feed for over one thousand people to see, then if I say something wrong, at least it's out in the open so I may face the full consequences of my mistakes.

I know lots of the people mentioned in '18239018038528017428's comment. I either didn't know those things about them, or I wouldn't characterize what I did know in such terms. Based on their claims, '18239018038528017428' seems to have more intimate knowledge than I do, and I'd guess they're also in or around the Bay Area rationality community. Yet they're on this forum anonymously, framing themselves as some underdog taking down high-status community members, when the criteria for such status haven't been established beyond "works at MIRI/CFAR", and what they're doing is just insulting and accusing regular people like the rest of us on the internet. They're not facing the consequences of their actions.

The information provided isn't primarily intended to resolve disputes, which I would think ought to be the best application of truth-seeking behaviour in this regard, and which is expected as a primary, if not the sole, purpose of discourse here. The primary purposes of '18239018038528017428's comment were to express frustration, slander certain individuals, and undermine and discredit Duncan's project without evidence to back up their claims. These are at cross-purposes with truth-seeking behaviour.

Nothing I do gets policed for tone on the basis of sensitivity that '18239018038528017428' isn't also doing. While we're talking about norms of sensitivity, let's talk about norms for resolving interpersonal disputes. All the differences between how I and lots of others in the community do it, even if the tone we use isn't always splendid or sensitive, and how '18239018038528017428' does it, are what separate people who have a non-zero respect for norms from those who don't. This is coming from me, a guy who lots of people think already flouts social norms too much.

I am unsympathetic to '18239018038528017428' and indifferent to whether they're censored. Another reason not to resolve interpersonal disputes like this in public on a website like LessWrong is that most people in online communities don't like seeing this sort of drama dominate discourse, and in particular there are lots of us who don't care for ever more drama from one zip code being all anyone pays attention to. That defeats the purpose of this site, and saps the will of people not in the Bay Area to continue to engage in the rationality community. That's not what anyone needs. Since we've established '18239018038528017428' seems close enough to probably be part of the Berkeley rationality community already, there are plenty of channels like private group chats, mailing lists, or other apps where everyone involved can be connected, and where user '18239018038528017428' wouldn't need to out themselves in front of everyone to do it. They could've had a friend do it.

There are plenty of ways they could've accomplished everything they would've wanted without being censored, and without doing it on LessWrong. When they have access to plenty of online spaces which serve the same purpose, there's no reason LW must allow that speech to the chagrin of all other users. While I get that you think a Chesterton's fence for discourse is being torn down here, I don't believe that's what's going on, and I think the preferences of everyone else on LessWrong who isn't personally involved deserve a say in what they are and aren't okay with being censored on this site.

Comment author: komponisto 06 June 2017 05:45:55AM 4 points [-]

You don't seem to be addressing what I said very much if at all, but rather to mostly be giving your reaction to 18239018038528017428's comments. This is demonstrated by the fact that you take for granted various assumptions that it was the purpose of my comment to call into question.

In particular, the speech is not being allowed "to the chagrin of all other users". I am notably non-chagrinned by the speech being allowed, and I advocate that people be less chagrinned by such speech being allowed.

Needless to say, to be allowed is not to be approved.

Comment author: entirelyuseless 28 May 2017 01:58:08PM 2 points [-]

By this point, however, I have undergone a fairly massive update in the direction of thinking people are far, far too sensitive about matters of "tone" and the like.

What convinced you of this?

Comment author: komponisto 29 May 2017 04:06:09AM 17 points [-]

What convinced you of this?

A constellation of related realizations.

  • A sense that some of the most interesting and important content in my own field of specialization (e.g. the writings of Heinrich Schenker) violates, or is viewed as violating, the "norms of discourse" of what I took to be my "ingroup" or "social context"; despite being far more interesting, engaging, and relevant to my concerns than the vast majority of discourse that obeys those norms.

  • A sense that I myself, despite being capable of producing interesting content, have been inhibited from doing so by the fear of violating social norms; and that this (which is basically a form of cowardice) is likely to also be what is behind the stifled nature of norm-conforming discourse referred to above.

  • A sense that the ability to look beyond discourse norms (and the signaling value of violation or conformity thereto) and read texts for their information content is extremely intellectually valuable, and in particular, makes texts originating in outgroup or fargroup cultures much more accessible -- the epistemic usefulness of which should go without saying.

  • A sense that a generalized version of this principle holds: the ability to conform to discourse norms, despite their information-obstructing nature, yet still succeed in communicating, functions as a signal of high status or tight embeddedness within a community, achieved via countersignaling. In particular, it cannot be successfully imitated by those not already of similar status or embeddedness: the attempt to imitate Level 4 results in Level 1.

  • A sense that discourse norms, and norms of "civility" generally, are the result of optimization for a purpose entirely distinct from the efficient transmission of information. Namely, they are there to reduce the risk of physical violence; in fact they specifically trade off communicative efficiency for this. Hence: politics, diplomacy, law -- the domains in which discourse is most tightly "regulated" and ritualized being specifically those most concerned with the prevention of physical violence, and simultaneously those most notorious for hypocrisy and obscurantism. This, by contrast, does not seem to be what an internet forum concerned with truth-seeking (or even an associated real-life community of minimally-violent individuals living in a society characterized by historically and globally high levels of trust) is supposed to be optimizing for!

Comment author: Valentine 31 May 2017 01:04:15AM 10 points [-]

Cool. Let's play.

I notice you make a number of claims, but that of the ones I disagree with, none of them have "crux nature" for me. Which is to say, even if we were to hash out our disagreement such that I come to agree with you on the points, I wouldn't change my stance.

(I might find it worthwhile to do that hashing out anyway if the points turn out to have crux nature for you. But in the spirit of good faith, I'll focus on offering you a pathway by which you could convince me.)

But if I dig a bit, I think I see a hint of a possible double crux. You say:

A sense that discourse norms, and norms of "civility" generally, are the result of optimization for a purpose entirely distinct from the efficient transmission of information.

I agree with a steelman version of this. (I don't think it is literally entirely distinct — but I also doubt you do, and I don't want to pressure you to defend wording that I read as being intended for emphasis rather than precise description.) However, I imagine we disagree about how to value that. I think you mean to imply "…and that's bad." Whereas I would add instead "…and that's good."

In a little more detail, I think that civility helps to prevent many more distortions in communication than it causes, in most situations. This is less needed the more technical a field is (whatever that means): in math departments you can just optimize for saying the thing, and if seeming insults come out in the process then that's mostly okay. But when working out social dynamics (like, say, whether a person who's proposing to lead a new kind of rationalist house is trustworthy and doing a good thing), I think distorted thinking is nearly guaranteed without civility.

At which point I cease caring about "efficient transmission of information", basically because I think (a) the information being sent is secretly laced with social subtext that'll affect future transmissions as well as its own perceived truthiness, and (b) the "efficient" transmission is emotionally harder to receive.

So to be succinct, I claim that:

  • (1) Civility prevents more distortion in communication than it creates for a wide range of discussions, including this one about Dragon Army.
  • (2) I am persuadable as per (1). It's a crux for me. Which is to say, if I come to believe (1) is false, then that will significantly move me toward thinking that we shouldn't preserve civility on Less Wrong.
  • (3) If you disagree with me on (1) and (1) is also a crux for you, then we have a double crux, and that should be where we zoom in. And if not, then you should offer a point where you think I disagree with you and where you are persuadable, to see whether that's a point where I am persuadable.

Your turn!

Comment author: John_Maxwell_IV 29 May 2017 06:52:58AM *  7 points [-]

I'm gonna address these thoughts as they apply to this situation. Because you've publicly expressed assent with extreme bluntness, I might conceal my irritation a little less than I normally do (but I won't tell you you should kill yourself).

A sense that some of the most interesting and important content in my own field of specialization (e.g. the writings of Heinrich Schenker) violates, or is viewed as violating, the "norms of discourse" of what I took to be my "ingroup" or "social context"; despite being far more interesting, engaging, and relevant to my concerns than the vast majority of discourse that obeys those norms.

Did he tell people they should kill themselves?

This strikes me as an example of the worst argument in the world. Yes, telling people to kill themselves is an alternative discourse norm, alternative discourse norms can be valuable, but therefore telling people to kill themselves is valuable? Come on. You can easily draw a Venn diagram that refutes this argument. Alternative discourse norms can be achieved while still censoring nastiness.

A sense that I myself, despite being capable of producing interesting content, have been inhibited from doing so by the fear of violating social norms; and that this (which is basically a form of cowardice) is likely to also be what is behind the stifled nature of norm-conforming discourse referred to above.

Telling forum users they should kill themselves is not gonna increase the willingness of people to post to an online forum. In addition to the intimidation factor, it makes Less Wrong look like more of a standard issue internet shithole.

A sense that the ability to look beyond discourse norms (and the signaling value of violation or conformity thereto) and read texts for their information content is extremely intellectually valuable, and in particular, makes texts originating in outgroup or fargroup cultures much more accessible -- the epistemic usefulness of which should go without saying.

This can be a valuable skill and it can still be valuable to censor content-free vitriol.

A sense that a generalized version of this principle holds: the ability to conform to discourse norms, despite their information-obstructing nature, yet still succeed in communicating, functions as a signal of high status or tight embeddedness within a community, achieved via countersignaling. In particular, it cannot be successfully imitated by those not already of similar status or embeddedness: the attempt to imitate Level 4 results in Level 1.

Yes, it takes a lot of effort to avoid telling people that they should kill themselves... Sorry, but I don't really mind using the ability to keep that sort of thought to yourself as a filter.

A sense that discourse norms, and norms of "civility" generally, are the result of optimization for a purpose entirely distinct from the efficient transmission of information. Namely, they are there to reduce the risk of physical violence; in fact they specifically trade off communicative efficiency for this. Hence: politics, diplomacy, law -- the domains in which discourse is most tightly "regulated" and ritualized being specifically those most concerned with the prevention of physical violence, and simultaneously those most notorious for hypocrisy and obscurantism. This, by contrast, does not seem to be what an internet forum concerned with truth-seeking (or even an associated real-life community of minimally-violent individuals living in a society characterized by historically and globally high levels of trust) is supposed to be optimizing for!

If we remove Chesterton's Fences related to violence prevention, I predict the results will not be good for truthseeking. Truthseeking tends to arise in violence-free environments.

Maybe it'd be useful for me to clarify my position: I would be in favor of censoring out the nasty parts while maintaining the comment's information content and probably banning the user who made the comment. This is mainly because I think comments like this create bad second-order effects and people should be punished for making them, not because I want to preserve Duncan's feelings. I care more about trolls being humiliated than censoring their ideas. If a troll delights in taking people down a notch for its own sake, we look like simps if we don't defect in return. Ask any schoolteacher: letting bullies run wild sets a bad precedent. Let me put it this way: bullies in the classroom are bad for truthseeking.

See also http://lesswrong.com/lw/5f/bayesians_vs_barbarians/ Your comment makes you come across as someone who has led a very sheltered upper-class existence. Like, I thought I was sheltered but it clearly gets a lot more extreme. This stuff is not a one-sided tradeoff like you seem to think!

For obvious reasons, it's much easier to convert a nice website to a nasty one than the other way around. And if you want a rationalist 4chan, we already have that. The potential gains from turning the lesswrong.com domain into another rationalist 4chan seem small, but the potential losses are large.

Comment author: komponisto 29 May 2017 08:09:48AM 10 points [-]

Because you've publicly expressed assent with extreme bluntness

Who said anything about "extreme"?

You are unreasonably fixated on the details of this particular situation (my comment clearly was intended to invoke a much broader context), and on particular verbal features of the anonymous critic's comment. Ironically, however, you have not picked up on the extent to which my disapproval of censorship of that comment was contingent upon its particular nature. It consisted, in the main, of angrily-expressed substantive criticism of the "Berkeley rationalist community". (The parts about people killing themselves were part of the expression of anger, and need not be read literally.) The substance of that criticism may be false, but it is useful to know that someone in the author's position (they seemed to have had contact with members of the community) believes it, or is at least sufficiently angry that they would speak as if they believed it.

I will give you a concession: I possibly went too far in saying I was grateful that downvoting was disabled; maybe that comment's proper place was in "comment score below threshold" minimization-land. But that's about as far as I think the censorship needs to go.

Not, by the way, that I think it would be catastrophic if the comment were edited -- in retrospect, I probably overstated the strength of my preference above -- but my preference is, indeed, that it be left for readers to judge the author.

Now, speaking of tone: the tone of the parent comment is inappropriately hostile to me, especially in light of my other comment in which I addressed you in a distinctly non-hostile tone. You said you were curious about what caused me to update -- this suggested you were interested in a good-faith intellectual discussion about discourse norms in general, such as would have been an appropriate reply to my comment. Instead, it seems, you were simply preparing an ambush, ready to attack me for (I assume) showing too much sympathy for the enemy, with whatever "ammunition" my comment gave you.

I don't wish to continue this argument, both because I have other priorities, and also because I don't wish to be perceived as allying myself in a commenting-faction with the anonymous troublemaker. This is by no means a hill that I am interested in dying on.

However, there is one further remark I must make:

Your comment makes you come across as someone who has led a very sheltered upper-class existence

You are incredibly wrong here, and frankly you ought to know better. (You have data to the contrary.)

Comment author: John_Maxwell_IV 30 May 2017 04:27:21AM 1 point [-]

Well, you've left me pretty confused about the level of importance you place on good-faith discussion norms :P

Comment author: entirelyuseless 29 May 2017 05:17:08PM 1 point [-]

All of these are reasonable points, given the fixed goal of obtaining and sharing as much truth as possible.

But people don't choose goals. They only choose various means to bring about the goals that they already have. This applies both to individuals and to communities. And since they do not choose goals at all, they cannot choose goals by the particular method of saying, "from now on our goal is going to be X," regardless what X is, unless it is already their goal. Thus a community that says, "our goal is truth," does not automatically have the goal of truth, unless it is already their goal.

Most people certainly care much more about not being attacked physically than discovering truth. And most people also care more about not being rudely insulted than about discovering truth. That applies to people who identify as rationalists nearly as much as to anyone else. So you cannot take at face value the claim that LW is "an internet forum concerned with truth-seeking," nor is it helpful to talk about what LW is "supposed to be optimizing for." It is doing what it is actually doing, not necessarily what people say it is doing.

That people should be sensitive about tone is taken in relation to goals like not being rudely insulted, not in relation to truth. And even the argument of John Maxwell that "Truthseeking tends to arise in violence-free environments," is motivated reasoning; what matters for them is the absence of violence (including violent words), and the benefits to truth, if there are any, are secondary.

Comment author: komponisto 29 May 2017 10:02:31PM 2 points [-]

All of these are reasonable points, given the fixed goal of obtaining and sharing as much truth as possible.

Is the implication that they're not reasonable under the assumption that truth, too, trades off against other values?

What the points I presented (perhaps along with other things) convinced me of was not that truth or information takes precedence over all other values, but rather simply that it had been sacrificed too much in service of other values. The pendulum has swung too far in a certain direction.

Above, I made it sound like the overshooting of the target was severe; but I now think this was exaggerated. That quantitative aspect of my comment should probably be regarded as heated rhetoric in service of my point. It's fairly true in my own case, however, which (you'll hopefully understand) is particularly salient to me. Speaking up about my preoccupations is (I've concluded) something I haven't done nearly enough of. Hence this very discussion.

But people don't choose goals.

This is obviously false, as a general statement. People choose goals all the time. They don't, perhaps, choose their ultimate goals, but I'm not saying that truth-seeking is necessarily anybody's ultimate goal. It's just a value that has been underserved by a social context that was ostensibly designed specifically to serve it.

Most people certainly care much more about not being attacked physically than discovering truth.

But not infinitely much. That's why communicational norms differ among contexts; not all contexts are as tightly regulated as politics, diplomacy, and law. What I'm suggesting is that Less Wrong, an internet forum for discovering truth, can afford to occupy a place toward the looser end of the spectrum of communicational norms.

This, indeed, is possible because a lot of other optimization power has already gone into the prevention of violence; the background society does a lot of this work, and the fact that people are confronting each other remotely over the internet does a fair portion of the rest. And contrary to Maxwell's implication, nobody is talking about removing any Chesterton Fences. Obviously, for example, actual threats of violence are intolerable. (That did not occur here -- though again, I'm much less interested in defending the specific comment originally at issue than in discussing the general principles which, to my mind, this conversation implicates.)

The thing is: not all norms are Chesterton Fences! Most norms are flexible, with fuzzy boundaries that can be shifted in one direction or the other. This includes norms whose purpose is to prevent violence. (Not all norms of diplomacy are entirely unambiguous, let alone ordinary rules of "civil discourse".) The characteristic of fences is that they're bright lines, clear demarcations, without any ambiguity as to which side you're on. And just as surely as they should only be removed with great caution, so too should careful consideration guide their erection in the first place. When possible, the work of norms should be done by ordinary norms, which allow themselves to be adjusted in service of goals.

There are other points to consider, as well, that I haven't even gotten into. For example, it looks conceivable that, in the future, technology, and the way it interacts with society, will make privacy and secrecy less possible; and that social norms predicated upon their possibility will become less effective at their purposes (which may include everything up to the prevention of outright violence). In such a world, it may be important to develop the ability to build trust by disclosing more information, rather than less.

Comment author: entirelyuseless 29 May 2017 11:02:53PM 1 point [-]

I agree with all of this. (Except "this is obviously false," but this is not a real disagreement with what you are saying. When I said people do not choose goals, that was in fact about ultimate goals.)

Comment author: FeepingCreature 31 May 2017 11:41:37AM *  1 point [-]

Five years ago, or even two, my opinion would have been quite different. By this point, however, I have undergone a fairly massive update in the direction of thinking people are far, far too sensitive about matters of "tone" and the like.

Yeah, but exposure therapy doesn't work like that. If people are too sensitive, you can't just rub their faces in the thing they're sensitive about and expect them to change. In fact, what you'd want in order to desensitize people is the exact opposite: really tight conversation norms that still let people push slightly outside their comfort zone.

Comment author: Maxlove 14 June 2017 07:23:24AM 0 points [-]

There are already a ton of case studies on the internet in how fragile good conversational norms are.

I need access to these studies!

Comment author: cousin_it 31 May 2017 11:36:48AM *  0 points [-]

Out of curiosity, why do you prefer having downvotes disabled? (Here's a comment explaining why I want them back.)

Comment author: Elo 27 May 2017 09:50:36PM 7 points [-]

But unless and until I see evidence otherwise, I assume 18239018038528017428's intentions are not truth-seeking.

Evidence: time and energy put into the comment. Evidence: not staying silent when they could have.

I am not saying the offending comments are valid; instead I am curious as to why you discounted what I identify as evidence?

Comment author: Valentine 27 May 2017 10:07:04PM 8 points [-]

Ah, I was using a more colloquial definition of evidence, not a technical one. I misspoke.

What goes through my mind here is, "Trolls spend a lot of time and energy making comments like this one too, and don't stay silent when they could, so I'm not at all convinced that those points are more consistent with a world where they're truth-seeking than they are with a world in which they're just trolling."

I still think that's basically true. So to me those points seem irrelevant.

I think what I mean is something more like, "Unless and until I see enough evidence to convince me otherwise…." I'll go back and edit for that correction.

Comment author: komponisto 28 May 2017 07:36:41AM 4 points [-]

norms of good discourse are more important than the content of arguments

In what represents a considerable change of belief on my part, this now strikes me as very probably false.

Comment author: Valentine 28 May 2017 07:37:12PM 1 point [-]

I'm open. Clarify?

Comment author: komponisto 29 May 2017 04:10:27AM 3 points [-]

See this comment; most particularly, the final bullet point.

Comment author: Valentine 31 May 2017 01:04:55AM 0 points [-]
Comment author: Elo 28 May 2017 07:53:57PM 1 point [-]

I offer this model insofar as it helps with communicating about the puzzle -
http://bearlamp.com.au/a-model-of-arguments/
and this one
http://bearlamp.com.au/filter-on-the-way-in-filter-on-the-way-out/

Comment author: Elo 28 May 2017 07:54:38PM 0 points [-]
Comment author: Duncan_Sabien 26 May 2017 08:52:50PM *  18 points [-]

Strong support for this person's willingness to contribute the opposite opinion.

Strong support for this person's willingness to take the time to write things up in detail.

Strong appreciation for the trust implicit in this being posted here (i.e. it's a compliment along the lines of "I expect not to be punished for speaking the truth as I see it.")

Some regret/sadness that they're this triggered and vitriolic, and for the tendency toward choosing the worst or straw-est interpretation at every point rather than taking the time to question their own responses and include nuance, but on the other hand, still appreciation for how this contributes to the overall health of the discussion by opening up new threads for debate and ensuring that there isn't an echo chamber (i.e. maybe it takes that level of aggression to accomplish the thing, and a gentler critique wouldn't be taken seriously enough?).

Significant disagreement with the choice to hijack the topic at hand to vent about things that are either mostly or completely unrelated, and make claims that are unsubstantiated or wildly inaccurate, and engage in some specious logic toward the end (e.g. ad hominem fallacy).

Hope to have some time later today to respond to the better points this raises.

Thanks for your contribution.

Comment author: [deleted] 26 May 2017 09:02:47PM *  5 points [-]

a

Comment author: Elo 26 May 2017 11:43:29PM 8 points [-]

Many words. Probably took a while to write. Some unnecessary things, like telling the writer to kill themselves, and criticism levelled at inherent attributes, such as the author's other writing. Other writing is pretty irrelevant to the qualities of this piece. You may have some points in this dung heap, but you make it hard to find them. Is it even worth engaging you in conversation?

Comment author: [deleted] 27 May 2017 12:48:06AM *  3 points [-]

a

Comment author: cata 28 May 2017 07:09:21AM 7 points [-]

Perhaps your excessive cognition is ironically blinding you to the grandiose mediocrity of your overwrought replies, such as this one here, which sounds like something I would have written in third grade if I wasn't already too smart to have written it then, which, as a truly capable mind might have already conceived, I was.

Comment author: math_viking 04 June 2017 04:01:32AM 6 points [-]

Your original comment, though harsh, at least contained some useful insights. Don't ruin that by posting comments that are nothing more than 6 lines of insults that no one wants to read.

Comment author: Decius 27 May 2017 12:57:04AM 2 points [-]

Part right.

Most of the arguments you set forth are more fallacious and less relevant than not liking all the author's fiction.

But that's because most of the arguments you set forth were of the type "Bay Area rationalists have had a lot of problems and therefore this specific plan will have similar problems."

Comment author: [deleted] 27 May 2017 01:14:41AM *  3 points [-]

a

Comment author: drethelin 29 May 2017 08:52:53PM 9 points [-]

This is why we need downvotes.

Comment author: Duncan_Sabien 27 May 2017 01:31:47AM *  9 points [-]

[Note: I've typed this comment without refreshing the page, and thus have not seen any of the other responses that may have cropped up in the past few hours, nor taken those responses into account in any way yet. I'm seeing only the original reply, here.]

Part 1 of ?

Repeating my thanks before heading into what will be a mix of concession and disagreement—I have qualms about the way you engaged with this post, but am grateful for the fact that you did engage, at all, rather than just staying quiet, and I want to support the core of that even as I complain about certain aspects of your chosen method.

I think your first paragraph had one clear point: "I, as a smart, perceptive person who sees things others often fail to see, found a lot of this viscerally upsetting, which is probably a sign that there are actual problems." I liked that you added this point, and I think it would've been stronger if you hadn't been so deliberately assholish with the rest of it. I'm going to take the core point seriously as I read further, and see if I can get a clear sense of what it is you see that I don't.

The comment about Ender's Game (paragraph 2) is a misunderstanding on your part, either deliberate or easy to clear up—there's no wargaming in the plan, there's no battle room, there are no other groups of people playacting as other armies. The aesthetic of Dragon Army was, in short: everyone is expected to keep their eyes open and act independently to do what seems right and sane in the moment. Groups should practice coordinating together to build trust and be capable of action-requiring-more-than-one-individual, but the assumption is that an army run by forty minds will trump an army run by one.

In paragraph 3, you make a valid point about the efficacy and usefulness of CFAR, which is indeed worth questioning, and the side you're holding down is not obviously wrong. It's a bit overwrought, given that the phrase "insistence on the validity of his experience as a CFAR instructor" is a clear strawman; I was almost as emphatic about the fact that I've written nerdy fanfic, so I think you were just looking for an opportunity to climb up on a soapbox? That being said, your point about interpersonal romance being a relevant and important factor matches my own intuition, and I wish you had appreciated the fact that I wanted to continue thinking carefully about correct solutions rather than just spam the first ideas that popped into my head.

In paragraph four, you make an entirely unfounded leap that is beneath the quality of what's expected from a poster on this forum. All of your "this suggests" are false handwaving, and I find the rest of your assertions generally laughable, given that there's only one person in this thread so far who's demonstrated deep antisocial behavior, and that you're hurling these insults from a position of anonymity. However, I'm going to continue to take things one paragraph at a time rather than assuming that I've seen your entire position as soon as I've got a mockable straw model, so we'll start fresh with your next point.

Hmmm. In the first sentence of paragraph 5, you and I seem to converge somewhat—we both agree that the Bay Area rationalist community is not living up to its promise, and has too few people doing good and impactful work. I'm glad to share this bit of world-model with you. I note that my idea for what to do about it—try a different sort of house/community—is just one possible strategy among many, and I'm curious if you have other concrete suggestions that you'd be willing to offer. I'm especially curious what you're actually doing, as you seem to have a sort of ... scathing dismissal? ... of everyone else, and I'd expect from your tone that you must be engaged in at least one concretely high-promise project (else it all smacks of rank hypocrisy). Would you be willing to detail a) what you're up to, or b) a few concrete proposals that you suspect are higher promise? At this point, it'd be hard to simply abandon the Dragon Army idea, but if a good enough alternative came along, I would take it. The point is not to be seen to be right, it's to actually make an impact.

I notice that the rest of that paragraph is basically off-topic. Without contributing to the off-topicness, I want to say that I do, indeed, find at least a couple of worthwhile points of agreement within it, but I think most of it is wrong, in addition to being somewhat morally reprehensible re: vicious attacks, and that you're overconfident in your assertions. If you'd like to shoot me a private message, I'd be happy to say where I agree and where I disagree.

Oh, interesting—paragraph six also begins with a claim I have a lot of sympathy for/agreement with. I don't hold it as strongly as you do, but I do think there's a lot of clear dysfunction and self-deception in the community, and I'd like to take steps to correct it. I don't know how to evaluate your claim that the best people are on the periphery (as I'm a weird mix of professionally central and socially somewhat distant), but again—if you'd like to make concrete recommendations about who I should talk to, or direct some of the people you hold in high esteem to comment on this thread, I suspect you're right about there being a lot of untapped value. I do note that Dragon Army is not actually pulling from the central or highest status people, but thus far looks to be made up of a lot of solid, normal, representative rationalists, so I think your claim about trying to delude people is straightforwardly false, as is your assumption that I don't see or don't want to see any warts and flaws. (I believe there are lots of people who will back me up on this, including some who will claim that I've been too hostile or critical. That's partially why I sympathize with the strength of your negativity.)

Comment author: Duncan_Sabien 27 May 2017 01:32:11AM 11 points [-]

Part 2 of 2

Ah, paragraph seven contains the unword "cult," which I think you're using to say something, but I'd rather you just actually said the thing, instead of applying the empty, stretched, multi-interpretation label. Like, I think if you laid out specific, concrete objections, I and others could benefit from them, but just saying cult is lazy name-calling.

I do somewhat agree with your objections to the list of specific skills attained after a year. I had hoped that the large word DRAFT at the top, plus the repeated statements that the whole plan was to iterate, and that I didn't expect to be able to figure out the right stuff on the first try, would've clued you in to the fact that I, too, am aware that the list is inadequate. Do you have specific suggestions for replacements? Keep in mind, the hard problem is to balance things-that-will-be-generally-useful-for-a-medium-sized-group-of-people against the fact that everyone involved has their own specific career and expertise already. Part of the impetus here is social, part of it is becoming well-rounded, part of it is practicing the skill of gaining/improving skills, and all of that is trying to avoid skating into trivial irrelevancy. Got any ideas?

As a meta note, I think that people who cower behind anonymity don't deserve to make concrete claims about their skill sets without backing them up, so until further notice and on a policy level, I'm treating your claim that you meet 11 out of 14 criteria as a flat-out lie (despite its plausibility overall). You're currently nothing and nobody and have no skills; that will change as soon as you a) reveal yourself or b) demonstrate credibility under this pseudonym.

Your next attempt to strawman things takes a sub-point out of context and deliberately ignores the actual requirement being made, which was that people hold their beliefs and models with skepticism/realize that their internal experience does not represent absolute truth, and that they treat one another with a behaviorist's lens, using revealed preferences and past behavior as predictors, rather than relying on mental summations that may be false or straw. I'm curious whether, setting aside your mockery of a subpoint, you agree with that point.

Interestingly enough, I have reasonable credence in your two inferences. In my experience, members of this community do attempt to install norms to compensate for social failings (and do have a somewhat higher-than-average level of social ineptitude). And also, I think many people in this community are low-empathy and embody the bad side of individualism. However, unlike you, I see that a lot of people are trying damn hard to correct this, and I'm curious whether you think they should be written off for not being good enough already, or whether you have specific suggestions that differ from the ones already being tried. I note that a big part of what Dragon Army intends to do is just try a whole bunch of stuff (including stuff already known to work; there's no premium on novelty), and that I think data will be better than armchair ranting.

I suspect you haven't done much in the way of looking in the mirror when you type the words "repressed irritation, interpersonal drama, and general unpleasantness." Certainly you don't meet any of my standards for "how a decent person behaves." I'm going to try to avoid the fundamental attribution error here, though, and assume that we've hit some combination of a) a bad day, b) the problems of online communication, and c) you being unusually triggered or having run out of some important resources.

I'm not going to engage with the ad hominem attack at the end, which, in addition to being wrong as a tactic, also fails in specific. I think that if you compare yourself, who is suggesting suicide as a solution, with OSC, who is definitely wrong about a lot of things but has never gone so far as to claim a fellow human would be better off killing themselves, you'll note that you might be on the wrong side. I'd check my cap for a skull, at least in the context of today's mood.

For anyone else—I welcome calm, reasoned elaboration on any of the on-topic points this person made. When I went through blow-by-blow, there were fewer than I'd hoped, but there are true and valuable and important criticisms here, and I'm glad they've been added to the mix, and I wouldn't mind further discussion of them.

Comment author: [deleted] 27 May 2017 02:03:00AM *  6 points [-]

a

Comment author: Duncan_Sabien 27 May 2017 02:23:25AM 10 points [-]

I don't think you actually succeeded in knocking anyone down a peg, though. I'd bet ~$50 that a neutral, outside observer (say, from a different English speaking country) would say that a) you come off far worse than anyone else in the thread and b) they didn't find your post convincing.

I think our disagreement over the distinction between playacting and not boils down to something like, I believe that the very small nuts-and-bolts of social interaction (jargon, in-jokes, simple trigger-action responses like sneeze → "bless you") are more important than most people give them credit for. In other words, I think the silly theater ends up actually mattering? Or, to be more specific—I think most of it doesn't matter, but some small bits of it end up being really important, and so it's an arena I want to do explicit experimentation with. I want to see whether the small salute actually ends up being relevant to bonding and sense-of-purpose, and no, I don't have a double blind or anything like that, but I will be asking a bunch of fairly introspective people for their thoughts afterward.

I suspect, from your reaction, that you'd basically assert that this premise is false, and that the ... skin? ... of social interaction is meaningless, at least compared to the actual connections and information conveyed. This seems like a sensible, plausible position to take, but I think your mockery of the alternative hypothesis is unfounded.

I agree that if romance/sex/etc pop up, that would preclude the problem from being easily solved, but where did you get the impression that I was afraid of attempting to solve hard problems? There's definitely a filter to screen out immature or uncontrolled people; while you yourself might make it through, the persona you're currently expressing would've been rejected by the second paragraph of your original response. We've already turned away people for a variety of reasons, and at least one because of exactly this axis.

I appreciate the recommendation that I run things by Satvik. He's a perceptive thinker and I haven't run this by him yet. I wish that you'd responded in specific to more of my requests to draw out your suggestions—you're continuing to clarify your models of the problems, but not offering much in the way of replacements for the things I'm planning to try.

You're still not saying what you actually mean by the word "cult." There's a decent chance I'd agree with you—I've described the Bay Area rationalist community as a cult myself, even recently, when talking to friends and family members. But I was careful to disambiguate exactly what I meant by that, and I can't help but note that your continued refusal to spell it out makes me suspect that you don't actually have a coherent thing to say, and are just trying to score easy points.

I agree again with 1 (low empathy, etc.) though I think the strength of the effect is smaller than you seem to think it is. I think that you're still not believing me when I say I agree with 2? Note that I'm calling you out for unacceptable rudeness in this thread, for instance. I also suspect you have a huge typical mind thing going on, and vastly underestimate how easy it is for people to rub each other wrong while acting in complete good faith in a normal society—the bed example was maybe poorly chosen, but I disagree with you that it's easy to "default to behavior that is very unlikely to bother others." I've been in a wide range of social milieu, and it's much less about the actual behavior and much more about people's *cough* willingness to pick nits and start fights.

I think that you've lost all moral authority by doubling down on your "people should die for this" claim, and because of that, I think this'll be my last attempt to engage with you as an equal (you're not my equal; at least this facet of your personality is my clear inferior). I will, however, continue to read if you make those concrete suggestions I'm hoping you have somewhere.

In answer to your last two questions: yes, it looks like your irritation is repressed. Not here, because my main hypothesis is that here is where you finally felt safe to vent a ton of irritation that you've been repressing in other arenas, for long amounts of time. Just look back at your first post—maybe a quarter of it was in response to me, and the rest is long-simmering, long-festering frustration about a bunch of other things (some of them valid and some of them not). Textbook repress-then-explode. And 2, your claim that posting anonymously equates to not causing interpersonal drama is again so laughable that unless it's a deliberate joke, you're revealing this persona to be less socially aware than literally the most awkward and inept rationalist I've ever met.

You're not unpleasant so much as just ... not showing yourself to be worth the time. I really hoped I could get more out of you, because I actually know, on a deep level, that I don't have all the answers and the opposition is the first best place to look. But in terms of useful-criticism-per-word, you've been outdone by every other person who's registered reservation or disagreement here.

Comment author: Pimgd 29 May 2017 10:45:23AM *  5 points [-]

I don't know if I'm neutral (no, because I have an account here for a while now), but I wouldn't have the same confidence to swing that bet out of there like you do. The post in and of itself is not convincing enough for me to say that your idea won't work, but it certainly makes me go "hmm, well, he might have a point there".

Specifically:

  • "Normal" people don't need to explicitly write out all the rules for their housing with regards to social rules.
  • But here there's a large list of rules and activities and all that, with the goal of getting group housing to work properly.
  • Also, here's some examples of the group of people that you want to source your participants from having low social skills.
  • By the way, if you set up a ton of rules then it usually won't work.
  • Thus, there's a pretty big chance that the rules will not work out and that the social skills of the participants will be too low to have the group housing work.

I am not convinced that this is the truth.

However, if I read in a year from now that this is what happened, I would not be surprised.

Basically what I'm saying is I can see 1 or 2 people leaving due to drama despite the rules if you try this, with a chance greater than, I dunno, 10%?

Comment author: JacekLach 30 May 2017 06:20:50PM 4 points [-]

You're looking at content, not status (as implied by 'knocking someone down a peg'). My immediate reaction to the top-level comment was: "well, they have some good points, but damn are they embarrassing themselves with this language". Possibly shaped by me being generally sceptical about the ideas in the OP.

As far as the bet is about the form of the post, rather than the content, I think Duncan's pretty safe.

Comment author: Viliam 01 June 2017 01:31:35PM *  3 points [-]

"Normal" people don't need to explicitly write out all the rules for their housing with regards to social rules.

I have seen normies having endless fights about trivial things, such as "who should buy toilet paper", that a simple explicit norm could solve. (For example "people keep buying the paper in turns, when you buy one check this box to keep everyone informed" or "Joe buys the paper, everyone else gives Joe $2 each month" or whatever.)

The best case, of course, would be trying to be nice by default, and solve explicitly the situations where the default behavior fails. But that seems like what would quite likely happen in the Dragon Army anyway... or maybe I am just applying the typical mind fallacy here.

Comment author: Lumifer 01 June 2017 03:26:26PM 2 points [-]

I have seen normies having endless fights about trivial things

You should take the Hansonian approach. Fights over toilet paper are not about toilet paper.

Comment author: math_viking 04 June 2017 03:48:22AM *  3 points [-]

I do somewhat agree with your objections to the list of specific skills attained after a year. I had hoped that the large word DRAFT at the top, plus the repeated statements that the whole plan was to iterate, and that I didn't expect to be able to figure out the right stuff on the first try, would've clued you in to the fact that I, too, am aware that the list is inadequate. Do you have specific suggestions for replacements? Keep in mind, the hard problem is to balance things-that-will-be-generally-useful-for-a-medium-sized-group-of-people against the fact that everyone involved has their own specific career and expertise already. Part of the impetus here is social, part of it is becoming well-rounded, part of it is practicing the skill of gaining/improving skills, and all of that is trying to avoid skating into trivial irrelevancy. Got any ideas?

I'm not the originator of this thread, but that part did resonate with me. I don't think there's anything wrong with those skills, but the combination of choice of skills and the desired level of competency does seem to be decidedly mediocre given the effort and people involved.

1) Above-average physical capacity

What is average? In the US, you could probably be somewhat overweight with no strength, speed, endurance, or agility to speak of and still be "above average."

(2) Above-average introspection

I would expect almost all of the people who volunteer to be part of a rationalist group house to be there or pretty close to there already.

(3) Above-average planning & execution skill (4) Above-average communication/facilitation skill (5) Above-average calibration/debiasing/rationality knowledge

I think my previous comment applies here as well. Perhaps you have a different conception of "average" than I do, but I think if you're going to establish a long-term mini-dictatorship of a group house, you should be aiming for quite a bit higher than "above average."

(6) Above-average scientific lab skill/ability to theorize and rigorously investigate claims

I don't really understand this one. Is your group house actually going to have the ability to practice conducting laboratory experiments? That's a very high overhead endeavor.

(7) Average problem-solving/debugging skill (8) Average public speaking skill (9) Average leadership/coordination skill (10) Average teaching and tutoring skill

Average? Your goals are to reach average, after a year of dedicated effort? Getting into the 80th percentile of anything numbered 1-10 on this list should require a minimum of effort on the part of dedicated individuals following strict rules, unless you have some specific medical condition interfering.

(11) Fundamentals of first aid & survival

How fundamental is fundamental? This also shouldn't take very long if you are willing to put in the effort and practice a bit (2 weeks, at the outside, though you could learn the true basics in a long weekend). I don't know how it's related to the rest of the goals, though, or why it's important enough to be on the rest of the list. Also, you should practice many of these skills in the actual wilderness, which means time away from everything else.

(12) Fundamentals of financial management

Again, I'm not sure what's "fundamental." You could spend 2 days on this, or the entire year.

(13) At least one of: fundamentals of programming, graphic design, writing, A/V/animation, or similar (employable mental skill) (14) At least one of: fundamentals of woodworking, electrical engineering, welding, plumbing, or similar (employable trade skill)

Do you have the ability to teach/practice trade skills at the house? I would expect learning any of these things, to an employable level, within a year, would require spending time similar to a full-time job somewhere that has infrastructure, in addition to a significant investment of money (at least a few thousand dollars). (I checked some local welding and plumbing classes at community colleges, which is where I'm getting those numbers.)

Someone who already has one of these skills (I'm guessing you'll have a few coders at least) is going to be at a tremendous advantage in terms of time and possibly money compared to someone who does not. 13 and 14 are each going to represent a greater time investment than the others combined, unless you already have them.

As a meta note, I think that people who cower behind anonymity don't deserve to make concrete claims about their skill sets without backing them up, so until further notice and on a policy level, I'm treating your claim that you meet 11 out of 14 criteria as a flat-out lie (despite its plausibility overall). You're currently nothing and nobody and have no skills; that will change as soon as you a) reveal yourself or b) demonstrate credibility under this pseudonym.

I don't know if you care, but I would say I already meet a similar number of these criteria. The only one I definitely don't meet is 14. I'm willing to tie this account to my real name and explain/prove why I meet them (though some of them would be quite difficult to really prove, I could only argue).

Comment author: Duncan_Sabien 04 June 2017 01:22:56PM 4 points [-]

The problem seems to me to be the tradeoff between going deep and going wide, with the added complexity that going deep on the wrong thing seems strictly worse than going wide, and so we're defaulting to going wide where there's uncertainty.

Put another way, it's unlikely that any of those specific skills are going to be particularly important to any of our longest-term goals, but it also seems counterproductive to just sit there thinking about which direction to go in. I'm usually not the biggest expert in the room, but I usually am the most generally competent in terms of being able to fill holes or solve whatever problem crops up, and it's because I have a habit of just constantly churning and picking up new skills and methods and heuristics wherever I go. I suspect that others would benefit from a similar habit, in particular because once "the right skill" does come along, you have both the affordance to start learning it and a variety of experiences allowing you to learn quickly and efficiently.

That's a claim. Not necessarily supported, but reasonable, I think, and worth trying out.

I note that I disagree that it's easy to break averages in all of these things at once. People who don't actually check their abilities against a standard tend to be wildly overconfident, and people tend to underestimate how long it will take them to learn X or accomplish Y; these things are solidly documented. And while competence does tend to cluster (e.g. "G"), so the picture's not quite as bleak as the second half of this sentence, once you've got a dozen different domains and shooting to be above the 50% mark in all of them, you're looking at a person who's approximating one in four thousand, and when you try to get a whole group to hit that mark, the challenge is pretty real. I wouldn't be surprised if most people have most of this easy, but I think you're not fully grokking the difficulty of making everybody baseline competent in all of these domains. For instance, you note that many of these skills require only a few weeks, but I don't know if you added up all of those weeks, compared them to the time commitment, and noted that they're all being practiced off-hours and people have their own jobs and lives as well.
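The "one in four thousand" figure above follows from a naive back-of-envelope model, which can be checked in a few lines (treating each domain as an independent 50/50 event — an assumption, since real skills are correlated, so the true rarity is lower):

```python
# Back-of-envelope check of the "one in four thousand" claim, under the
# naive assumption that clearing the median in each of 12 domains is an
# independent 50/50 event. Correlated skills ("G") would make the true
# figure less extreme.
p_single = 0.5                  # chance of being above median in one domain
n_domains = 12                  # the dozen domains in question
p_all = p_single ** n_domains   # chance of clearing all twelve at once
print(p_all)                    # 0.000244140625
print(round(1 / p_all))         # 4096 -- roughly "one in four thousand"
```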

It's a floor, though, not a ceiling—we're aiming at "world class skill," we're just not naively expecting that getting there is going to be easy, and initial expectations are meant to be exceeded.

Various additional points ...

- The trade skill goal got scaled back in response to another comment; it was the hardest/sketchiest one to begin with.
- We will have some ability to practice trade skills at the house, and are adopting a norm of going and seeking professional instruction outside from time to time.
- I buy that you meet a large number of these criteria; I meet most of them myself. But the ones I don't have are sticky/tricky.

Comment author: math_viking 04 June 2017 08:48:04PM 1 point [-]

And while competence does tend to cluster (e.g. "G"), so the picture's not quite as bleak as the second half of this sentence, once you've got a dozen different domains and shooting to be above the 50% mark in all of them, you're looking at a person who's approximating one in four thousand,

I don't think these skills are anywhere near independent. It's also not obvious that they're normally distributed. And, being above the 50% mark in a dozen skills by coincidence being unlikely does not at all tell you how hard it is to gain skills if you put in some deliberate work.

I generally am sympathetic to the argument that stuff can be harder than one assumes, but I also am generally cynical about the "average" level of most of these skills. Most people probably don't even know what "calibration" means precisely enough to test their own level of calibration. I'm not trying to be arrogant here; I pretty much have only heard about the idea of writing down your confidence level on a bunch of predictions and seeing what comes true from the rationalist community and rationalist-adjacent ones.
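(For anyone who hasn't seen the exercise: the calibration test being referenced is just "record a confidence alongside each prediction, then compare claimed confidence to observed hit rate per bucket." A minimal sketch, with made-up predictions purely for illustration:)

```python
from collections import defaultdict

# (confidence you wrote down, whether the prediction came true) -- toy data
predictions = [(0.9, True), (0.9, True), (0.9, False),
               (0.7, True), (0.7, False), (0.6, True)]

# Group outcomes by stated confidence level.
buckets = defaultdict(list)
for confidence, came_true in predictions:
    buckets[confidence].append(came_true)

# A calibrated predictor's 90% claims come true about 90% of the time.
for confidence in sorted(buckets):
    outcomes = buckets[confidence]
    hit_rate = sum(outcomes) / len(outcomes)
    print(f"claimed {confidence:.0%}, observed {hit_rate:.0%} "
          f"over {len(outcomes)} predictions")
```

Real calibration testing needs far more than six predictions per bucket, but the mechanics are exactly this simple, which is part of the point: the barrier is that most people have never heard of the exercise, not that it's hard.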

For the sake of avoiding this issue, rather than using terms like "above-average," I would attempt to pin down ahead of time requirements that are as specific as possible to measure progress in each of the areas you care about.

For instance, you note that many of these skills require only a few weeks, but I don't know if you added up all of those weeks, compared them to the time commitment, and noted that they're all being practiced off-hours and people have their own jobs and lives as well.

I don't think it should take a few weeks each to exceed average in most of these skills. I expect it to take a few weeks total (or 1 day a week for a few months).

Comment author: Duncan_Sabien 04 June 2017 11:39:34PM 0 points [-]

I'm plausibly interested in betting a few hundred dollars against you, especially if (as seems likely, given your confidence) you were to bet $1000 against my $250 or something like that. If I imagine the hundred closest people I know uttering the above, I think all but one or two of them are wrong/overconfident.

Comment author: math_viking 05 June 2017 05:10:20AM *  1 point [-]

What statement, specifically, would we be betting on? It's certainly plausible that I'm underestimating the difficulty in getting an entire group to above these standards in comparison to getting one person. Though, I think the main issue may be a difference in what we perceive as average, rather than a model of how hard learning these skills is.

Comment author: Duncan_Sabien 05 June 2017 01:25:13PM 0 points [-]

I spent five minutes trying to operationalize, but I couldn't come up with anything that seemed workable. For now, we'll just proceed knowing that at least one of us is wrong. =)

Comment author: math_viking 05 June 2017 03:48:52PM 1 point [-]

Either way is fine with me, but if you can express in any way what you think "average" is for some of these skills, I would like to know because now I'm really curious.

Thanks for taking so much time to keep responding to a fairly random commenter!

Comment author: ChristianKl 27 May 2017 11:52:21AM 2 points [-]

As a meta note, I think that people who cower behind anonymity don't deserve to make concrete claims about their skill sets without backing them up, so until further notice and on a policy level, I'm treating your claim that you meet 11 out of 14 criteria as a flat-out lie (despite its plausibility overall).

The number of criteria he hits likely depends on the definition of "average." The reference class matters a great deal.

Comment author: [deleted] 26 May 2017 08:57:39PM *  6 points [-]

a

Comment author: Duncan_Sabien 27 May 2017 01:46:26AM 7 points [-]

Oh, dear. This is terrible, and I wish you hadn't posted it, because there's literally no value to be had in delivering this sort of message in this sort of way. Disendorse; I claim this is evidence that most of your arguments about social capability should be somewhat discounted, since they're coming from someone unskilled.

Comment author: Raemon 27 May 2017 03:10:04AM 16 points [-]

I honestly think this person has been engaged with enough, at least until they make the kind of concrete claims you've been asking for. I think it's commendable to have responded with the good mix of "look at their plausibly good points while calling them out on their bad points", but at some point it becomes uncommendable to engage with people who are clearly not arguing in good faith.

Comment author: Duncan_Sabien 27 May 2017 03:24:59AM 5 points [-]

Yeah, I'm done replying at this point. +1 for the outside view check, though—if I weren't already done, I would've appreciated your intervention.

Comment author: drethelin 29 May 2017 09:17:34PM 2 points [-]

I disagree.

Comment author: Duncan_Sabien 29 May 2017 09:19:24PM 3 points [-]

Fair. Care to put forth a model? You don't have to; simply weighing in is also a contribution (just a less useful one).

Comment author: drethelin 29 May 2017 09:34:05PM 8 points [-]

Our ability to concretely describe the effects of social groups on people in general is kind of limited, but things like "person X joined social group Y and now they concretely do behavior Z" are available. If you see people join a group and then become concretely worse (in your own assessment), I think it can be valuable to refer to specifics. I think it can be important and virtuous to convey what you think is a pernicious process, and unfortunately naming someone you personally know is a very effective, if cruel, way to do it. Anecdata, and especially anecdata based on the content of someone's facebook feed, is not a great snapshot of a person at different times, but it's still a source of information.

I'm not sure what you think a better sort of way to deliver this sort of message is, but to some extent any nicer way to do it would be less effective in conveying how bad you think the situation is.

Comment author: Duncan_Sabien 29 May 2017 09:55:41PM 2 points [-]

That seems true and correct to me. I note that my response to this specific comment was ... motivationally entangled? ... with my responses to this person's other comments, and that I was adopting a cross-comment strategy of "try to publicly defend certain norms while engaging with everything else that doesn't violate those norms."

I think it's defensible to say that, in so doing, I lost ... fine-grained resolution? ... on the specific thing being said above, and could've teased out the value that you were able to identify above separate from my defense of a) norms and b) Qiaochu.

Thanks!

Comment author: [deleted] 28 May 2017 05:35:03AM 5 points [-]

Members of the Berkeley rationalist community are particularly low-empathy and embody the worst of individualism, such that they don't actually care whether or not what they're doing might bother others until they're told to stop.

lol

Comment author: grendelkhan 09 June 2017 09:57:06PM 4 points [-]

I strongly support this post.

It would be much better if it were less inflammatory. The last sentence, in particular, is reprehensible. But you respond to the substance of the criticism you get, not the criticism you might want or wish to have at a later time. Otherwise you might as well be slashing your own tires. The vast majority of the discussion below is simple tone policing. Someone's telling you that your house is on fire, and you're complaining that they're shouting.

It's correct that it's incredibly troubling that the author didn't even consider romantic drama in designing his bootcamp. It's correct that these are really not impressive outcomes. They're moderately-functional outcomes. Shouldn't there be some sort of control group where people attempt a similar level of life-changing upward momentum on their own and see if it was actually effective to cede their autonomy? It is correct that trying to LARP a bizarre combination of Ender's Game and Fight Club is perhaps not a sign that this person has any idea how grown-ups work.

And most troubling of all, why weren't these issues noted by anyone who Duncan ran this idea by first? Why does it take this level of willingness to break with social norms to notice the skulls? And no, intoning "I Have Noticed The Skulls" doesn't mean you've actually addressed the problem unless you actually address it. Twelfth virtue!

In a broader sense, what the hell happened? I read the Sequences roughly when they came out, commented here occasionally, moved over to SSC and, more often, the associated subreddit. I donate effectively and regularly, I do my best to tax people's bullshit with bets, and I do feats with spaced repetition. Apparently while I was doing that and not being directly involved in the community, it turned into... this. Scott Alexander is getting published in moderately prestigious outlets. AI risk is mainstream. Effective Altruism is considerably more mainstream than it was. But the community at the center of it has, if anything, regressed, from what I've seen here.

Comment author: Lumifer 09 June 2017 11:00:07PM 0 points [-]

is perhaps not a sign that this person has any idea how grown-ups work.

Maybe it wasn't designed for grown-ups. To quote Duncan,

I'm currently hoping to create a rationality/epistemology/worldsaving bootcamp for middle schoolers.

Comment author: a_different_face 27 May 2017 12:47:19AM *  1 point [-]

despite the efforts of a very valiant man, people have still not realized that autogynephilic men with repressed femininity and a crossdressing fetish pretending to be women aren't actually women

Being only on the periphery of the community, I'm extremely curious who said valiant man is (full disclosure: this is so I can avoid them and/or assess why the community has not yet shunned them, as I would hope they'd shun you).

Comment author: [deleted] 27 May 2017 01:21:15AM *  4 points [-]

a

Comment author: Decius 28 May 2017 06:46:50AM 7 points [-]

For someone who thinks that they are immune to being shunned, you sure do use an anonym.

Comment author: Duncan_Sabien 27 May 2017 01:41:15AM *  4 points [-]

I've had some thoughts and feelings in this vein; skepticism of trans and so forth. I hold that skepticism with skepticism, though, and I do not reach the point of telling the several extremely smart, perceptive, capable, and empathetic trans humans I know that they're e.g. dumb or wrong or sick or confused, when I have no inside view, and I think it's somewhat abhorrent and damaging to the social fabric to start these conversations in any but the most careful and respectful way. That being said, I'd be curious to hear more of the thoughts on the other side of the zeitgeist. If you feel like naming this valiant man in private, I commit to not sharing their name any farther than they themselves say is okay.

Comment author: Zack_M_Davis 27 May 2017 01:39:15PM 10 points [-]

If you feel like naming this valiant man in private, I commit to

Hi! 18239018038528017428 is almost certainly referring to me! (I would have predicted that you'd already have known this from Facebook, but apparently that prediction was wrong.)

somewhat abhorrent and damaging to the social fabric to start these conversations in any but the most careful and respectful way.

I tried that first. It turns out that it doesn't work: any substantive, clearly-worded claims just get adversarially defined as insufficiently respectful. I still had something incredibly important to protect (there is a word for the beautiful feeling at the center of my life, and the word is not woman; I want the right to use my word, and I want the right to do psychology in public and get the right answer), so I started trying other things.

Comment author: tcheasdfjkl 01 June 2017 02:32:26AM 4 points [-]

Zack, I think the problem (from my perspective) is that you tried being respectful in private, and by the time you started talking about this publicly, you were already being really harsh and difficult to talk to. I never got to interact with careful/respectful you on this topic.

(I understand this may have been emotionally necessary/unavoidable for you. But still, from my perspective there was a missing step in your escalation process. Though I should acknowledge that you spurred me to do some reading & writing I would not otherwise have done, and it's not impossible that your harshness jolted me into feeling the need to do that.)

Comment author: Zack_M_Davis 01 June 2017 02:49:07AM 1 point [-]

Yeah, that makes sense. Sorry. Feel free to say more or PM me if you want to try to have a careful-and-respectful discussion now (if you trust me).

Comment author: tcheasdfjkl 01 June 2017 03:21:04AM *  1 point [-]

Thanks. I don't think that would be good for me, at least right now, but thanks for the offer.

My thoughts on the matter are mostly in my ITT entry on Ozy's blog and then also in the most recent thread on this topic on their blog. I guess I'd be somewhat curious about your responses to those thoughts.

Comment author: entirelyuseless 27 May 2017 04:09:29PM 2 points [-]

any substantive, clearly-worded claims just get adversarially defined as insufficiently respectful

I agree. E.g. Scott Alexander has said he will ban people from his blog if they do not speak as if the trans theories were true, even if they believe them to be false. But that doesn't mean it is a good option to be as rude as possible, like 18239018038528017428 above. (Obviously I am not saying that you have adopted this approach either.)

Comment author: komponisto 27 May 2017 09:30:29AM 9 points [-]

I do not reach the point of telling the...humans I know that they're e.g. dumb or wrong or sick or confused

If you'll allow me, I would like to raise a red-flag alert at this sentence. It seems poorly worded at best, and in worse scenarios indicative of some potentially-bad patterns of thought.

Presumably, as a member of a community of aspiring rationalists, not to mention the staff of CFAR, telling the people you know when (you think) they're wrong or confused is, or should be...your daily bread. (It goes without saying that this extends to noticing your own confusion or wrongness, and encouraging others to notice it for you when you don't; the norm, as I understand it, is a cooperative one).

Telling people when they might be sick is (if you'll forgive me) hardly something to sneeze at, either. They might want to visit a doctor. Health is, for understandable reasons, generally considered important. (This includes mental health.)

As for dumb, well, I simply doubt that comes up often enough to make the statement meaningful. Whatever may be said about the rationalist community, it does not appear to draw its membership disproportionately from those of specifically low intelligence. Your acquaintances -- whatever their other characteristics -- probably aren't "dumb", so to tell them they are would simply be to assert a falsehood.

So: may I be so bold as to suggest either a reformulation of the thought you were trying to express, or even a reconsideration of the impulse behind it, in the event that the impulse in question wasn't actually a good one?

Comment author: Fluttershy 28 May 2017 08:16:11PM 3 points [-]

Duncan's original wording here was fine. The phrase "telling the humans I know that they're dumb or wrong or sick or confused" is meant in the sense of "socially punishing them by making claims in a certain way, when those claims could easily be made without having that effect".

To put it another way, my view is that Duncan is trying to refrain from adopting behavior that lumps in values (boo trans people) with claims (trans people disproportionately have certain traits). I think that's a good thing to do for a number of reasons, and have been trying to push the debate in that direction by calling people out (with varying amounts of force) when they have been quick to slip in propositions about values into their claims.

I'm frustrated by your comment, komponisto, since raising a red-flag alert, saying that something is poorly worded at best, and making a large number of more subtle negative implications about what they've written are all ways of socially discouraging someone from doing something. I think that Duncan's comment was fine, I certainly think that he didn't need to apologize for it, and I'm fucking appalled that this conversation as a whole has managed to simultaneously promote slipping value propositions into factual claims, and promote indirectly encouraging social rudeness, and then successfully assert in social reality that a certain type of overtly abrasive value-loaded proposition making is more cooperative and epistemically useful than a more naturally kind style of non-value-loaded proposition making, all without anyone actually saying something about this.

Comment author: komponisto 29 May 2017 05:18:15AM 5 points [-]

Your principal mistake lies here:

"socially punishing them by making claims in a certain way, when those claims could easily be made without having that effect

Putting communication through a filter imposes a cost, which will inevitably tend to discourage communication in the long term. Moreover, the cost is not the same for everyone: for some people "diplomatic" communication comes much more naturally than for others. As I indicate in another comment, this often has to do with their status: the higher someone's status, the less necessary directness becomes, because more people are already preoccupied with mentally modeling them.

I'm frustrated by your comment, komponisto

If we're engaging in disclosures of this sort, I have felt similarly about many a comment of yours, not least the one to which I am replying. In your second paragraph, for example, you engage in passive aggression by deceptively failing to acknowledge that the people you are criticizing would accuse you of the exact same sin you accuse them of (namely, equating "trans people disproportionately have certain traits" and "boo trans people"). That's not a debate I consider myself to be involved in, but I do, increasingly, feel myself to be involved in a meta-dispute about the relative importance of communicative clarity and so-called "niceness", and in that dispute, come down firmly on the side of communicative clarity -- at least as it pertains to this sort of social context.

I read your comment as a tribal cheer for the other, "niceness", side, disingenuously phrased as if I were expected to agree with your underlying assumptions, despite the fact that my comments have strongly implied (and now explicitly state) that I don't.

Comment author: Fluttershy 30 May 2017 05:36:48AM 2 points [-]

Putting communication through a filter imposes a cost, which will inevitably tend to discourage communication in the long term.

As does allowing people to be unduly abrasive. But on top of that, communities where conversations are abrasive attract a lower caliber of person than one where they aren't. Look at what happened to LW.

Moreover, the cost is not the same for everyone

It's fairly common for this cost to go down with practice. Moreover, it seems like there's an incentive gradient at work here; the only way to gauge how costly it is for someone to act decently is to ask them how costly it is to them, and the more costly they claim it to be, the more the balance of discussion will reward them by letting them impose costs on others via nastiness while reaping the rewards of getting to achieve their political and interpersonal goals with that nastiness.

I'm not necessarily claiming that you or any specific person is acting this way; I'm just saying that this incentive gradient exists in this community, and economically rational actors would be expected to follow it.

communicative clarity and so-called "niceness"

That's a horrible framing. Niceness is sometimes important, but what really matters is establishing a set of social norms that incentivize behaviors in a way that leads to the largest positive impact. Sometimes that involves prioritizing communicative clarity (when suggesting that some EA organizations are less effective than previously thought), and sometimes that involves, say, penalizing people for acting on claims they've made to other's emotional resources (reprimanding someone for being rude when that rudeness could have reasonably been expected to hurt someone and was entirely uncalled for). Note that the set of social norms used by normal folks would have gotten both of these cases mostly right, and we tend to get them both mostly wrong.

Comment author: komponisto 30 May 2017 07:00:48AM 7 points [-]

communities where conversations are abrasive attract a lower caliber of person than one where they aren't. Look at what happened to LW.

To whatever extent this is accurate and not just a correlation-causation conversion, this very dynamic is the kind of thing that LW exists (existed) to correct. To yield to it is essentially to give up the entire game.

What it looks like to me is that LW and its associated "institutions" and subcultures are in the process of dissolving and being absorbed into various parts of general society. You are basically endorsing this process, specifically the aspect wherein unique subcultural norms are being overwritten by general societal norms.

The way this comes about is that the high-status members of the subculture eventually become tempted by the prospect of high status in general society, and so in effect "sell out". Unless previously-lower-status members "step up" to take their place (by becoming as interesting as the original leaders were), the subculture dies, either collapsing due to a power vacuum, or simply by being memetically eaten by the general culture as members continue to follow the old leaders into (what looks like) the promised land.

Comment author: Zack_M_Davis 30 May 2017 04:36:45PM 4 points [-]

Moreover, it seems like there's an incentive gradient at work here; the only way to gauge how costly it is for someone to act decently is to ask them how costly it is to them, and the more costly they claim it to be, the more the balance of discussion will reward them by letting them impose costs on others via nastiness while reaping the rewards of getting to achieve their political and interpersonal goals with that nastiness.

I agree that the incentives you describe exist, but the analysis cuts both ways: the more someone claims to have been harmed by allegedly-nasty speech, the more the balance of discussion will reward them by letting them restrict speech while reaping the rewards of getting to achieve their political and interpersonal goals with those speech restrictions.

Interpersonal utility aggregation might not be the right way to think of these kinds of situations. If Alice says a thing even though Bob has told her that the thing is nasty and that Alice is causing immense harm by saying it, Alice's true rejection of Bob's complaint probably isn't, "Yes, I'm inflicting c units of objective emotional harm on others, but modifying my speech at all would entail c+1 units of objective emotional harm to me, therefore the global utilitarian calculus favors my speech." It's probably: "I'm not a utilitarian and I reject your standard of decency."

Comment author: Duncan_Sabien 27 May 2017 04:39:05PM 3 points [-]

This is a fair point. I absolutely do hold as my "daily bread" letting people know when my sense is that they're wrong or confused, but it becomes trickier when you're talking about very LARGE topics that represent a large portion of someone's identity, and I proceed more carefully because of both a) politeness/kindness and b) a greater sense that the other person has probably thought things through.

I don't have the spoons to reformulate the thought right now, but I think your call-out was correct, and if you take it on yourself to moderately steelman the thing I might have been saying, that'll be closer to what I was struggling to express. The impulse behind making the statement in the first place was to try to highlight a valuable distinction between pumping against the zeitgeist/having idiosyncratic thoughts, and just being a total jerk. You can and should try to do the former, and you can and should try to avoid the latter. That was my main point.

Comment author: komponisto 27 May 2017 10:58:47PM 9 points [-]

Here's what it looks like to me, after a bit of reflection: you're in a state where you think a certain proposition P has a chance of being true, which it is considered a violation of social norms to assert (a situation that comes up more often than we would like).

In this sort of situation, I don't think it's necessarily correct to go around loudly asserting, or even mentioning, P. However, I do think it's probably correct to avoid taking it upon oneself to enforce the (epistemically-deleterious) social norm upon those weird contrarians who, for whatever reason, do go around proclaiming P. At least leave that to the people who are confident that P is false. Otherwise, you are doing epistemic anti-work, by systematically un-correlating normative group beliefs from reality.

My sense was that you were sort of doing that above: you were seeking to reproach someone for being loudly contrarian in a direction that, from your perspective (according to what you say), may well be the right one. This is against your and your friends' epistemic interests.

(A friendly reminder, finally, that talk of "being a total jerk" and similar is simply talk about social norms and their enforcement.)

Comment author: Duncan_Sabien 28 May 2017 05:17:22AM 2 points [-]

I was not aiming to do "that above." To the extent that I was/came across that way, I disendorse, and appreciate you providing me the chance to clarify. Your models here sound correct to me in general.

Comment author: Fluttershy 28 May 2017 08:21:05PM 0 points [-]

Your comment was perfectly fine, and you don't need to apologize; see my response to komponisto above for my reasons for saying that. Apologies on my part as there's a strong chance I'll be without internet for several days and likely won't be able to further engage with this topic.

Comment author: ChristianKl 27 May 2017 12:13:02PM 2 points [-]

In most cases calling someone sick when the person suffers from a mental issue isn't the best way to get them to seek professional help for it.

Comment author: komponisto 27 May 2017 10:58:56PM 1 point [-]

What is the best way? It's not like you can trick them into it.

A more serious issue, I would have thought, would be that the "professional help" won't actually be effective.

Comment author: ChristianKl 28 May 2017 11:28:04AM 1 point [-]

If you don't have any specific tools, I would advocate a mix of asking questions to help the other person clarify their thinking and providing information.

"Did you know symptoms X and Y are signs of clinical mental illness Z?" is likely more effective than telling the person "You have mental illness Z."

If the other person doesn't feel judged but can explore the issue in a safe space where they are comfortable working through an ugh-field, it's more likely that they will end up doing what's right afterwards.

Comment author: komponisto 29 May 2017 10:42:51AM 0 points [-]

I don't think "Did you know symptoms X and Y are signs of clinical mental illness Z?" is appreciably different from "You very possibly have mental illness Z", which is the practical way that "You have mental illness Z" would actually be phrased in most contexts where this would be likely to come up.

Nevertheless, your first and third paragraphs seem right.

Comment author: ChristianKl 29 May 2017 01:49:36PM 0 points [-]

In a conversation, you get another reaction if you ask a question that indirectly implies that the other person has a mental illness than if you are direct about it. The phrasing of information matters.

Comment author: a_different_face 27 May 2017 03:01:17AM *  3 points [-]

This is about behavior, not belief.

I have not disputed "autogynephilic men with repressed femininity and a crossdressing fetish pretending to be women aren't actually women", though neither have I affirmed it.

Regardless, I still would not want you, personally, in any community I'm part of, because your behavior is bad. I'm not interested in debating this; obviously we disagree on what acceptable behavior looks like. Whatever; different strokes for different folks - clearly this community is not for you, but also you seem to still be here, for some reason.

And I would still want to know who's going around trying to convince people of that statement, so that I could avoid them (for their proselytizing, not for their beliefs) and/or assess why the community has not yet shunned them. (Obviously you can shun the community while it simultaneously shuns you. These are not mutually exclusive.)

So, again, I still want to know who you're talking about. Who are you talking about?

Comment author: Zack_M_Davis 27 May 2017 12:44:27PM 6 points [-]

Hi! 18239018038528017428 is almost certainly talking about me! My detailed views are probably more nuanced and less objectionable than you might infer from the discussion in this thread? But to help you assess for yourself why "the community" (whatever that is) has not yet shunned me, maybe start with this comment (which also contains links to my new gender blog).

Comment author: a_different_face 27 May 2017 03:07:12PM 1 point [-]

Ah, thanks. Turns out I do know who you are and have already thought about the question of why (and to what extent) the community continues to interact with you to my satisfaction. (And yes, the throwaway's description of you is somewhat misleading, though mostly that's because, from their behavior, I would expect anyone they praise to be terrible without redeeming features).

Comment author: Zack_M_Davis 28 May 2017 01:28:09AM 5 points [-]

have already thought about the question of why (and to what extent) the community continues to interact with you to my satisfaction.

For obvious reasons, I'm extremely curious to hear your analysis if you're willing to share. (Feel free to PM me.)

from their behavior, I would expect anyone they praise to be terrible without redeeming features

I don't think that's a good inference! (See the anti-halo effect and "Are Your Enemies Innately Evil?") Even if you think the throwaway's rudeness and hostility makes them terrible, does it really make sense for guilt-by-association to propagate to anyone the throwaway approves of for any reason?

(from the great-grandparent)

This is about behavior, not belief. [...] (for their proselytizing, not for their beliefs)

I think it would be less cruel and more honest to just advocate for punishing people who believe a claim, rather than to advocate for punishing people who argue for the claim while simultaneously insisting that this isn't a punishment for the belief. What would be the point of restricting speech if the goal isn't to restrict thought?

Comment author: a_different_face 30 May 2017 12:34:10AM 0 points [-]

For obvious reasons, I'm extremely curious to hear your analysis if you're willing to share. (Feel free to PM me.)

Probably this is going to be too blunt, but it's honest, and I'm assuming you'd prefer that:

Basically, because you are psychotic, not an asshole (or at least, afaict, only an asshole as a consequence). And dealing with people who are behaving poorly because of mental issues is a hard problem, especially in a community where so many people have mental issues of one sort or another.

Again, this doesn't mean I disagree with you (and again neither have I claimed to agree). The fact of your psychosis is not obviously prior to your beliefs. But it is very obviously prior to how you have acted on those beliefs. Or at least it is obvious to me, having spent a great deal of time with friends who behave like you've behaved (in public, at any rate; of course you should discount this evidence given that I haven't interacted with you in person, or at least not much).

Even if you think the throwaway's rudeness and hostility makes them terrible, does it really make sense for guilt-by-association to propagate to anyone the throwaway approves of for any reason?

It's evidence, yes.

I think it would be less cruel and more honest to just advocate for punishing people who believe a claim, rather than to advocate for punishing people who argue for the claim while simultaneously insisting that this isn't a punishment for the belief. What would be the point of restricting speech if the goal isn't to restrict thought?

... This is a much larger conversation for another time. If you have not already internalized "just because I believe something is true does not make it socially acceptable for me to go around trying to convince everyone else that it's true", I don't know that I will be able to briefly explain to you why that is the case.

Comment author: Zack_M_Davis 30 May 2017 04:10:18AM 2 points [-]

but it's honest, and I'm assuming you'd prefer that

Yes, thank you!

Basically, because you are psychotic

I definitely went through some psychosis states back in February and April, but I seem to be pretty stably back to my old self now. (For whatever that might be worth!) I have a lot of regrets about this period, but I don't regret most of my public comments.

If you have not already internalized "just because I believe something is true does not make it socially acceptable for me to go around trying to convince everyone else that it's true", I don't know that I will be able to briefly explain to you why that is the case.

Oh, I think I understand why; I'm not that socially retarded. Even so—if there's going to be one goddamned place in the entire goddamned world where people put relatively more emphasis on "arguing for true propositions about human psychology because they're true" and relatively less emphasis on social acceptability, shouldn't it be us? I could believe that there are such things as information hazards—I wouldn't publicize instructions on how to cheaply build a suitcase nuke—but this isn't one of them.

Comment author: a_different_face 30 May 2017 05:14:20AM 1 point [-]

if there's going to be one goddamned place in the entire goddamned world where people put relatively more emphasis on "arguing for true propositions about human psychology because they're true" and relatively less emphasis on social acceptability, shouldn't it be us?

Sure. And we do put relatively more emphasis. But we have not completely and totally thrown away all social convention. Nor should we: much of it exists for good reason.

Comment author: 29f8c80d-235a-47bc-b 28 May 2017 11:35:40PM 2 points [-]

That seems so obviously true that the idea of shunning someone for fighting against people arguing the opposite seems crazy to me. I thought we just used "she" to be polite, not because we believed them to be women in any meaningful sense.

Comment author: a_different_face 30 May 2017 12:36:14AM 3 points [-]

I cannot imagine participating in this community for any length of time and sincerely concluding that the mental state you've described is actually universal.

Comment author: Fluttershy 28 May 2017 07:03:09PM *  3 points [-]

assess why the community has not yet shunned them

Hi! I believe I'm the only person to try shunning them, which happened on Facebook a month ago (since Zack named himself in the comments, see here, and here). The effort more or less blew up in my face: it got a few people to publicly say they were going to exclude me, or try to get others to exclude me from future community events, and was also a large (but not the only) factor in getting me to step down from a leadership position in a project I'm spending about half of my time on. To be fair, there are a couple of places where Zack is less welcome now also (I don't think either of us have been successfully excluded from anything other than privately hosted events we weren't likely to go to anyways), and someone with the viewpoint that shunning him was the wrong thing for me to do also stepped down from an equivalent leadership position in order to maintain a balance. So, I guess we're in a stalemate-like de facto ceasefire, though I'd be happy to pick up the issue again.

I still stand by my response to Zack. It would have been better if I'd been skilled enough to convince him to use a less aggressive tone throughout his writing by being gentler myself; that's an area where I'm still trying to grow. I think that collaborative truthseeking is aided rather than hindered by shunning people who call others "delusional perverts" because of their gender. This is, at least in part, because keeping discussions focused on truthseeking, impact, etc. is easier when there are social incentives (i.e. small social nudges that can later escalate to shunning) in place that disincentivize people from acting in ways that predictably push others into a state where they're hurt enough that they're unable to collaborate with you, such as by calling them delusional perverts. I know that the process of applying said social incentives (i.e. shunning) doesn't look like truthseeking, but it's instrumental to truthseeking (when done with specificity and sensitivity/by people with a well-calibrated set of certain common social skills).

Comment author: Zack_M_Davis 31 May 2017 10:14:50PM 6 points [-]

(Just noticed this.)

a large (but not the only) factor in getting me to step down from a leadership position in a project I'm spending about half of my time on. [...] and someone with the viewpoint that shunning him was the wrong thing for me to do also stepped down from an equivalent leadership position in order to maintain a balance.

I wasn't aware of this, but it seems unfortunate. If successfully ostracizing me isn't going to happen anyway, "both of you step down from something that you previously wanted to do" seems like a worse outcome than "neither of you step down."

(For my own part, while I wouldn't invite you to any parties I host at my house, I have no interest in trying to get other people to exclude you from their events. I consider my goal in this whole affair to be simply making it clear that I don't intend to let social pressure influence my writing—a goal at which I think I've succeeded.)

shunning people who call others "delusional perverts" because of their gender

I hadn't bothered addressing this earlier, because I wanted to emphasize that my true rejection was "I don't negotiate with emotional blackmailers; I'm happy to listen and update on substantive criticism of my writing, but appeal to consequences is not a substantive criticism", but since it is relevant, I really think you've misunderstood the point of that post: try reading the second and third paragraphs again.

What I'm trying to do there is highlight my disapproval of the phenomenon where the perceived emotional valence of language overshadows its literal content. I understand very well that the phrase "delusional pervert" constitutes fighting words in a way that "paraphilic with mistaken views" doesn't, but I'm interested in developing the skill of being able to simultaneously contemplate framings with different ideological/emotional charges, especially including framings that make me and my friends look bad (precisely because those are the ones it's most emotionally tempting to overlook). People who aren't interested in this skill probably shouldn't read my blog, as the trigger warning page explains.

(Seriously, why isn't the trigger warning page good enough for you? It's one thing to say my writing should have a label to protect the sensitive, but it's another thing to say that you don't want my thoughts to exist!)

It would have been better if I'd been skilled enough to convince him to use a less aggressive tone throughout his writing by being gentler myself

Not all goals are achievable by sufficiently-skilled gentle social manipulation. If you can show me an argument that can persuade me to change my behavior given my values, then I'll do so. If no such argument exists, then your skill and gentleness don't matter. (At least, I hope I'm not that hackable!)

Comment author: Elo 28 May 2017 07:16:42PM 2 points [-]

it sounds like something happened and there was some miscommunication and things are not fully healed. Would you like help with that?

Comment author: Fluttershy 30 May 2017 04:54:27AM 0 points [-]

I appreciate your offer to talk things out together! To the extent that I'm feeling bad and would feel better after talking things out, I'm inclined to say that my current feelings are serving a purpose, i.e. to encourage me to keep pressing on this issue whenever doing so is impactful. So I prefer to not be consoled until the root issue has been addressed, though that wouldn't have been at all true of the old version of myself. This algorithm is a bit new to me, and I'm not sure if it'll stick.

Overall, I'm not aware that I've caused the balance of the discussion (i.e. pro immediate abrasive truthseeking vs. pro incentives that encourage later collaborative truthseeking & prosociality) to shift noticeably in either direction, though I might have made it sound like I made less progress than I did, since I was sort of ranting/acting like I was looking for support above.

Comment author: Zack_M_Davis 31 May 2017 10:32:58PM 5 points [-]

encourage me to keep pressing on this issue whenever doing so is impactful. So I prefer to not be consoled until the root issue has been addressed

Is this really a winning move for you? I'm not budging. It doesn't look like you have a coalition that can deny me anything I care about. From my perspective, any activity spreading the message "Zack M. Davis should be shunned because of his writing at http://unremediatedgender.space/" is just free marketing.

Comment author: The_Jaded_One 14 June 2017 07:50:03PM 0 points [-]

someone was accidentally impregnated and then decided not to abort the child, going against what had previously been agreed upon, and proceeded to shamelessly solicit donations from the rationalist community to support her child

They were just doing their part against dysgenics and should be commended.

Comment author: The_Jaded_One 14 June 2017 07:47:28PM 0 points [-]

word is going around that Anna Salamon and Nate Soares are engaging in bizarre conspiratorial planning around some unsubstantiated belief that the world will end in ten years

Sounds interesting, I'd like to hear more about this.