
Comment author: Elo 18 March 2017 09:20:47PM 12 points [-]

History is written by the people who write it down. If you want to change history, write something different down.

Comment author: Zack_M_Davis 18 March 2017 09:26:19PM 5 points [-]

I agree! Indeed, your comment is a response to the something different that I wrote down! If I cared more about correcting this particular historical error, I would do more research and write something more down in a place that would get more views than this Less Wrong Discussion thread. Unfortunately, I'm kind of busy, so the grandparent is all that I bothered with!

Comment author: James_Miller 18 March 2017 07:31:27PM 6 points [-]

"Elsewhere on the internet, another fearsomely intelligent group of thinkers prepared to assault the secular religions of the establishment: the neoreactionaries, also known as #NRx."

"Neoreactionaries appeared quite by accident, growing from debates on LessWrong.com, a community blog set up by Silicon Valley machine intelligence researcher Eliezer Yudkowsky. The purpose of the blog was to explore ways to apply the latest research on cognitive science to overcome human bias, including bias in political thought and philosophy."

"LessWrong urged its community members to think like machines rather than humans. Contributors were encouraged to strip away self-censorship, concern for one’s social standing, concern for other people’s feelings, and any other inhibitors to rational thought. It’s not hard to see how a group of heretical, piety-destroying thinkers emerged from this environment — nor how their rational approach might clash with the feelings-first mentality of much contemporary journalism and even academic writing."

This article currently has 32,760 Facebook shares.

Comment author: Zack_M_Davis 18 March 2017 09:00:05PM 11 points [-]

But, but, this is not historically accurate! I'm sure there's a much greater overlap between Less Wrong readers and Unqualified Reservations readers than you would expect between an arbitrary pairing of blogs, but the explanation for that has to look something like "Yudkowsky and Moldbug both attract a certain type of contrarian nerd, and so you get some links from one community to the other from the few contrarian nerds that are part of both." The causality doesn't flow from us!

Comment author: Z._M._Davis 18 May 2008 01:58:32AM 6 points [-]

Eliezer, I have to second Hopefully, Recovering, et al.: good points (as almost always), but the Science versus Bayescraft rhetoric is a disaster. Lone autodidacts railing against the failings of Mainstream Science are almost always crackpots--the fact that you're probably right doesn't mean you can expect people to ignore that likelihood ratio when deciding whether or not to pay attention to you. "Meaning does not excuse impact!"

Concerning the qualitative vs. quantitative Bayescraft issue: taking qualitative lessons like Conservation of Expected Evidence from probability theory is clearly fruitful, but I wonder if we shouldn't be a little worried about Solomonoff induction. Take the example of Maxwell's equations being a simpler computer program than anger. Even though we have reason to suppose that it's possible in principle to make a computer program simulating anger-in-general--anger runs on brains; brains run on physics; physics is computable (isn't it?)--I wonder if it shouldn't make us a bit nervous that we really have no idea how to even begin writing such a program (modulo that "No One Knows What Science," &c.). The obvious response would be to say that all we need is "just" a computer program that duplicates whatever angry human brains do, but I don't think that counts as a solution if we don't know exactly how to reduce anger-in-general to math. A convincing knockdown of dualism doesn't make the Hard Problem any less confusing.

Maybe all this is properly answered by repeating that the math is out there, whether or not we actually know how to do the calculation. After all, given that there is a program for anger, it would obviously be longer than the one for electromagnetism. Still, I worry about putting too much trust in a formalism that is not just computationally intractable, but that we don't really know how to use, for if anyone really knew in concrete detail how to reduce thought to computation in any but the most trivial of cases, she'd essentially have solved the AGI problem, right?
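
(To make the intuition I'm leaning on explicit--this is just the textbook sketch of the Solomonoff prior, nothing original--a hypothesis h gets its prior weight from the lengths of the programs that output it:

m(h) = \sum_{p \,:\, U(p) = h} 2^{-|p|} \approx 2^{-K(h)},

where U is a universal machine, |p| is the length of program p in bits, and K(h) is the Kolmogorov complexity of h. So if the shortest program computing anger-in-general really is even a few hundred bits longer than the one computing Maxwell's equations, its prior weight is smaller by a factor of two raised to those hundreds--but nothing in the formalism tells us how to find, or even estimate, either program.)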

Or take Pascal's Mugging. If I recall correctly from the discussion at the February meetup, the current best solution to the problem is that given a universe big enough to contain 3^^^^3 minds, the prior probability of any one causal node exerting so much influence is low enough to overcome the vast disutility of the mugger's threat. Eliezer noted that this would imply that you're not allowed to believe the mugger even if she takes you out of the Matrix and shows you the hardware. This seems much like ruling out the mugger's claim a priori--which I guess is the result we "want," but it seems far too convenient.
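
(If I'm reconstructing that argument correctly--this is my own back-of-the-envelope version, not anything anyone at the meetup endorsed verbatim--it amounts to a "leverage penalty" on the prior: the probability that any one causal node gets to determine the fate of N other minds is taken to be at most about 1/N, so the expected number of minds the mugger can credibly threaten stays bounded:

\Pr(\text{one node controls } N \text{ minds}) \lesssim 1/N \quad \Longrightarrow \quad \mathbb{E}[\text{minds harmed}] \lesssim (1/N) \cdot N = 1,

and this holds for N = 3^^^^3 just as well as for N = 10. The uncomfortable part is exactly that a prior this small can't realistically be overcome by any evidence the mugger could show you.)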

Of course, it is possible that I simply don't know enough math to see that everything I just said is actually nonsense. Sorry for the long comment.

Comment author: Zack_M_Davis 16 March 2017 07:50:18PM *  1 point [-]

but the Science versus Bayescraft rhetoric is a disaster.

What's wrong with you? It's true that people who don't already have a reason to pay attention to Eliezer could point to this and say, "Ha! An anti-science crank! We should scorn him and laugh!", and it's true that being on the record saying things that look bad can be instrumentally detrimental towards achieving one's other goals.

But all human progress depends on someone having the guts to just do things that make sense or say things that are true in clear language, even if it looks bad to people whose heads are stuffed with the memetic detritus of the prevailing equilibrium of what everyone else is already doing and saying. Eliezer doesn't need your marketing advice.

But you probably won't understand what I'm talking about for another eight years, ten months.

In response to comment by satt on Am I Really an X?
Comment author: Viliam 10 March 2017 05:51:11PM 0 points [-]

One important difference: The linked article is a description of its author's experience. This article proposes a general explanation.

When someone provides a personal data point, as long as I don't suspect that person of lying, I have no reason to disagree. (Unless the person would conclude "everyone else is just like me", which would be the sin of generalisation from one example.)

Here, Gram_Stone provides one hypothesis, Zack_M_Davis provides another... and there are many people who believe themselves to be experts on the topic and support one or the other... unless of course they merely want to support their tribe. None of these dozens of experts provides a scientific reference for their side; apparently doing so is superfluous because the matter is settled.

Historically, we also had the downvote button.

In response to comment by Viliam on Am I Really an X?
Comment author: Zack_M_Davis 14 March 2017 11:05:26PM *  1 point [-]

Zack_M_Davis provides another... [...] None of these dozens of experts provides a scientific reference for their side

You probably missed it (last paragraph of this comment), but I did in fact reference a blog FAQ and a book (official website, PDF that someone put online, probably in defiance of copyright law). These are both secondary sources, but with plenty of citations back to the original studies in the psychology literature (some of which I've read myself; I don't recall noticing anything being dishonestly cited to claim something that it didn't say).

Comment author: Daniel_Burfoot 10 March 2017 02:40:36AM *  1 point [-]

positive-sum information-conveying component and a zero-sum social-control/memetic-warfare component.

Style complaint: did you really need to use five hyphenated words in one line in the first sentence?

Comment author: Zack_M_Davis 10 March 2017 04:30:03AM 2 points [-]

Yes.

[Link] An Intuition on the Bayes-Structural Justification for Free Speech Norms

5 Zack_M_Davis 09 March 2017 03:15AM
Comment author: gjm 07 March 2017 03:22:50AM 2 points [-]

If most trans women disagree, then that seems to me like a pretty good reason to doubt the model.

(Not conclusive, of course; people can be wrong about themselves, which I think is the correct version of the overstatement "psychology is about invalidating people's identities".)

explains so much of what I see in myself and what I see in other people [...]

So, you are uninterested in everyone else's anecdotal evidence because it contradicts your own anecdotal evidence? :-)

Less uncharitably: perhaps I'm misunderstanding something here, but it seems to me that we need to distinguish between a weaker claim (W): "There are trans women who fit well into each of the categories in Blanchard's typology" and a stronger claim (S): "Blanchard's typology gives a correct description of essentially all trans women". (Where for "Blanchard's typology" you should feel free to substitute some more accurate term that reflects more recent work along similar lines.)

If what you've seen in yourself and those around you fits well (or at least seems to) with Blanchard's typology, that's good evidence for W, but it's not such good evidence for S. If other people say it badly fails to fit their experience, that's good evidence against S, but not much evidence against W. It seems like the obvious conclusion would be that probably W is right and S is wrong; that there are plenty of people who fit into the typology but also plenty who don't.
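
(To put the same point in crude likelihood-ratio terms--a toy sketch, nothing quantitative: let E1 be "the people I happen to have observed seem to fit the typology" and E2 be "other people report that it badly fails to fit their experience". Then roughly \Pr(E1 | W) \approx \Pr(E1 | S), so E1 barely discriminates between W and S, whereas \Pr(E2 | W) / \Pr(E2 | S) \gg 1, so E2 counts substantially against S relative to W.)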

Am I missing something important?

In response to comment by gjm on Am I Really an X?
Comment author: Zack_M_Davis 07 March 2017 04:15:38AM 7 points [-]

Am I missing something important?

Yes; people in general are really really shockingly bad at self-reporting. People don't know why they do things; they just notice themselves doing things and then tell a self-serving story about why they did the right things.

For example, prominent trans activist (and autogynephilia theory critic) Julia Serano writes (Whipping Girl, p. 84):

There was also a period of time when I embraced the word "pervert" and viewed my desire to be female as some sort of sexual kink. But after exploring that path, it became obvious that explanation could not account for the vast majority of instances when I thought about being female in a nonsexual context.

I trust that Julia Serano is telling the truth about her subjective experiences. But "it became obvious that explanation could not account for" is not an experience. It's a hypothesis about human psychology. I don't expect anyone to get that kind of thing right based on introspection alone!

Again, it's very important to emphasize that I'm not saying that non-exclusively-androphilic trans women who deny autogynephilia are particularly delusional. I'm saying that basically everyone is basically that delusional about basically everything!

Comment author: Gram_Stone 07 March 2017 12:58:50AM 0 points [-]

I was familiar with this.

I find the first etiology similar to my model. Did you mean to imply this similarity by use of the word 'indeed'? I can see how one might interpret my model as an algorithm that outputs a little 'gender token' black box that directly causes the self-reports, but I really didn't mean to propose anything besides "Once gendered behavior has been determined, however that occurs, cisgender males don't say 'I'm a boy!' for cognitive reasons that are substantially different from the reasons that transgender males say 'I'm a boy!'" Writing things like "behaviorally-masculine girls" just sounds like paraphrase to me. Should it not? On the other hand, as I understand it, the second etiology substantially departs from this. In that case it is proposed that transgender people who transition later in life perform similar behaviors for entirely different cognitive reasons.

I'll reiterate that I admit to the plausibility of other causes of self-report. I do find your confidence surprising, however. I realize the visible controversy is not much evidence that you're wrong, because we would expect controversy either way. Do you have thoughts on Thoughts on The Blanchard/Bailey Distinction? I'd just like to read them if they exist.

Comment author: Zack_M_Davis 07 March 2017 03:27:15AM 8 points [-]

I was familiar with this.

Yup. This is a case (I can think of one more, but I'll let that be someone else's crusade) where we have the correct theory in the psychology literature, and all the nice smart socially-liberal people have heard of the theory, but they think to themselves, "Oh, but only bad outgroup people could believe something crazy like that; it's just some guy's theory; it probably isn't actually true."

Surprise! Everyone is lying! Everyone is lying because telling the truth would be politically inconvenient!

I find the first etiology similar to my model. [...] Writing things like "behaviorally-masculine girls" just sounds like paraphrase to me.

Similar, but the key difference is that I claim that there's no atomic "identity": whether a very behaviorally-masculine girl grows up to identify as a "butch lesbian" or "trans man" is mostly going to depend on the details of her/his cultural environment and the incentives she/he faces.

Do you have thoughts on Thoughts on The Blanchard/Bailey Distinction?

My reply.

I do find your confidence surprising

I agree that I probably look insanely confident! What's going on here—

(personal explanation of why I'm investing so much effort into being such an asshole screaming bloody murder about this at every possible opportunity follows; if you're just interested in the science, read the science; don't pay attention to me)

—is that I spent ten years having (mild, manageable) gender problems, all the while thinking, "Oh, but this is just autogynephilia; that can't be the same as actually being trans, because every time I use that word in public, everyone says 'That's completely unsupported transphobic nonsense', so I must just be some weird non-trans outlier; oh, well."

... and then, I moved to Berkeley. I met a lot of trans women, who seem a lot like me along many dimensions. People who noticed the pattern started to tell me that they thought I was probably trans.

And I was like, "I agree that it seems plausible that I have a similar underlying psychological condition, and I'm definitely very jealous of all of our friends who get their own breasts and get referred to as she, but my thing looks pretty obviously related to my paraphilic sexuality and it's not at all obvious to me that full-time social transition is the best quality-of-life intervention when you take into account the serious costs and limitations of the existing technology. After all, I am biologically male and have received male socialization and you can use these facts to make probabilistic inferences about my psychology; I don't expect anyone to pretend not to notice. If some actual biologically-female women don't want people like me in their spaces, that seems like a legitimate desire that I want to respect, even if other people make different choices."

And a lot of people are like, "Oh, that's just internalized transphobia; you're obviously a trans woman; we already know that transitioning is the correct quality-of-life intervention. Don't worry about invading women's spaces; Society has decided that you have a right to be a woman if you want."

And I'm like, "Okay, it's certainly possible that you're right about the optimal social conventions and quality-of-life interventions surrounding late-onset gender dysphoria in males, but how do you know? Where is the careful cost-benefit calculation that people are using to make these enormous life- and society-altering decisions?"

And no one knows. No one is in control. It's all just memetics and primate signaling games, just like Robin Hanson was trying to tell me for the past ten years, and I verbally agreed, but I didn't see it.

I trusted the Berkeley rationalist community. I trusted that social reality mostly made sense. I was wrong.

I still want to at least experiment with the same drugs everyone else is on. But I have no trust anymore.

Comment author: gjm 07 March 2017 03:10:47AM 2 points [-]

you should downvote it

Downvoting has been disabled for some time.

Comment author: Zack_M_Davis 07 March 2017 03:13:37AM 1 point [-]

Huh, so it is! I've been away for a while!

Comment author: Lumifer 07 March 2017 02:04:11AM 1 point [-]

it's a really valuable (if expensive and painful) exercise ... you might learn something

Those two thoughts don't seem to match well. If you think it's valuable enough to be worth the expense and the pain, presumably you have a better description of the potential payoff than "learn something"?

Comment author: Zack_M_Davis 07 March 2017 02:20:33AM 2 points [-]

The payoff is the shock of, "Wait! A lot of the customs and ideas that my ingroup thinks are obviously inherently good, aren't actually what I want now that I understand more about the world, and I predict that my ingroup friends would substantially agree if they knew what I knew, but I can't just tell them, because from their perspective it probably just looks like I suddenly went crazy!"

I know, that's still vague. The reason I'm being vague is that the details are going to depend on your ingroup, and which hated outgroup's body of knowledge you chose to study. Sorry about this.
