RaelwayScot comments on Open Thread, January 4-10, 2016 - Less Wrong Discussion

Post author: polymathwannabe 04 January 2016 01:06PM 5 points


Comment author: RaelwayScot 06 January 2016 11:17:19AM 5 points [-]

Why does E. Yudkowsky voice such strong priors, e.g. with respect to the laws of physics (the many-worlds interpretation), when much weaker priors seem sufficient for most of his beliefs (e.g. weak computationalism/computational monism) and wouldn't make him so vulnerable? (By "vulnerable" I mean that his work often gets ripped apart as cultish pseudoscience.)

Comment author: Viliam 07 January 2016 11:49:27AM *  7 points [-]

You seem to assume that MWI makes the Sequences more vulnerable; i.e. that there are people who feel okay with the rest of the Sequences, but MWI makes them dismiss it as pseudoscience.

I think there are other things that rub people the wrong way (that EY in general talks about some topics more than is appropriate for his status, whether it's science, philosophy, politics, or religion), and MWI is merely the most convenient point of attack (at least among those people who don't care about religion). Without MWI, something else would be "the most controversial topic which EY should not have added because it antagonizes people for no good reason", and people would speculate about the dark reasons that made EY write about it.

For context, I will quote the part that Yvain quoted from the Sequences:

Everyone should be aware that, even though I’m not going to discuss the issue at first, there is a sizable community of scientists who dispute the realist perspective on QM. Myself, I don’t think it’s worth figuring both ways; I’m a pure realist, for reasons that will become apparent. But if you read my introduction, you are getting my view. It is not only my view. It is probably the majority view among theoretical physicists, if that counts for anything (though I will argue the matter separately from opinion polls). Still, it is not the only view that exists in the modern physics community. I do not feel obliged to present the other views right away, but I feel obliged to warn my readers that there are other views, which I will not be presenting during the initial stages of the introduction.

Everyone, please form your own opinion about whether this is how cult leaders usually speak (because that seems to be the undertone of some comments in this thread).

Comment author: Kaj_Sotala 09 January 2016 01:20:00PM *  6 points [-]

My model of him is that his attitude is "if I think there's a reason to be highly confident of X, then I'm not going to hide what's true just for the sake of playing social games".

Comment author: IlyaShpitser 06 January 2016 04:54:21PM *  7 points [-]

Because he was building a tribe. (He's done now).


edit: This should actually worry people a lot more than it seems to.

Comment author: knb 06 January 2016 09:52:24PM 2 points [-]

I think LW users are skewed toward believing in MWI because they've all read Yudkowsky. It really doesn't seem likely that Yudkowsky gleaned that MWI was already popular and wrote about it to pander to the tribe. In any case, I don't really see why MWI would be a salient point for group identity.

Comment author: IlyaShpitser 07 January 2016 12:03:42AM *  2 points [-]

That's not what I am saying. People didn't write the Nicene Creed to pander to Christians. (Sorry about the affective side effects of that comparison; that wasn't my intention, it was just the first example that came to mind.)

MWI is perfect for group identity -- it's safely beyond falsification, and QM interpretations are a sufficiently obscure topic that folks typically haven't thought a lot about them. So you don't get a lot of noise in the marker.

But I am not trying to make MWI into more than it is. I don't think MWI is a centrally important idea; it's mostly an illustration of what I think is going on (along with some other ideas).

Comment author: Lumifer 06 January 2016 05:13:44PM *  2 points [-]

This should actually worry people a lot more

Why?

Comment author: IlyaShpitser 06 January 2016 05:18:18PM 3 points [-]

Consider that if stuff someone says resonates with you, that someone is optimizing for that.

Comment author: Lumifer 06 January 2016 05:27:37PM *  2 points [-]

There are two quite different scenarios here.

In scenario 1, that someone knows me beforehand and optimizes what he says to influence me.

In scenario 2, that someone doesn't know who will respond, but is optimizing his message to attract specific kinds of people.

The former scenario is a bit worrisome -- it's manipulation. But the latter one looks fairly benign to me -- how else would you attract people with a particular set of features? Of course the message is, in some sense, bait, but unless it's poisoned, that shouldn't be a big problem.

Comment author: IlyaShpitser 06 January 2016 05:30:21PM 0 points [-]

MIRI survives in part via donations from people who bought the party line on stuff like MWI.

Comment author: Lumifer 06 January 2016 05:44:40PM *  2 points [-]

A fair point. Maybe I'm committing the typical mind fallacy and underestimating the general gullibility of people. If someone offers you something, it's obvious to me that you should look for strings, consider the incentives of the giver, and ponder the consequences (including those concerning your mind). If you don't understand why something is being given to you, it's probably wise to delay grabbing the cheese (or not touch it at all) until you do.

And still this all looks to me like a plain-vanilla example of bootstrapping an organization and creating a base of support, financial and otherwise, for it. Unless you think there were lies, misdirections, or particularly egregious sins of omission, that's just how the world operates.

Comment author: RichardKennaway 07 January 2016 08:30:10AM 1 point [-]

Also, anyone who succeeds in attracting people to an enterprise, be it by the most impeccable of means, will find the people they have assembled creating tribal markers anyway. The leader doesn't have to give out funny hats. People will invent their own.

Comment author: IlyaShpitser 07 January 2016 04:07:40PM *  1 point [-]

People do a lot of things. Have biases, for example.

There is quite a bit of our evolutionary legacy it would be wise to deemphasize. It's not as though there aren't successful examples of people doing good work together without being a tribe.


edit: I think what's going on is that a lot of the rationalist tribe folks are on the spectrum and/or "nerdy", and thus have a more difficult time forming communities; LW etc. was a great way for them to get something important into their lives. They find it valuable, and rightly so. They don't want to give it up.

I am sympathetic to this, but I think it would be wise to separate the community aspects from rationality itself as "serious business." For example, I am friends with lots of academics, but the academic part of our relationship has to be kept separate (I would rip into their papers in peer review, etc.). The guru/disciple dynamic, I think, is super unhealthy.

Comment author: ChristianKl 06 January 2016 05:52:03PM 2 points [-]

MIRI survives in part via donations from people who bought the party line on stuff like MWI.

Are you saying that based on actual data? I think we should have a census that includes numbers on MIRI donations and belief in MWI.

Comment author: Vaniver 06 January 2016 09:24:25PM 1 point [-]

Really, you would want the delta in MWI belief (relative to before they found LW) to measure "bought the party line."

Comment author: IlyaShpitser 07 January 2016 12:05:40AM 1 point [-]

I am not trying to emphasize MWI specifically, it's the whole set of tribal markers together.

Comment author: bogus 07 January 2016 04:07:54AM *  2 points [-]

If there is a tribal marker, it's not MWI per se; it's choosing an interpretation of QM on grounds of explanatory parsimony. Eliezer clearly believed that MWI is the only interpretation of QM that qualifies on such grounds. However, such a belief is quite simply misguided; it ignores several other formulations, e.g. relational quantum mechanics, the ensemble interpretation, and the transactional interpretation, that are also remarkable for their overall parsimony. Someone who advocated for one of these other approaches would be just as recognizable as a member of the rationalist 'tribe'.

Comment author: Dagon 03 February 2016 05:59:10PM 0 points [-]

I don't know why scenario 2 should be any less worrisome. The distinction between "optimized for some perception/subset of you" and "optimized for someone like you" is completely meaningless.

Comment author: Lumifer 03 February 2016 06:08:50PM 0 points [-]

Because of degree of focus. It's like the distinction between a black-hat scanning the entire 'net for vulnerabilities and a black-hat scanning specifically your system for vulnerabilities. Are the two equally worrisome?

Comment author: Dagon 04 February 2016 01:33:56AM 0 points [-]

Equally worrisome, conditional on my having the vulnerability the black-hat is trying to use. This is equivalent to the original warning being conditional on something resonating with you.

Comment author: Clarity 07 January 2016 05:45:57AM 0 points [-]

Because warning against dark-side rationality by using dark-side rationality to find light-side rationalists doesn't look good, given the perennial c-word claims against LW...

Comment author: Clarity 07 January 2016 05:41:59AM *  0 points [-]

Consequentialist ethic

Comment author: ChristianKl 06 January 2016 11:48:44AM 2 points [-]

Given the way the internet works, bloggers who don't take strong stances don't get traffic. If Yudkowsky hadn't taken positions confidently, it's likely that he wouldn't have founded LW as we know it.

Shying away from strong positions for the sake of not wanting to be vulnerable is not a good strategy.

Comment author: username2 06 January 2016 12:30:12PM 0 points [-]

I don't agree with this reasoning. Why not write clickbait, then, if the goal is to drive traffic?

Comment author: ChristianKl 06 January 2016 01:36:25PM *  2 points [-]

I don't think the goal is just to drive traffic; it's also to have an impact on the person who reads the article. If you want a deeper look at the strategy, Nassim Taleb is quite explicit about the principle in Antifragile.

I don't think that Eliezer's public and private beliefs differ on the issues that RaelwayScot mentioned. A counterfactual world where Eliezer was less vocal about his beliefs wouldn't have ended up with LW as we know it.

Comment author: Clarity 07 January 2016 05:51:41AM 1 point [-]

It's a balancing act.

Comment author: hairyfigment 06 January 2016 07:19:00PM 0 points [-]

Actually, I can probably answer this without knowing exactly what you mean: the notion of improved Solomonoff Induction that gets him many-worlds seems like an important concept for his work with MIRI.

I don't know where "his work often gets ripped apart" for that reason, but I suspect they'd object to the idea of improved/naturalized SI as well.

Comment author: IlyaShpitser 06 January 2016 09:24:19PM *  3 points [-]

His work doesn't get "ripped apart" because he doesn't write it up or submit it for peer review.

Comment author: Clarity 07 January 2016 05:52:08AM 0 points [-]

inductive bias

Comment author: hairyfigment 06 January 2016 06:46:11PM 0 points [-]

The Hell do you mean by "computational monism" if you think it could be a "weaker prior"?