XiXiDu comments on 2011 Survey Results - Less Wrong

94 Post author: Yvain 05 December 2011 10:49AM


Comments (513)



Comment author: XiXiDu 05 December 2011 09:46:13AM *  25 points [-]

This made my trust in the community and my judgement of its average quality go down a LOT...

I expected almost everyone to agree with Eliezer on most important things...

Alicorn (top-poster) doesn't agree with Eliezer about ethics. PhilGoetz (top-poster) doesn't agree with Eliezer. Wei_Dai (top-poster) doesn't agree with Eliezer on AI issues. wedrifid (top-poster) doesn't agree with Eliezer on CEV and the interpretation of some game and decision theoretic thought experiments.

I am pretty sure Yvain doesn't agree with Eliezer on quite a few things too (too lazy to look it up now).

Generally there are a lot of top-notch people who don't agree with Eliezer. Robin Hanson for example. But also others who have read all of the Sequences, like Holden Karnofsky from GiveWell, John Baez or Katja Grace who has been a visiting fellow.

But even Rolf Nelson (a major donor and well-read Bayesian) disagrees about the Amanda Knox trial. Or take Peter Thiel (SI's top donor) who thinks that the Seasteading Institute deserves more money than the Singularity Institute.

Comment author: Armok_GoB 05 December 2011 02:49:18PM 11 points [-]

I am extremely surprised by this, and very confused. This is strange because I technically knew each of those individual examples... I'm not sure what's going on, but I'm sure that whatever it is it's my fault and extremely unflattering to my ability as a rationalist.

How am I supposed to follow my consensus-trusting heuristics when no consensus exists? I'm too lazy to form my own opinions! :p

Comment author: NancyLebovitz 05 December 2011 04:07:30PM 7 points [-]

I just wait, especially considering that which interpretation of QM is correct doesn't have urgent practical consequences.

Comment author: MatthewBaker 05 December 2011 04:28:00PM 0 points [-]

We just learned that neutrinos might be accelerated faster than light in certain circumstances. While this result doesn't give me too much pause, it certainly made me think about the possible practical consequences of successfully understanding quantum mechanics.

Comment author: NancyLebovitz 05 December 2011 04:32:30PM 0 points [-]

Fair enough. A deeper understanding of quantum mechanics would probably have huge practical consequences.

It isn't obvious to me that figuring out whether the MWI is right is an especially good way to improve understanding of QM. My impression from LW is that MWI is important here for looking at ethical consequences.

Comment author: MatthewBaker 05 December 2011 04:34:58PM *  0 points [-]

I share that impression :) Plus it's very fun to think about Everett branches and acausal trade when I pretend we would have a chance against a truly Strong AI in a box.

Comment author: satt 06 December 2011 03:08:09AM 2 points [-]

This is strange because I technically knew each of those individual examples... I'm not sure what's going on,

Sounds like plain old accidental compartmentalization. You didn't join the dots until someone else pointed out they made a line. (Admittedly this is just a description of your surprise and not an explanation, but hopefully slapping a familiar label on it makes it less opaque.)

Comment author: wallowinmaya 05 December 2011 12:26:38PM 6 points [-]

Holden Karnofsky has read all of the Sequences?

Comment author: XiXiDu 05 December 2011 06:39:35PM *  12 points [-]

Holden Karnofsky has read all of the Sequences?

I wrote him an email to make sure. Here is his reply:

I've read a lot of the sequences. Probably the bulk of them. Possibly all of them. I've also looked pretty actively for SIAI-related content directly addressing the concerns I've outlined (including speaking to different people connected with SIAI).

Comment author: beoShaffer 05 December 2011 08:04:27PM 5 points [-]

take Peter Thiel (SI's top donor) who thinks that the Seasteading Institute deserves more money than the Singularity Institute.

IIRC Peter Thiel can't give SIAI more than he currently does without causing some form of tax difficulties, and it has been implied that he would give significantly more if this were not the case.

Comment author: gwern 05 December 2011 08:25:24PM 5 points [-]

Right. I remember the fundraising appeals about this: if Thiel donates too much, SIAI begins to fail the 501(c)(3) requirement that it "receives a substantial part of its income, directly or indirectly, from the general public or from the government. The public support must be fairly broad, not limited to a few individuals or families."