Edit, May 21, 2012: Read this comment by Yvain.

Forming your own opinion is no more necessary than building your own furniture.

- Peter de Blanc

There's been a lot of talk here lately about how we need better contrarians. I don't agree. I think the Sequences got everything right and I agree with them completely. (This of course makes me a deranged, non-thinking, Eliezer-worshiping fanatic for whom the singularity is a substitute religion. Now that I have admitted this, you don't have to point it out a dozen times in the comments.) Even the controversial things, like:

  • I think the many-worlds interpretation of quantum mechanics is the closest to correct and you're dreaming if you think the true answer will have no splitting (or I simply do not know enough physics to know why Eliezer is wrong, which I think is pretty unlikely but not totally discountable).
  • I think cryonics is a swell idea and an obvious thing to sign up for if you value staying alive and have enough money and can tolerate the social costs.
  • I think mainstream science is too slow and we mere mortals can do better with Bayes.
  • I am a utilitarian consequentialist and think that if you allow someone to die through inaction, you're just as culpable as a murderer.
    • I completely accept the conclusion that it is worse to put dust specks in 3^^^3 people's eyes than to torture one person for fifty years. I came up with it independently, so maybe it doesn't count; whatever.
  • I tentatively accept Eliezer's metaethics, considering how unlikely it is that there will be a better one (maybe morality is in the gluons?).
  • "People are crazy, the world is mad," is sufficient for explaining most human failure, even to curious people, so long as they know the heuristics and biases literature.
  • Edit, May 27, 2012: You know what? I forgot one: Gödel, Escher, Bach is the best.

There are two tiny notes of discord on which I disagree with Eliezer Yudkowsky. One is that I'm not so sure as he is that a rationalist is only made when a person breaks with the world and starts seeing everybody else as crazy, and the other is that I don't share his objection to creating conscious entities in the form of an FAI or within an FAI. I could explain, but no one ever discusses these things, and they don't affect any important conclusions. I also think the sequences are badly organized and you should just read them chronologically instead of trying to lump them into categories and sub-categories, but I digress.

Furthermore, I agree with every essay I've ever read by Yvain, I use "believe whatever gwern believes" as a heuristic/algorithm for generating true beliefs, and don't disagree with anything I've ever seen written by Vladimir Nesov, Kaj Sotala, Luke Muehlhauser, komponisto, or even Wei Dai; policy debates should not appear one-sided, so it's good that they don't.

I write this because I'm feeling more and more lonely, in this regard. If you also stand by the sequences, feel free to say that. If you don't, feel free to say that too, but please don't substantiate it. I don't want this thread to be a low-level rehash of tired debates, though it will surely have some of that in spite of my sincerest wishes.

Holden Karnofsky said:

I believe I have read the vast majority of the Sequences, including the AI-foom debate, and that this content - while interesting and enjoyable - does not have much relevance for the arguments I've made.

I can't understand this. How could the sequences not be relevant? Half of them were created when Eliezer was thinking about AI problems.

So I say this, hoping others will as well:
I stand by the sequences.

And with that, I tap out. I have found the answer, so I am leaving the conversation.

Even though I am not important here, I don't want you to interpret my silence from now on as indicating compliance.

After some degree of thought and nearly 200 comment replies on this article, I regret writing it. I was insufficiently careful, didn't think enough about how it might alter the social dynamics here, and didn't spend enough time clarifying, especially regarding the third bullet point. I also dearly hope that I have not entrenched anyone's positions, turning them into allied soldiers to be defended, especially not my own. I'm sorry.

Comments (250)

and don't disagree with anything I've ever seen written by Vladimir Nesov, Kaj Sotala, Luke Muehlhauser, komponisto, or even Wei Dai

This confuses me since these people are not in agreement on some issues.

This confuses me as well, especially since I was a major contributor to "talk here lately about how we need better contrarians" which the OP specifically disagreed with.

Where you and those other fellows disagree is typically on policy questions, where I tend not to have any strong opinions at all. (Thus, "don't disagree".) If you will point to a specific example where you and one of those other fellows explicitly disagree on a factual question (or a metaethical question, if you don't consider that a subset of factual questions), I will edit my comment.

Addendum: I agree more with you than Eliezer about what to do, especially re: Some Thoughts and Wanted: Backup Plans.

There's a comparison in the back of my mind that may be pattern-matching to religious behavior too closely: when one talks to some Orthodox Jews, some of them will claim that they believe everything that some set of major historical rabbis said (say, Maimonides and Rashi), when one can easily point to issues where those rabbis disagreed. Moreover, they often use examples like Maimonides or Ibn Ezra, who had positions that are in fact considered outright heretical by much of Orthodox Judaism today. I've seen a similar result with Catholics and their theologians. In both cases, the more educated members of the religion seem less inclined to do so, but even the educated members sometimes make such claims, albeit with interesting doublethink to justify how the positions really aren't contradictory.

What may be going on in those cases is that saying "I agree with this list of people and have never seen them as wrong" is a statement of tribal affiliation: a declaration of agreement with various high-status people in the tribe. It is possible that something similar is happening in this context. Alternatively, it may just indicate that Grognor hasn't read that much by you or by some of the other people on the list.

Randaly (12y, +8)
I believe that by "disagree," Grognor meant to say that he could not refute or disprove anything they've ever written.

"Think for yourself" sounds vaguely reasonable only because of the abominable incompetence of those tasked with thinking for us.

-- Steven Kaas

I write this because I'm feeling more and more lonely, in this regard. If you also stand by the sequences, feel free to say that. If you don't, feel free to say that too, but please don't substantiate it. I don't want this thread to be a low-level rehash of tired debates, though it will surely have some of that in spite of my sincerest wishes.

Why must we stand by or stand away from? Personally, I lean towards the Sequences. Do you really need to feel lonely unless others affirm every single doctrine?

I think the many-worlds interpretation of quantum mechanics is the closest to correct and you're dreaming if you think the true answer will have no splitting (or I simply do not know enough physics to know why Eliezer is wrong, which I think is pretty unlikely but not totally discountable).

I accept the MWI of QM as "empirically adequate"; no more, no less.

I think cryonics is a swell idea and an obvious thing to sign up for if you value staying alive and have enough money and can tolerate the social costs.

Cryonics is interesting and worth considering, but the probabilities involved are so low that it is not at all obvious it is a net win after factoring in signalling…

You seem to mostly disagree in spirit with all Grognor's points but the last, though on that point you didn't share your impression of the H&B literature.

I'll chime in and say that at some point about two years ago I would have more or less agreed with all six points. These days I disagree in spirit with all six points and with the approach to rationality that they represent. I've learned a lot in the meantime, and various people, including Anna Salamon, have said that I seem like I've gained fifteen or twenty IQ points. I've read all of Eliezer's posts maybe three times over and I've read many of the cited papers and a few books, so my disagreement likely doesn't stem from not having sufficiently appreciated Eliezer's sundry cases. Many times when I studied the issues myself and looked at a broader set of opinions in the literature, or looked for justifications of the unstated assumptions I found, I came away feeling stupid for having been confident of Eliezer's position: often Eliezer had very much overstated the case for his positions, and very much ignored or fought straw men of alternative positions.

His arguments and their distorted echoes lead one to think that various p…

Emile (12y, +9)
Does anybody actually dispute that? For what it's worth, I don't hold that position, and it seems much more prevalent in atheist forums than on LessWrong.
JoshuaZ (12y, +7)
Is it less prevalent here or is it simply less vocal because people here aren't spending their time on that particularly tribal demonstration? After all, when you've got Bayesianism, AI risk, and cognitive biases, you have a lot more effective methods of signaling allegiance to this narrow crowd.
Eugine_Nier (12y, +2)
Well, we have openly religious members of our 'tribe'.
JoshuaZ (12y, +8)
Clear minority, and most comments defending such views are voted down. With the exception of Will, no one in that category is what would probably be classified as high status here, and even Will's status is... complicated.
Will_Newsome (12y, +3)
Also I'm not religious in the seemingly relevant sense.
Eugine_Nier (12y, +2)
Well, this post is currently at +6.
Will_Newsome (12y, +6)
Depends on what connotations are implied. There are certainly people who dispute, e.g., the (practical relevance of the) H&B results on confirmation bias, overconfidence, and so on that LessWrong often brings up in support of the "the world is mad" narrative. There are also people like Chesterton who placed much faith in the common sense of the average man. But anyway I think the rest of the sentence needs to be included to give that fragment proper context. Granted.
CuSithBell (12y, +4)
Could you point towards some good, coherent arguments for supernatural phenomena or the like?
Will_Newsome (12y, 0)
Analyzing the sun miracle at Fatima seems to be a good starting point. This post has been linked from LessWrong before. Not an argument for the supernatural, but a nexus for arguments: it shows what needs to be explained, by whatever means. Also worth keeping in mind is the "capricious psi" hypothesis, reasonably well explicated by J. E. Kennedy in a few papers and essays. Kennedy's experience is mostly in parapsychology. He has many indicators in favor of his credibility: he has a good understanding of the relevant statistics, he exposed some fraud going on in a lab where he was working, he doesn't try to hide that psi, if it exists, would seem to have weird and seemingly unlikely properties, et cetera. But I don't know of any arguments that really go meta and take into account how the game theory and psychology of credibility might be expected to affect the debate, e.g., emotional reactions to people who look like they're trying to play psi-of-the-gaps, both sides' frustration with incommunicable evidence or even the concept of incommunicable evidence, and things like that.
CuSithBell (12y, +9)
Hm. This... doesn't seem particularly convincing. So it sounds like whatever convinced you is incommunicable - something that you know would be unconvincing to anyone else, but which is still enough to convince you despite knowing the alternate conclusions others would come to if informed of it?
Will_Newsome (12y, +6)
Agreed. The actually-written-up-somewhere arguments that I know of can at most move supernaturalism from "only crazy or overly impressionable people would treat it as a live hypothesis" to "otherwise reasonable people who don't obviously appear to have a bottom line could defensibly treat it as a Jamesian live hypothesis". There are arguments that could easily be made that would fix specific failure modes, e.g. some LW folk (including I think Eliezer and lukeprog) mistakenly believe that algorithmic probability theory implies a low prior for supernaturalism, and Randi-style skeptics seem to like fully general explanations/counterarguments too much. But once those basic hurdles are overcome there still seems to be a wide spread of defensible probabilities for supernaturalism based off of solely communicable evidence. Essentially, yes.
steven0461 (12y, +3)
Is the point here that supernatural entities that would be too complex to specify into the universe from scratch may have been produced through some indirect process logically prior to the physics we know, sort of like humans were produced by evolution? Or is it something different?
Will_Newsome (12y, -2)
Alien superintelligences are less speculative and emerge naturally from a simple universe program. More fundamentally the notion of simplicity that Eliezer and Luke are using is entirely based off of their assessments of which kinds of hypotheses have historically been more or less fruitful. Coming up with a notion of "simplicity" after the fact based on past observations is coding theory and has nothing to do with the universal prior, which mortals simply don't have access to. Arguments should be about evidence, not "priors".
siodine (12y, 0)
... It isn't technically a universal prior, but it counts as evidence because it's historically fruitful. That leaves you with a nitpick rather than showing "LW folk (including I think Eliezer and lukeprog) mistakenly believe that algorithmic probability theory implies a low prior for supernaturalism."
Will_Newsome (12y, 0)
I don't think it's nitpicking as such to point out that the probability of supernaturalism is unrelated to algorithmic probability. Bringing in Kolmogorov complexity is needlessly confusing, and even Bayesian probability isn't necessary because all we're really concerned with is the likelihood ratio. The error I want to discourage is bringing in confusing uncomputable mathematics for no reason and then asserting that said mathematics somehow justify a position one holds for what are actually entirely unrelated reasons. Such errors harm group epistemology.
siodine (12y, +1)
I don't see how you've done that. If KC isn't technically a universal prior (just as "objective" isn't technically objective but intersubjective), you can still use KC as evidence for a class of propositions (and probably the only meaningful class of propositions). For that class of propositions you have automatic evidence for or against them (in the form of KC), and so it's basically a ready-made prior because it passes from the posterior to the prior immediately anyway. So a) you think LWers' reasons for not believing in supernaturalism have nothing to do with KC, and b) you think supernaturalism exists outside the class of propositions KC can count as evidence for or against? I don't care about A, but if B is your position I wonder: why?
CuSithBell (12y, +2)
That's a shame. Any chance you might have suggestions on how to go about obtaining such evidence for oneself? Possibly via PM if you'd be more comfortable with that.
Will_Newsome (12y, +5)
I have advice. First off, if psi's real then I think it's clearly an intelligent agent-like or agent-caused process. In general you'd be stupid to mess around with agents with unknown preferences. That's why witchcraft was considered serious business: messing with demons is very much like building mini uFAIs. Just say no. So I don't recommend messing around with psi, especially if you haven't seriously considered what the implications of the existence of agent-like psi would be. This is why I like the Catholics: they take things seriously, it's not fun and games. "Thou shalt not tempt the Lord thy God." If you do experiment, pre-commit not to tell anyone about at least some predetermined subset of the results. Various parapsychology experiments indicate that psi effects can be retrocausal, so experimental results can be determined by whether or not you would in the future talk about them. If psi's capricious, then pre-committing not to blab increases the likelihood of significant effects.
Eugine_Nier (12y, +6)
I just thought of something. What you're saying is that psi effects are anti-inductive.
bogus (12y, 0)
The capricious-psi literature actually includes several proposed mechanisms which could lead to "anti-inductive" psi. Some of these mechanisms are amenable to mitigation strategies (such as not trying to use psi effects for material advantage, and keeping one's experiments confidential); others are not.
Will_Newsome (12y, -2)
Indeed.
Eugine_Nier (12y, +8)
Ok, I feel like we should now attempt to work out a theory of psi caused by some kind of market-like game theory among entities.
CuSithBell (12y, +1)
Thanks for the advice! Though I suppose I won't tell you if it turns out to have been helpful?
r_claypool (12y, 0)
As lukeprog says here.
Eugine_Nier (12y, 0)
I don't entirely agree with Will here. My issue is that there seem to be some events, e.g., Fatima, where the best "scientific explanation" is little better than the supernatural wearing a lab coat.
CuSithBell (12y, +2)
Are there any good supernatural explanations for that one?! Because "Catholicism" seems like a pretty terrible explanation here.
Eugine_Nier (12y, +4)
Why? Do you have a better one? (Note: I agree "Catholicism" isn't a particularly good explanation, it's just that it's not noticeably worse than any other.)
CuSithBell (12y, +2)
I mentioned Catholicism only because it seems like the "obvious" supernatural answer, given that it's supposed to be a Marian apparition. Though, I do think of Catholicism proper as pretty incoherent, so it'd rank fairly low on my supernatural explanation list, and well below the "scientific explanation" of "maybe some sort of weird mundane light effect, plus human psychology, plus a hundred years". I haven't really investigated the phenomenon myself, but I think, say, "the ghost-emperor played a trick" or "mass hypnosis to cover up UFO experiments by the lizard people" rank fairly well compared to Catholicism.
Eugine_Nier (12y, +2)
This isn't really an explanation so much as clothing our ignorance in a lab coat.
JoshuaZ (12y, +2)
It does a little more than that. It points to a specific class of hypotheses where we have evidence that in similar contexts such mechanisms can have an impact. The real problem here is that without any ability to replicate the event, we're not going to be able to get substantially farther than that.
CuSithBell (12y, 0)
Yeah, it's not really an explanation so much as an expression of where we'd look if we could. Presumably the way to figure it out is to either induce repeat performances (difficult to get funding and review board approval, though) or to study those mechanisms further. I suspect that'd be more likely to help than reading about ghost-emperors, at least.
Nornagest (12y, +2)
Quite. Seems to me that if we're going to hold science to that standard, we should be equally or more critical of ignorance in a cassock; we should view religion as a competing hypothesis that needs to be pointed to specifically, not as a reassuring fallback whenever conventional investigation fails for whatever reason. That's a pretty common flaw of theological explanations, actually.
siodine (12y, +2)
Disagree in spirit? What exactly does that mean? (I happen to mostly agree with your comment while mostly agreeing with Grognor's points--hence my confusion in what you mean, exactly.)
Will_Newsome (12y, +9)
Hard to explain. I'll briefly go over my agreement/disagreement status on each point.

MWI: Mixed opinion. MWI is a decent bet, but then again that's a pretty standard opinion among quantum physicists. Eliezer's insistence that MWI is obviously correct is not justified given his arguments: he doesn't address the most credible alternatives to MWI, and doesn't seem to be cognizant of much of the relevant work. I think I disagree in spirit here even though I sort of agree at face value.

Cryonics: Disagree, nothing about cryonics is "obvious".

Meh science, Yay Bayes!: Mostly disagree; too vague, and little supporting evidence for the face-value interpretation. I agree that Bayes is cool.

Utilitarianism: Disagree, utilitarianism is retarded. Consequentialism is fine, but often very naively applied in practice, e.g. utilitarianism.

Eliezer's metaethics: Disagree, especially considering Eliezer has said he thinks he's solved meta-ethics, which is outright crazy, though hopefully he was exaggerating.

"'People are crazy, the world is mad' is sufficient for explaining most human failure, even to curious people, so long as they know the heuristics and biases literature": Mostly disagree; LW is much too confident in the heuristics and biases literature, and it's not nearly a sufficient explanation for lots of things that are commonly alleged to be irrational.

Disagree, utilitarianism is retarded.

When making claims like this, you need to do something to distinguish yourself from most people who make such claims, who tend to harbor basic misunderstandings, such as an assumption that preference utilitarianism is the only utilitarianism.

Utilitarianism has a number of different features, and a helpful comment would spell out which of the features, specifically, is retarded. Is it retarded to attach value to people's welfare? Is it retarded to quantify people's welfare? Is it retarded to add people's welfare linearly once quantified? Is it retarded to assume that the value of structures containing more than one person depends on no features other than the welfare of those persons? And so on.

Will_Newsome (12y, +7)
I suppose it's easiest for me to just make the blanket metaphilosophical claim that normative ethics without well-justified meta-ethics just aren't a real contender for the position of actual morality. So I'm unsatisfied with all normative ethics. I just think that utilitarianism is an especially ugly hack. I dislike fake non-arbitrariness.
Will_Newsome (12y, -2)
— Regina Spektor, The Calculation
Will_Newsome (12y, -2)
Perhaps I show my ignorance. Pleasure-happiness and preference fulfillment are the only maximands I've seen suggested by utilitarians. A quick Google search hasn't revealed any others. What are the alternatives? I'm unfortunately too lazy to make my case for retardedness: I disagree with enough of its features and motivations that I don't know where to begin, and I wouldn't know where to end.
steven0461 (12y, +5)
Eudaimonia. "Thousand-shardedness". Whatever humans' complex values decide constitutes an intrinsically good life for an individual. It's possible that I've been mistaken in claiming that, as a matter of standard definition, any maximization of linearly summed "welfare" or "happiness" counts as utilitarianism. But it seems like a more natural place to draw the boundary than "maximization of either linearly summed preference satisfaction or linearly summed pleasure indicators in the brain but not linearly summed eudaimonia".
Will_Newsome (12y, 0)
That sounds basically the same as what I'd been thinking of as preference utilitarianism. Maybe I should actually read Hare. What's your general approach to utilitarianism's myriad paradoxes and mathematical difficulties?

he doesn't address the most credible alternatives to MWI

I don't think you need to explicitly address the alternatives to MWI to decide in favor of MWI. You can simply note that all interpretations of quantum mechanics either 1) fail to specify which worlds exist, 2) specify which worlds exist but do so through a burdensomely detailed mechanism, 3) admit that all the worlds exist, noting that worlds splitting via decoherence is implied by the rest of the physics. Am I missing something?

TAG (7mo, 0)
If "all the worlds" includes the non-classical worlds, MWI is observationally false. Whether and how decoherence produces classical worlds is a topic of ongoing research.
Will_Newsome (12y, -2)
Is that a response to my point specifically or a general observation? I don't think "simply noting" is nearly enough justification to decide strongly in favor of MWI—maybe it's enough to decide in favor of MWI, but it's not enough to justify confident MWI evangelism nor enough to make bold claims about the failures of science and so forth. You have to show that various specific popular interpretations fail tests 1 and 2. ETA: Tapping out because I think this thread is too noisy.
steven0461 (12y, 0)
I suppose? It's hard for me to see how there could even theoretically exist a mechanism such as in 2 that failed to be burdensome. But maybe you have something in mind?
Eugine_Nier (12y, +1)
It always seems that way until someone proposes a new theoretical framework; afterwards, it seems like people were insane for not coming up with said framework sooner. Well, the Transactional Interpretation, for example.
steven0461 (12y, +4)
That would have been my guess. I don't really understand the transactional interpretation; how does it pick out a single world without using a burdensomely detailed mechanism to do so?
buybuydandavis (12y, +5)
I'm even more confused that people seem to think it quite natural to spend years debating the ethical positions of someone watching the debate. A little creative editing with Stirner makes for a catchy line in this regard:
Manfred (12y, +1)
Luke's sequence of posts on this may help. Worth a shot at least :)
bryjnar (12y, +4)
Which he never finished. I've been waiting for Luke to actually get to some substantive metaethics ever since he was running Common Sense Atheism ;)
[anonymous] (12y, 300)

I think the Sequences got everything right

That is quite a bit of conjunction you've got going on there. Rather extraordinary if it is true; I've yet to see appropriately compelling evidence of this. Based on what evidence I do see, I think the sequences, at least the ones I've read so far, are probably "mostly right", interesting, and perhaps marginally useful to very peculiar kinds of people for ordering their lives.

I also think the sequences are badly organized and you should just read them chronologically instead of trying to lump them into categories and sub-categories, but I digress.

I think I agree with this.

Grognor (12y, -1)
The error in your comment is that the sequences were all created by a few reliable processes, so it's as much of a conjunction fallacy as "My entire leg will function." Note that this also means that if one of the articles in the Sequences is wrong, it doesn't even mean Eliezer has made a grievous mistake. I have Nick Tarleton to thank for this insight, but I can't find where he originally said it.
JoshuaZ (12y, 0)
Even professional runners will occasionally trip. Even Terry Tao occasionally makes a math error. The point is that even highly reliable processes are likely to output some bad ideas over the long term.
Will_Newsome (12y, 0)
I think it's a Kaasism.

Many-worlds is a clearly explicable interpretation of quantum mechanics and dramatically simpler than the Copenhagen interpretation revered in the mainstream. It rules out a lot of the abnormal conclusions that people draw from Copenhagen, e.g. ascribing mystical powers to consciousness, senses, or instruments. It is true enough to use as a model for what goes on in the world; but it is not true enough to lead us to any abnormal beliefs about, e.g., morality, "quantum immortality", or "possible worlds" in the philosophers' sense.

Cryonics is worth developing. The whole technology does not exist yet; and ambitions to create it should not be mistaken for an existing technology. That said, as far as I can tell, people who advocate cryonic preservation aren't deluded about this.

Mainstream science is a social institution commonly mistaken for an epistemology. (We need both. Epistemologies, being abstractions, have a notorious inability to provide funding.) It is an imperfect social institution; reforms to it are likely to come from within, not by abolishing it and replacing it with some unspecified Bayesian upgrade. Reforms worth supporting include performing and publ…

the Copenhagen interpretation revered in the mainstream

Is it? I think that the most widely accepted interpretation among physicists is the shut-up-and-calculate interpretation.

Cthulhoo (12y, +3)
There are quite a few people who actively do research and debate on QM foundations, and, among that group, there's honestly no preferred interpretation. People are even looking for alternatives that bypass the problem entirely (e.g., GRW). The debate is fully open at the moment. Outside of this specific field, yes, it's pretty much shut-up-and-calculate.
Thomas (12y, +2)
Yes, doing the calculation and getting the right result is worth many interpretations, if not all of them together. Besides, interpretations usually give you more than the truth, which is awkward.
vi21maobk9vp (12y, -2)
Unfortunately, in some cases it is not clear what exactly you should calculate to make a good prediction. Penrose interpretation and MWI can be used to decide - at least sometimes. Nobody has (yet) reached the scale where the difference would be easily testable, though.
Normal_Anomaly (12y, 0)
The Wikipedia page on the Copenhagen Interpretation says:
Logos01 (12y, +4)
No, it is exactly as complicated, as demonstrated by its utilization of exactly the same mathematics. It is not without its own extra entities of an equally enormously additive nature, however; and those abnormal conclusions are as valid from the CI as quantum immortality is from MWI. -- I speak as someone who rejects both.
JoshuaZ (12y, -1)
Not all formalizations that give the same observed predictions have the same Kolmogorov complexity, and this is true even for much less rigorous notions of complexity. For example, consider a computer program that, when given a positive integer n, outputs the nth prime number. One simple thing it could do is use trial division. But another could use some more complicated process, like, say, brute-force searching for a generator of (Z/pZ)*. In this case, the math being used is pretty similar, so the complexity shouldn't be that different. But that's a more subtle and weaker claim.
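A toy sketch of this point in Python (my own illustration, not from the thread): plain trial division versus a brute-force search for a generator of (Z/pZ)* (the Lucas primality test) produce identical nth-prime outputs from programs of clearly different lengths.

```python
# Two programs with the same observable output (the nth prime) but
# different description lengths -- a toy version of the point above.

def is_prime_trial(n):
    """Plain trial division."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def prime_factors(n):
    """Distinct prime factors of n (helper for the Lucas test)."""
    factors, d = set(), 2
    while d * d <= n:
        while n % d == 0:
            factors.add(d)
            n //= d
        d += 1
    if n > 1:
        factors.add(n)
    return factors

def is_prime_lucas(n):
    """Lucas primality test: n > 1 is prime iff some g in (Z/nZ)* has
    order exactly n - 1, i.e. g generates the whole multiplicative group."""
    if n < 2:
        return False
    if n == 2:
        return True
    qs = prime_factors(n - 1)
    for g in range(2, n):
        if pow(g, n - 1, n) != 1:
            return False  # Fermat witness: n is composite
        if all(pow(g, (n - 1) // q, n) != 1 for q in qs):
            return True   # g is a generator, so n is prime
    return False

def nth_prime(n, is_prime):
    count, candidate = 0, 1
    while count < n:
        candidate += 1
        if is_prime(candidate):
            count += 1
    return candidate

# Same observations, very different machinery:
assert all(nth_prime(k, is_prime_trial) == nth_prime(k, is_prime_lucas)
           for k in range(1, 50))
```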
dlthomas (12y, +4)
Is that true? I thought Kolmogorov complexity was "the length of the shortest program that produces the observations" - how can that not be a one-place function of the observations?
JoshuaZ (12y, +2)
Yes, insofar as the output is larger than the set of observations. Take MWI, for example: the output includes all the parts of the wavebranch that we can't see. In contrast, Copenhagen only has outputs that we by and large do see. So the key issue here is that outputs and observable outputs aren't the same thing.
dlthomas (12y, +2)
Ah, fair. So in this case, we are imagining a sequence of additional observations (from a privileged position we cannot occupy) to explain.
Logos01 (12y, 0)
Sure. But MWI and CI use the same formulae. They take the same inputs and produce the same outputs. Everything else is just that -- interpretation. And those would be different calculations. No, it's the same math.
JoshuaZ (12y, -4)
The interpretation in this context can imply unobserved output. See the discussion with dlthomas below. Part of the issue is that the interpretation isn't separate from the math.
Logos01 (12y, -3)
"Entities must not be replicated beyond necessity." Both interpretations violate this rule. The only question is which violates it more. And the answer to that seems to be one purely of opinion. So throwing out the extra stuff -- they're using exactly the same math.
Grognor (12y, +2)
I agree with everything you said here, but I didn't want to use so many words.

Furthermore, I agree with every essay I've ever read by Yvain, I use "believe whatever gwern believes" as a heuristic/algorithm for generating true beliefs, and don't disagree with anything I've ever seen written by Vladimir Nesov, Kaj Sotala, Luke Muehlhauser, komponisto, or even Wei Dai;

Wow. One of these is not like the others! (Hint: all but one have karma > 10,000.)

In all seriousness, being placed in that group has to count as one of the greatest honors of my internet life.

So I suppose I can't be totally objective when I sing the praises of this post. Nonetheless, it is a fact that I was planning to voice my agreement well before I reached the passage quoted above. So, let me confirm that I, too, "stand by" the Sequences (excepting various quibbles which are of scant relevance in this context).

I'll go further and note that I am significantly less impressed than most of LW by Holden Karnofsky's critique of SI, and suspect that the extraordinary affective glow being showered upon it is mostly the result of Holden's affiliation with GiveWell. Of course, that affective glow is so luminous (the post is at, what, like 200 now?) that to say I'm less impressed…

Grognor (12y, +3)
There are many others, as well, but a full list seemed like an extremely terrible idea.
Will_Newsome (12y, +4)
Though I'd like a post that encouraged people to make such lists in the comments, so I could figure out who the people I like like.
TheOtherDave (12y, +2)
You could create such a post. Alternatively, you could PM the people you like and ask them whom they like.
Grognor (12y, +2)
Extreme nitpick: "PM the people you like" is not the converse of "create a post".
TheOtherDave (12y, 0)
You are entirely correct. Edited.
khafra (12y, +1)
If you do PM the people you like and ask them who they like, I'd like to know. The list of people you take seriously is highly correlated with the list of people I take seriously.

Once we get into the habit of openly professing our beliefs for the sake of our micropolitics, we lose a large portion of our bandwidth to pure noise. I realise some amount of group-allegiance signalling is inevitable whenever an opinion is expressed, even when questions are asked, but the recent post by Dmytry together with this post are too explicit in this manner. The latter is at least honest, but I have voted both down nevertheless.

My "ick" sense is being set off pretty heavily by the idea of people publicly professing their faith in a shared set of beliefs, so this post makes me deeply uncomfortable. At best something like this is a self congratulatory applause light which doesn't add anything, at worst it makes us less critical and leads us further towards the dreaded c word.

I disagree. From checking "recent comments" a couple times a day as is my habit, I feel like the past few days have seen an outpouring of criticism of Eliezer and the sequences by a small handful of people who don't appear to have put in the effort to actually read and understand them, and I am thankful to the OP for providing a counterbalance to that.

If we can only tolerate people criticising the sequences, and no one saying they agree with them, then we have a serious problem.

John_Maxwell (12y, +7)
It degrades the quality of discussion to profess agreement or disagreement with such a large cluster of ideas simultaneously. Kind of like saying "I'm Republican and the Republican platform is what our country needs", without going into specifics on why the Republican platform is so cool. I think it would have been fine for Grognor to summarize arguments that have been made in the past for more controversial points in the Sequences. But as it stands, this post looks like pure politics to me.
[anonymous] (12y, +2)
This is a good feeling. It means we are doing something right. If we are well-calibrated with respect to phyg-risk, you should see large proclamations of agreement about as often as criticisms. Rationalists should study marching in lockstep and agreeing with the group as much as they should practice lonely dissent.

I think this might be the most strongly contrarian post here in a while...

Normal_Anomaly (12y, +5)
And look! It's been upvoted! We are neither an Eliezer-following cult nor an Eliezer-hating cult! :)
shminux (12y, +7)
Actually, if you apply the metric of #votes/#comments, the low score relative to the large number of comments suggests that the post has been both heavily upvoted and heavily downvoted.

I'm pretty sad a post that consists entirely of tribal signaling has been voted up like this. This post is literally asking people to publicly endorse the Sequences because they're somehow under attack (by whom?!) and no one agrees with them any more. Because some of us think having more smart people who disagree with us would improve Less Wrong? I find glib notions of a "singularity religion" obnoxious, but what exactly is the point of asking people here to announce their allegiance to the site founder's collection of blog posts?

This post is literally asking people to publicly endorse the Sequences because they're somehow under attack (by whom?!) and no one agrees with them any more.

To me it seems more like a spreading attitude of -- "The Sequences are not that important. Honestly, almost nobody reads all of them; they are just too long (and their value does not correspond to their length). Telling someone to 'read the Sequences' is just a polite way to say 'fuck you'; obviously nobody could mean that literally."

I have read most of the Sequences. In my opinion it is material really worth reading; not only better than 99% of the internet, but if it became a book, it would also be better than 99% of books. (Making it a book would be a significant improvement, because there would be an unambiguous ordering of the chapters.) It contains a lot of information, and the information makes sense. And unlike many interesting books, it is not "repeating the basic idea over and over, using different words, adding a few details". It talks about truth, then about mind, then about biases, then about language, then about quantum physics, etc. For me, reading the Sequences was very much worth my time…

TheOtherDave (12y, +5)
FWIW, when I read them, I often found the comment threads valuable as well. Admittedly, that might relate to the fact that I read them chronologically, which allowed me to see how the community's thoughts were evolving and influencing the flow of posts.

There's such a thing as a false dissent effect. (In fact I think it's mentioned in the Sequences, although not by that name. Right now I'm too lazy to go find it.) If only those who disagree with some part or another of the Sequences post on the subject, we will get the apparent effect that "Nobody takes Yudkowsky seriously - not even his own cultists!" This in spite of the fact that everyone here (I assume) agrees with some part of the Sequences, perhaps as much as 80%. Our differences are in which 20% we disagree with. I don't think there's anything wrong with making that clear.

For myself, I agree that MWI seems obvious in retrospect, that morality is defined by the algorithm of human thought (whatever it is), and that building programs whose output affects anything but computer screens without understanding all the details of the output is a Bad Idea.

prase (12y, +4)
To counter the effect it would be enough if those who agree with a criticised position supported it by pointing out the errors in the arguments of the critics, or offering better counterarguments themselves. Being silent when a belief is being disputed only to declare allegiance to it and leave the conversation later seems to be the wrong approach.
Grognor (12y, +3)
I was waiting for someone to make this accusation. The only thing missing is the part where I only agree with Yudkowsky because he's high status and I wish to affiliate with him!
John_Maxwell (12y, +5)
I know that I have observed and corrected a pattern like that in my thinking in the past. Studying biases is useless if you don't adjust your own thinking at times you identify a bias that may be affecting it. I think your prior for the comment you describe being true should be very high, and I'd like to know what additional evidence you had that brought that probability down. You've mentioned in the past that the high status tone of the sequences caught your attention and appealed to you. Isn't it possible that you are flawed? I know I am.
Grognor (12y, -1)
Actually, I think what Jack said (that this post is signaling, but nothing else about his comment) is true, and my reply is also quite possibly true. But I don't know how I should act differently in accordance with the possibility. It's a case of epistemic luck if it is true. (As an aside, I think this is Alicorn's best article on this website by leaps and bounds; I recommend it, &c.) What?
Jack (12y, +5)
What accusation? I'm just describing your post and asking questions about it. Edit: I mean, obviously I have a point of view. But it's not like it was much of a stretch to say what I did. I just paraphrased. How is my characterization of your post flawed?

I think mainstream science is too slow and we mere mortals can do better with Bayes.

Then why aren't we doing better already?

The institutional advantages of the current scientific community are so apparent that it seems more feasible to reform it than to supplant it. It's worth thinking about how we could achieve either.

A1987dM (12y, +2)
The Hansonian answer would be “science is not about knowledge”, I guess. (I don't think it's that simple, anyway.)
cousin_it (12y, +8)
I don't understand your comment, could you explain? Science seems to generate more knowledge than LW-style rationality. Maybe you meant to say "LW-style rationality is not about knowledge"?
Desrtopa (12y, +5)
Science has a large-scale academic infrastructure to draw on, wherein people can propose research that they want to get done, and those who argue sufficiently persuasively that their research is solid and productive receive money and resources to conduct it. You could make a system that produces more knowledge than modern science just by diverting a substantial portion of the national budget to fund it, so that only the people proposing experiments too poorly designed to be useful go unfunded. Besides which, improved rationality can't simply replace entire bodies of domain-specific knowledge. There are plenty of ways, though, in which mainstream science is inefficient at producing knowledge, such as improper use of statistics, publication bias, and biased interpretation of results. There are ways to do better, and most scientists (at least those I've spoken to about it) acknowledge this, but science is very significantly a social process which individual scientists have neither the power nor the social incentives to change.
IlyaShpitser (12y, +6)
"There are plenty of ways, though, in which mainstream science is inefficient at producing knowledge, such as improper use of statistics, publication bias, and biased interpretation of results. There are ways to do better, and most scientists (at least those I've spoken to about it) acknowledge this, but science is very significantly a social process which individual scientists have neither the power nor the social incentives to change." I am an academic. Can you suggest three concrete ways for me to improve my knowledge production, which will not leave me worse off?
Rhwawn (12y, +2)
Well...
Desrtopa (12y, 0)
Ways for science as an academic institution, or for you personally? For the latter, Luke has already done the work of creating a post on that. For the former, it's more difficult, since a lot of productivity in science requires cooperation with the existing institution. At the least, I would suggest registering your experiments in a public database before conducting them, if any exist within your field, and using Bayesian probability software to process the statistics of your experiments (this will probably not help you at all in getting published, but if you do it in addition to regular significance testing, it should hopefully not inhibit it either.)
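A concrete sketch of the "Bayesian processing in addition to regular significance testing" suggestion (my own toy example: the 62/100 data and the flat Beta(1, 1) prior are assumptions, and it assumes SciPy is installed):

```python
# The same coin-flip experiment summarized with a p-value and with a
# Bayesian posterior, side by side.
from scipy import stats

heads, flips = 62, 100

# Frequentist: two-sided p-value against the null hypothesis p = 0.5.
p_value = stats.binomtest(heads, flips, 0.5).pvalue

# Bayesian: flat Beta(1, 1) prior -> Beta(1 + heads, 1 + tails) posterior.
posterior = stats.beta(1 + heads, 1 + flips - heads)
prob_biased_up = 1 - posterior.cdf(0.5)  # P(bias > 0.5 | data)

print(f"p-value: {p_value:.3f}; P(bias > 0.5 | data): {prob_biased_up:.3f}")
```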
Xachariah (12y, 0)
My default answer for anything regarding Hanson is "signaling". How to fix science is a good start. Science isn't just about getting things wrong or right, but an intricate signaling game. This is why most of what comes out of journals is wrong. Scientists are rewarded for publishing results, right or wrong, so they comb data for correlations which may or may not be relevant. (Statistically speaking, if you comb data 20 different ways, you'll typically get at least 1 thing which shows a statistically significant correlation, just from sheer random chance.) Journals are rewarded for publishing sensational results, and not confirmations or even refutations (especially refutations of things they published in the first place). The reward system is not set up to favor coming up with right answers, but coming up with answers that are sensational and cannot be easily refuted. Being right does make them hard to refute, which is why science is useful at all, but that's not the only way things are made hard to refute. An ideal Bayesian unconstrained by signaling could completely outdo our current scientific system (as it could in all spheres of life). Even shifting our current system to be more Bayesian by abandoning the journal system and creating pre-registration of scientific studies would be a huge upgrade. But science isn't about knowledge; knowledge is just a very useful byproduct we get from it.
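The parenthetical arithmetic is worth making precise: at the conventional alpha = 0.05 and twenty comparisons, one false positive is the expectation, and the chance of at least one is about 64% (both figures assume independent tests, which real data-combing rarely satisfies):

```python
# Multiple-comparisons arithmetic for 20 independent tests at alpha = 0.05.
alpha, tests = 0.05, 20

expected_false_positives = alpha * tests    # = 1.0 expected by chance alone
p_at_least_one = 1 - (1 - alpha) ** tests   # ~= 0.642

print(f"expected: {expected_false_positives}, P(>=1): {p_at_least_one:.3f}")
```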
CuSithBell (12y, +4)
But Grognor (not, as this comment read earlier, army1987) said that "we mere mortals can do better with Bayes", not that "an ideal bayesian unconstrained with signaling could completely outdo our current scientific system". Arguing, in response to cousin_it, that scientists are concerned with signalling makes the claim even stronger, and the question more compelling - "why aren't we doing better already?"
A1987dM (12y, +1)
I had taken “we” to mean the 21st-century civilization in general rather than just Bayesians, and the question to mean “why is science doing so bad, if it could do much better just by using Bayes”?
CuSithBell (12y, +3)
I'm fairly confident that "we" refers to LW / Bayesians, especially given the response to your comment earlier. Unfortunately we've got a bunch of comments addressing a different question, and some providing reasons why we shouldn't expect to be "doing better", all of which strengthen cousin_it's question, as Grognor claims we can. Though naturally Grognor's intended meanings are up for grabs.
Oscar_Cunningham (12y, +1)
To which gold standard are you comparing us and Science to determine that we are not doing better?

There's a problem here in that Bayesian reasoning has become quite common in the last 20 years in many sciences, so it isn't clear who "we" should be in this sort of context.

Indeed. For anyone who has worked at all in oil & gas exploration, the LW treatment of Bayesian inference and decision theories as secret superpowers will seem perplexing. Oil companies have been basing billion dollar decisions on these methods for years, maybe decades.

I am also confused about what exactly we are supposed to be doing. If we had the choice of simply becoming ideal Bayesian reasoners then we would do that, but we don't have that option. "Debiasing" is really just "installing a new, imperfect heuristic as a patch for existing and even more imperfect hardware-based heuristics."

I know a lot of scientists - I am a scientist - and I guess if we were capable of choosing to be Bayesian superintelligences we might be progressing a bit faster, but as it stands I think we're doing okay with the cognitive resources at our disposal.

Not to say we shouldn't try to be more rational. It's just that you can't actually decide to be Einstein.

[anonymous] (12y, +6)
I think 'being a better Bayesian' isn't about deciding to be Einstein. I think it's about being willing to believe things that aren't 'settled science', where 'settled science' is the replicated and established knowledge of humanity as a whole. See Science Doesn't Trust Your Rationality. The true art is being able to do this without ending up a New Ager, or something. The virtue isn't believing non-settled things. The virtue is being willing to go beyond what science currently believes, if that's where the properly adjusted evidence actually points you. (I say 'beyond' because I mean to refer to scope. If science believes something, you had better believe it - but if science doesn't have a strong opinion about something, you have no choice but to use your rationality.)

I can't understand this. How could the sequences not be relevant? Half of them were created when Eliezer was thinking about AI problems.

That doesn't mean anything said in them is responsive to Holden's arguments.

Grognor (12y, +4)
You're right. That was a non-sequitur. My mistake.

I don't see what this adds beyond making LW more political. Let's discuss ideas, not affiliations!

If you agree with everything Eliezer wrote, you remember him writing about how every cause wants to be a cult. To me, this post looks exactly like the sort of cultish entropy that he advised guarding against. Can you imagine a similar post on any run-of-the-mill, non-cultish online forum?

It worries me a lot that you relate ideas so strongly to the people who say them, especially since most of the people you refer to are so high-status. Perhaps you could experimentally start using the Less Wrong anti-kibitzer feature to see if your perception of LW changes?

Desrtopa (12y, +4)
But also what he said about swinging too far in the opposite direction. It's bad for a community to reach a point where it's taboo to profess dissent, but also for it to reach a point where it's taboo to profess wholehearted agreement.
prase (12y, +5)
Wouldn't it be better if the professed agreement was agreement with ideas rather than with people? The dissent counterpart of this post would say "I disagree with everything what this person says". That's clearly pathological.
Viliam_Bur (12y, +4)
This may sound wrong, but "who said that" is Bayesian evidence, sometimes rather strong evidence. If your experience tells you that a given person is right in 95% of the things they say, it is rational to give 95% prior probability to other things they say.

It is said that we should judge ideas by the ideas alone. In Bayes-speak this means that if you update correctly, enough evidence can fix a wrong prior (how much evidence is needed depends on how wrong the prior was). But gathering evidence is costly, and we cannot pay too high a cost for every idea around us. Why use a worse prior, if a better one is available?

Back to human-speak: if person A is a notorious liar (or a mindkilled person who repeats someone else's lies), person B is careless about their beliefs, and person C examines every idea carefully before telling it to others, then it is rational to react differently to ideas spoken by these three people. The word "everything" is too strong, but saying "if person C said that, I believe it" is OK (assuming that if there is enough counter-evidence, the person will update: both on the idea and on the credibility of C).

Disagreeing with everything one says would be trying to reverse stupidity. There are people who do "worse than random", so doing the opposite of what they say could be a good heuristic; but even for most of them, assigning 95% probability that they are wrong would be too much.
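To make the update mechanics concrete, here is a minimal sketch in odds form (the 95% reliability figure and the 10:1 likelihood ratios are illustrative numbers, not from the comment):

```python
# "Who said it" sets the prior; evidence can still override it.

def update_odds(prior_odds, likelihood_ratio):
    """One Bayes update in odds form: posterior odds = prior odds * LR."""
    return prior_odds * likelihood_ratio

# Person C is right about 95% of the time -> prior odds 19:1 for C's claim.
odds = 0.95 / 0.05

# Three pieces of counter-evidence arrive, each 10x likelier if the claim
# is false (likelihood ratio 1/10 in favor of the claim).
for _ in range(3):
    odds = update_odds(odds, 1 / 10)

posterior = odds / (1 + odds)  # ~0.019: enough evidence fixed a strong prior
print(f"posterior probability the claim is true: {posterior:.3f}")
```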
prase (12y, +3)
You are right, but there is probably some misunderstanding. That personal considerations should be ignored when assessing the probability of an idea, and that one shouldn't express collective agreement with ideas based on their author, are different suggestions. You argue against the former while I was stating the latter. It's important to take the context into account. When an idea X is being questioned, saying "I agree with X, because a very trustworthy person Y agrees with X" is fine with me, although it isn't the best sort of argument one could provide. Starting the discussion with "I agree with X1, X2, X3, ... Xn", on the other hand, makes any reasonable debate almost impossible, since it is not practical to argue n distinct ideas at once.
Desrtopa (12y, 0)
Well, the professed agreement of this post is with the Sequences, which are a set of ideas rather than a person, even if they were all written by one person. The dissent counterpart of this post would be "I disagree with the entire content of the sequences." Am I misunderstanding you about something?
prase (12y, -2)
Professing agreement or disagreement with a broad set of rather unrelated ideas is not conducive to productive discussion because there is no single topic to concentrate on with object-level arguments. Having the set of ideas defined by their author brings in tribal political instincts, which too is not helpful. You are right that the post was formulated as agreement with the Sequences rather than with everything which Yudkowsky ever said, but I don't see how this distinction is important. "Everything which Yudkowsky ever said" would also denote a set of ideas, after all.
TheOtherDave (12y, +4)
Albeit an internally inconsistent set, given that Yudkowsky has occasionally changed his mind about things.
Grognor (12y, 0)
Well, that's either an exaggeration or an audacious lie. In any case, I had neither the means nor the desire to list every idea of theirs I agree with.
TheOtherDave (12y, +3)
"I think the Sequences are right about everything" is a pro-Sequences position of roughly the same extremity as "I think the Sequences are wrong about everything." As far as I know, nobody claims the latter, and it seems kind of absurd on the face of it. The closest I've seen anyone come to it is something like "everything true in the Sequences is unoriginal, everything original in them is false, and they are badly written"... which is a pretty damning criticism, but still allows for the idea that quite a lot of what the text expresses is true. Its equally extreme counterpart on the positive axis should, then, allow for the idea that quite a lot of what the text expresses is false. To reject overly-extreme positive statements more strongly than less-extreme negative ones is not necessarily expressing a rejection of positivity; it might be a rejection of extremism instead.
Desrtopa (12y, +4)
I wouldn't say that they're comparably extreme. A whole lot of the content of the sequences is completely uncontroversial, and even if you think Eliezer is a total lunatic, it's unlikely that they wouldn't contain any substantively true claims. I'd bet heavily against "The Biggest Secret" by David Icke containing no substantively true claims at all. I would have some points of disagreement with someone who thinks that everything in the sequences is correct (personally, I doubt that MWI is a slam dunk, because I'm not convinced Eliezer is accurately framing the implications of collapse, and I think CEV is probably a dead end when it comes to programming an FAI, although I don't know of any team which I think has better odds of developing an FAI than the SIAI.) But I think someone who agrees with the entirety of the contents is being far more reasonable than someone who disagrees with the entirety.
TheOtherDave (12y, +2)
I agree that "I think the Sequences are wrong about everything" would be an absurdly extreme claim, for the reasons you point out. We disagree about the extremity of "I think the Sequences are right about everything". I'm not sure where to go from there, though.
Kindly (12y, +4)
Suppose you separate the Sequences into "original" and "unoriginal". The "unoriginal" segment is very likely to be true: agreeing with all of it is fairly straightforward, and disagreeing with all of it is ridiculously extreme.

To a first approximation, we can say that the middle-ground stance on any given point in the "original" segment is uncertainty. That is, accepting that point and rejecting it are equally extreme. If we use the general population for reference, of course, that is nowhere near correct: even considering the possibility that cryonics might work is a fairly extreme stance, for instance. But taking the approximation at face value tells us that agreeing with every "original" claim, and disagreeing with every "original" claim, are equally extreme positions. If we now add the further stipulation that both positions agree with every "unoriginal" claim, they both move slightly toward the Sequences, but not by much.

So actually (1) "I agree with everything in the sequences" and (2) "Everything true in the Sequences is unoriginal, everything original in them is false" are roughly equally extreme. If anything, we have made an error in favor of (1). On the other hand, (3) "Everything in the Sequences ever is false" is much more extreme because it also rejects the "unoriginal" claims, each of which is almost certainly true.

P.S. If you are like me, you are wondering what "extreme" means now. To be extremely technical (ha), I am interpreting it as measuring the probability of a position re: Sequences that you expect a reasonable, boundedly-rational person to have. For instance, a post that says "Confirmation bias is a thing" is uncontroversial, and you expect that reasonable people will believe it with probability close to 1. A post that says "MWI is obviously true" is controversial, and if you are generous you will say that there is a probability of 0.5 that someone will agree with it. This might be higher or lower for other posts in the "original" category…
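A quick numerical rendering of the symmetry argument (the claim count and the 0.5 acceptance probability are hypothetical, just to make the face-value approximation concrete):

```python
# Each "original" claim is accepted by a reasonable reader with
# probability 0.5, per the face-value approximation above.
n_original = 30                    # hypothetical number of original claims

p_agree_all = 0.5 ** n_original    # position (1): endorse every original claim
p_reject_all = 0.5 ** n_original   # position (2): reject every original claim

# Under the approximation, (1) and (2) are exactly equally extreme;
# position (3) is rarer still, since it also rejects the near-certain
# "unoriginal" claims.
assert p_agree_all == p_reject_all
```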
0TheOtherDave12y
Yeah, I think I agree with everything here as far as it goes, though I haven't looked at it carefully. I'm not sure originality is as crisp a concept as you want it to be, but I can imagine us both coming up with a list of propositions that we believe captures everything in the Sequences that some reasonable person somewhere might conceivably disagree with, weighted by how reasonable we think a person could be and still disagree with that proposition, and that we'd end up with very similar lists (perhaps with fairly different weights).
2John_Maxwell12y
I just read that essay and I disagree with it. Stating one's points of disagreement amounts to giving the diffs between your mind and that of an author. What's good practice for scientific papers (in terms of remaining dispassionate) is probably good practice in general. The way to solve the cooperation problem is not to cancel out professing disagreement with professing agreement, it's to track group members' beliefs (e.g. by polling them) and act as a group on whatever the group consensus happens to be. In other words, teach people the value of majoritarianism and its ilk and tell them to use this outside view when making decisions.
6Desrtopa12y
In terms of epistemic rationality, you can get by fine by raising only points of disagreement and keeping it implicit that you accept everything you do not dispute. But in terms of creating effective group cooperation, which has instrumental value, this strategy performs poorly.
3Grognor12y
Oho! But it is not. You know, the nervousness associated with wanting to not be part of a cult is also a cult attractor. Once again I must point out that you are conveying only connotations, not denotations. No slogans! I tried this for a few weeks; it didn't change anything.
6John_Maxwell12y
That sounds wrong to me. I'm more motivated by making Less Wrong a good place to discuss ideas than any kind of nervousness.
4Viliam_Bur12y
Meta-ideas are ideas too. For example:

An idea: "The many-worlds interpretation of quantum physics is correct, because it's mathematically correct and simplest according to Occam's Razor (if Occam's Razor means selecting the interpretation with the greatest Solomonoff prior)." -- agree or disagree.

A meta-idea: "There is this smart guy called Eliezer. He wrote a series of articles about Occam's razor, the Solomonoff prior, and quantum physics; those articles are relatively easy for a layman to read, and they also explain mistakes frequently made when discussing these topics. Reading those articles before you start discussing your opinions (which would otherwise, with high probability, repeat those frequently made mistakes) is a good way to make the conversation efficient." -- agree or disagree.

This topic is about the meta-idea.
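For readers who haven't met the term: the Solomonoff prior of an observation string $x$ sums over all programs $p$ that print $x$ on a universal Turing machine $U$, weighting each program by its length $\ell(p)$ in bits (this is the standard textbook definition, supplied here as background rather than taken from the comment):

$$m(x) = \sum_{p \,:\, U(p) = x} 2^{-\ell(p)}$$

Shorter programs dominate the sum, which is the sense in which this prior formalizes Occam's Razor.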
3John_Maxwell12y
Yes, but that was not what Grognor wrote... He professed a bunch of beliefs and complained that he felt he was in the minority for holding them. He even explicitly discouraged discussing the individual beliefs. In other words, he prefers to discuss high-level concerns like whether you are with him or against him over low-level nuts-and-bolts details. Edit: I see that Grognor has added a statement of regret at the end of his post. I'm willing to give him some of his credibility back.
6Grognor12y
I don't like your tone. Anyway, this is wrong; I suspected I was part of a silent majority. Judging by the voting patterns (every comment indicating disagreement is above 15; every comment indicating agreement is below 15, and half are even negative) and the replies themselves, I was wrong, and the silence is because this actually is a minority position.

No! Just in this thread! There are all the other threads on the entire website to debate at the object level. I am tempted to say that fifteen more times, if you do not believe it.

O frabjous day, JMIV does not consider me to be completely ridiculous anymore. Could you be more patronizing?

Edit, in response to reply: In retrospect, a poll would have been better than what I ended up doing. But doing nothing would have been better still. At least we agree on that.
6John_Maxwell12y
Hm. Perhaps you could have created an anonymous poll if you wished to measure opinions? Anonymity means people are less likely to form affiliations. Just one thread devoted to politics is probably okay, but I would prefer zero.
4JoshuaZ12y
This may not indicate what you think it indicates. In particular, I (and I suspect other people) try to vote up comments that make interesting points even if we disagree with them. In this context, some upvoting may be due to the interestingness of the remarks, which in some contexts is inversely correlated with agreement. I don't think that this accounts for the entire disparity, but it likely accounts for some of it. This, combined with a deliberate desire to be non-cultish in voting patterns, may account for much of the difference.
0Grognor12y
See Cultish Countercultishness, unless of course you disagree with that too.

I stand by the sequences too. Except I'm agnostic about MWI, not because it makes no new predictions or some other silly reason like that, but because I'm not smart and conscientious enough to read that particular sequence plus a textbook on QM. And unlike you, I'm sure that Eliezer's answer to the metaethics problem is correct, to the point that I can't imagine how it could be otherwise.

It's true that one possible reason why good contrarians are hard to find is that the group is starting to be cult-like, but another such reason is that the contrarians are just wrong.

I think the Sequences got everything right and I agree with them completely.

Even Eliezer considers them first drafts (which is, after all, what they were: a two-year exercise in writing raw material for the forthcoming book), not things to be treated as received wisdom.

But, hey, maybe he's wrong about that.

I think it was cool of Grognor to make a meta-contrarian post like this one, and it's a good reminder that our kind have trouble expressing assent. As for me, I see some of the sequence posts as lies-to-children, but that's different from disagreeing.

But apparently we can hack it by expressing dissent of dissent.

-3faul_sname12y
How many levels of meta can we go?
-2Rain12y
Too many.
3Dorikka12y
Could you elaborate on this? I interpret 'lies-to-children' as meaning that a model is too basic and is wrong in some places because there are applicable details which it does not take into account -- would you not disagree with such things, if you don't think that such models actually form a correct map of reality?
5[anonymous]12y
The entire Quantum Physics sequence is a protracted lie to people who can't or don't want to do the math. This was explicit in the introduction to the sequence.

Well, I see it more like teaching someone about addition, and only covering the whole numbers, with an offhand mention that there are more numbers out there that can be added.

Drastic simplification, yes. Lie, no.

4khafra12y
I meant in the sense of lies-to-children, or Wittgenstein's Ladder. I cannot remember the primary posts that gave me that impression, and I know they were better than this concrete example, but it shows what I was talking about. At a sufficiently granular level, technically incorrect; but inarguably useful.

My opinions:

  • MWI seems obvious in hindsight.

  • Cryonics today has epsilon probability of success; maybe it is not worth its costs. If we disregard the costs, it is a good idea. (We could still subscribe to cryonics as an altruist act -- even if our chances of being successfully revived are epsilon, our contributions and example might support the development of cryonics, and the next generations may have chances higher than epsilon.)

  • Mainstream science is slow, but I doubt people will generally be able to do better with Bayes. Pressures to publish, dishonesty, cognitive biases, politics, etc. will make people choose wrong priors, filter evidence, etc., and then use Bayes to support their bottom line. But it could be a good idea for a group of x-rationalists to take scientific results and improve on them with Bayes.

  • I think our intuitions about morality don't scale well. Above some scale, it is all far mode, so I am not even sure there is a right answer. I think consequentialism is right, but computing all the consequences of a single act is almost impossible, so we have to use heuristics.

  • People in general are crazy, that's for sure. But maybe rationality does not have enough benefit to be better ...

[-][anonymous]12y140

This is only tangentially related, but:

Having Eliezer's large corpus of posts as the foundation for LW may have been helpful back in 2009, but at this point it's starting to become a hindrance. One problem is that the content is not being updated with the (often correct) critiques of Sequence posts that are continually being made in the comments and in the comments of the Sequence reruns. As a result, newcomers often bring up previously-mentioned arguments, which in turn get downvoted because of LW's policy against rehashing discussions. Additionally, the fact that important posts by other authors aren't added to the Sequences is part of what gives people the (incorrect) impression that the community revolves around Eliezer Yudkowsky. Having a static body of knowledge as introductory material also makes it look like the community consensus is tied to Eliezer's beliefs circa 2007-2009, which also promotes "phyg"-ish misconceptions about LW.

An alternative would be for LW to expect newcomers to read a variety of LW posts that the community thinks are important rather than just the Sequences. This would show newcomers the diversity of opinion on LW and allow the community's introductory material to be dynamic rather than static.

4Normal_Anomaly12y
As a matter of fact, the "Sequences" page contains the following as about 1/4 to 1/3 of its table of contents.
6[anonymous]12y
Yes, but newcomers aren't expected/instructed to read those. They are expected to be familiar with the Core/Major Sequences, which are all by Eliezer Yudkowsky and are not incrementally updated. Another way of putting it: When an LWer tells a newbie to "read the Sequences," that list is not the Sequences they are talking about.
4Viliam_Bur12y
The Sequences should be reorganized. It would be rational to admit that most new readers will not read all the articles (at least not now; later they may change their minds), so we could have a short list, a medium list, and a long list of articles. And even the short list should contain the best articles written by other authors. Well, it's a wiki; anyone can edit it. Also, anyone can make their own suggested list of "Short Sequences" and put it up for discussion and comments.
2Eugine_Nier12y
There are also many useful posts by others that aren't part of any sequence.
9steven046112y
Yes, and adding posts to the official canon only if they're presented as part of a sequence creates a perverse incentive to write long-windedly.

"I think mainstream science is too slow and we mere mortals can do better with Bayes."

I never understood this particular article of LW faith. It reminds me of that old saying "It's too bad all the people who know how to run the country are too busy driving taxicabs and cutting hair."

I agree that there is quite a bit of useful material in the stuff Eliezer wrote.

I reject MWI, reject consequentialism/utilitarianism, reject reductionism, reject computationalism, reject Eliezer's metaethics. There's probably more. I think most of the core sequences are wrong/wrongheaded. Large parts of it trade in nonsense.

I appreciate the scope of Eliezer's ambition though, and enjoy Less Wrong.

Can you explain all that to someone who largely agrees with EY?

reject reductionism,

This one I'm curious to hear about. Of everything on that list, this is generally pretty uncontroversial.

-14shminux12y
[-][anonymous]12y120

"There's been a lot of talk here lately about how we need better contrarians. I don't agree. I think the Sequences got everything right and I agree with them completely." But presumably theres room for you to be incorrect? Surely good contrarians would help clarify your views?

Also, when people talk about contrarians, they talk of specific conclusions present in the Sequences and on Less Wrong in general -- MWI, cryonics, specific ethical conclusions, and the necessity of FAI. I suspect most visitors to this website would agree with much of the Sequences; the best posts (to my mind) are clearly expressed arguments for rational thinking.

You are stating 7 points. That's at least 128 different worldviews (2^7) from there, depending on which points one agrees with.

I am a member of school number 25, since I agree with the last point and two more.

I doubt that there are many members of school number 127 here; you may be the only one.

(This isn't addressed at you, Thomas.)

For those who might not understand, Thomas is treating agreement-or-not on each bullet point as a 1 or 0, and stringing them together as a binary number to create a bitmask.

(I'm using the 0b prefix to represent a number written in its binary format.)

This means that 127 = 0b1111111 corresponds to agreeing with all seven bullet points, and 25 = 0b0011001 corresponds to agreeing with only the 3rd, 4th and 7th bullet points.

(Note that the binary number is read from left-to-right in this case, so bullet point 1 corresponds to the "most-significant" (left-most) bit.)
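A minimal sketch of that encoding in Python (the function names are hypothetical, added only for illustration):

```python
def encode(agreements):
    """agreements: seven booleans, one per bullet point, in order.
    Bullet point 1 ends up as the most-significant bit."""
    mask = 0
    for agrees in agreements:
        mask = (mask << 1) | int(agrees)
    return mask

def decode(mask, n_points=7):
    """Return the 1-based indices of the bullet points the mask agrees with."""
    return [i + 1 for i in range(n_points) if mask & (1 << (n_points - 1 - i))]

assert encode([True] * 7) == 127   # agrees with all seven points
assert decode(25) == [3, 4, 7]     # 25 = 0b0011001
```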

2Thomas12y
Excellently explained. Now, do we have 127?
2drnickbone12y
Anyone else voting for 0? Most of the opinions in the list sound so whacky that 0 is likely the default position of someone outside Less Wrong. I've been here a few months, and read most of the Sequences, but none of the bits in my own bitmap has flipped. Sorry Eliezer! The odd thing is that I find myself understanding almost exactly why Eliezer holds these opinions, and the perfectly lucid reasoning leading to them, and yet I still don't agree with them. A number of them are opinions I'd already considered myself or held myself at some point, but then later rejected. Or I hold a rather more nuanced or agnostic position than I used to.
0Thomas12y
What is the number of the most relevant points from the Sequences? Grognor's selection of those 7 may not be the best. Let me try:

Bit number: Statement
* 0: An intelligence explosion is likely in the (near) future
* 1: FOOM is possible
* 2: Total reductionism
* 3: Bayesianism is greater than science
* 4: Action to save the world is a must
* 5: No (near) aliens
* 6: FAI or die
* 7: CEV is the way to go
* 8: MWI
* 9: Evolution is stupid and slow

Now, I agree with those from 0 to 5 (the first six) in this list I've selected. The binary number would be "111111", or 63 in decimal notation. They were not new to me, all 10 of them. Yudkowsky's fiction is just great, BTW. "Three Worlds Collide" may be the best story I have ever read.
4Grognor12y
I'd like to point out that CEV is not in the sequences, and it looks mostly like a starting point idea from which to springboard to the "true" way to build an FAI.
-2Thomas12y
I don't care if the teaching is divided between the Sequences and elsewhere.
1prase12y
If I intended to encode my beliefs (which I don't), I couldn't, because I don't:
* know the precise difference between 0 and 1
* understand 2 -- what is total reductionism, especially in contrast to ordinary reductionism?
* see any novel insight in 9, which leads me to suspect I am missing the point
4khafra12y
Cryonics is good, and Bayes is better than science? An agreement bitmask is a fun perspective; I dunno why you got downvoted.
2Thomas12y
Bayes is better than science, yes. But it's not the cryonics that I like, as dbaupp explained.
4khafra12y
Whoops. I wasn't counting the sub-bullet as a power-of-two position; gotcha. FWIW, I still think the agreement bitmask is a fun perspective, even though I got it wrong (and there's the whole big-endian/little-endian question).
1dlthomas12y
That's cleared up by:

I largely agree with the Sequences, and I also don't care for "low-level rehashes of tired debates", but I am wary of dismissing all disagreement as "low-level rehashes of tired debates".

I think people participating on LW should be familiar with the arguments presented in the Sequences, and if they disagree, they should demonstrate that they disagree despite knowing those arguments. When people fail to do this, we should point it out, and people who repeatedly fail to do this should not be taken seriously.

0fortyeridania12y
I too agree with pretty much everything in the Sequences that I've read (which is nearly all of them, except the quantum bits), and I share your antipathy toward summary dismissal of disagreement. I was especially offended by the OP's request that commenters refrain from "substantiating" their objections.
3TheOtherDave12y
So, I agree that a request like that regarding some idea X contributes to the silence of those who disagree with X. That said, if there are a hundred people, and each person is (on average) ten times more likely to devote a unit of effort to talking about why X is wrong than to talking about why X is right even in situations where they are unsure whether X is wrong or right, then without some request along those lines there will effectively never be a discussion of why X is right. That isn't necessarily a problem; maybe we're OK with never having a discussion of why X is right. But it does mean that "summary dismissal" isn't a unilateral thing in cases like that. Yes, in such a case, if I make such a request, I am contributing to the silencing of the anti-X side as above... but if I fail to make such a request, I am contributing to the silencing of the pro-X side (though of course I can't be held accountable for it, since the responsibility for that silencing is distributed). I try to stay aware that the end result sometimes matters more than whether anyone can hold me accountable for it.
0fortyeridania12y
This is true. Good point. Also true.

Thank you for saying this, Grognor! As you say, being willing to come out and say so is an important antidote to that phyg-ish nervousness about expressing agreement.

That is pretty much my view as well. The only substantial disagreements I have with Eliezer are the imminence of AGI (I think it's not imminent at all) and the concept of a "Bayesian" superintelligence (Bayesian reasoning being nothing more than the latest model of thinking to be taken as the key to the whole thing, the latest in a long line of fizzles).

I think criticism of the OP on the grounds of conjunction improbability is unfounded. The components are not independent, and no one, including the OP, is saying it is all correct in every d...

I regard the Sequences as a huge great slab of pretty decent popular philosophy. They don't really tread much ground that isn't covered in modern philosophy somewhere: for me personally, I don't think the sequences influenced my thinking much on anything except MWI and the import of Bayes' theorem; I had already independently come to/found most of the ideas Eliezer puts forward. But then again, I hadn't ever studied philosophy of physics or probability before, so I don't know whether the same would have happened in those areas as well.

The novel stuff in th...

2thomblake12y
But there are so many, and so little time! I also found most of Eliezer's posts pretty obvious, also having read towering stacks of philosophy before-hand. But the signal-to-noise ratio is much higher for Eliezer than for most philosophers, or for philosophy in general.
[-][anonymous]12y60

I agree with everything in this article, except for one thing where I am undecided:

Furthermore, I agree with every essay I've ever read by Yvain, I use "believe whatever gwern believes" as a heuristic/algorithm for generating true beliefs, and don't disagree with anything I've ever seen written by Vladimir Nesov, Kaj Sotala, Luke Muelhauser, komponisto, or even Wei Dai; policy debates should not appear one-sided, so it's good that they don't.

But that is only because I have yet to read anything big not by EY on this site.

I think the sequences are amazing, entertaining, helpful and accurate. The only downside is that EY's writing style can seem condescending to some.

[-][anonymous]12y60

Yes.

I am also in this boat. Well said.

(or I simply do not know enough physics to know why Eliezer is wrong, which I think is pretty unlikely but not totally discountable).

How much physics do you know?

I tentatively accept Eliezer's metaethics, considering how unlikely it is that there will be a better one (maybe morality is in the gluons?)

What about this thread?

4Grognor12y
You are attacking my weak points, which is right and proper. I've taken introductory college physics, read the quantum bits in Motion Mountain, watched some Khan Academy videos, and read the Everett FAQ and this page, and the answer to your question is "not very much, really". The reason I think it's unlikely that I don't know enough physics to know whether Eliezer is right is that I don't think Eliezer doesn't know enough physics to know whether he is right. Scott Aaronson wiggled his eyebrows at that possibility without asserting it, which is why I have it under consideration. I don't see an alternative to Yudkowsky's metaethics in there, except for error theory, which looks to me more like an assertion that "there is no metaethics" (Edit: more like no correct metaethics) than a metaethics. I'm fairly ignorant about this, though, which is another reason for my somewhat wide confidence interval.
0bryjnar12y
Nitpick: error theory is more like saying "there is no ethics". In that sense it's a first-order claim; the metaethical part is where you claim that ethical terms actually do have some problematic meaning. As for alternative metaethics: have a read of this if you're interested in the kind of background from which Richard is arguing.

I agree. I think the reason it seems as though people who think the sequences are awesome don't exist is partially selection bias -- those who disagree with the sequences are the most likely to comment about disagreeing, thus lending disproportionate weight to people who disagree with certain parts of the sequences.

I found the sequences to be a life-changing piece of literature, and I usually consider myself fairly well read, especially compared to the general population. The sequences changed my worldview, and forced me to reevaluate ...

5JoshuaZ12y
In practice, scientists often look at multiple hypotheses at once and consider them with different likelihoods. Moreover, the dangers of human judgement exist whether or not one feels one is "certain" or is more willing to admit uncertainty.
0Thomas12y
You are asking too much. (Here I am, defending those Sequences against you -- me, who doesn't agree with them, against you, who sides with them. Funny, isn't it?)
4Tuxedage12y
Surer* Science cannot be absolutely omniscient. However, it should attempt to be as consistent as possible, up to the point where consistency must be sacrificed for truth. At least, that's my take on it.

I write this because I'm feeling more and more lonely, in this regard. If you also stand by the sequences, feel free to say that.

Most of your post describes my position as well, i.e. I consider the sequences (plus a lot of posts by Yvain, Lukeprog etc.) to be closer to truth than what I could work out by myself; though that only applies to stuff I'm sure I understand (i.e. not the Metaethics and QM).

I don't technically believe the literal word of all the sequences, but I do follow your general policy that if you find yourself disagreeing with someone smarter than you, you should just believe what they do; that this community way overvalues contrarians; and that the sequences are mostly right about everything that matters.

2[anonymous]12y
I agree, but I often find it hard to quantify people's insight into specific areas, not knowing how much my judgment might be halo effect; and if I don't understand a problem properly, I have a hard time telling how much of what the person is saying is correct/coherent. Mastery in knowing when to rely on others' understanding would indeed be an invaluable skill. Edit: I just realized a prediction market would give you that, if you tracked statistics for users.

To me personally, the most useful part of the Sequences is learning to ask the question "How does it feel from the inside?" This dissolves a whole whack of questions, like the proverbial [lack of] free will, the reality of belief, and many other contentious issues. This ties into the essential skill of constructing a mental model of the person asking a question or advancing an argument. I find that most participants on this forum fail miserably in this regard (I am no exception). I wonder if this is on the mini-camp's agenda?

(In the spirit of making forceful declarations of stances.)

I am a utilitarian consequentialist

Is this prescribed by the sequences? If so (and, I suppose, if not) I wholeheartedly reject it. All utilitarian value systems are both crazy and abhorrent to me.

and think that if allow someone to die through inaction, you're just as culpable as a murderer.

Your judgement is both distasteful to me as a way of evaluating the desirability of aspects of the universal wave function and highly suspect as a way of interacting practically with reality. Further, th...

2MarkusRamikin12y
Would you be willing to expand on this? (Or have you done so elsewhere that I could read?)

Since you agree with all of Eliezer's posts, I recommend that you reread this post right now.

I mostly agree with this, with one reservation about the "mainstream science is too slow" argument.

If we understood as much about a field as scientists do, then we could do better using Bayes. But I think it's very rare that a layman could out-predict a scientist just using Bayes, without studying the field to the point where they could write a paper in it.

I think mainstream science is too slow and we mere mortals can do better with Bayes.

I don't understand what this means at all. Who are these "mere mortals"? What are they doing, exactly, and how can they do it "better with Bayes"? If mainstream science is too slow, what will you replace it with?

9JGWeissman12y
I don't see any available path to replacing the current system of mainstream science, as performed by trained scientists, with a system of Bayesian science performed by trained Bayesian masters. However, comparing mainstream science to normative Bayesian epistemology suggests modifications to mainstream science worth advocating for. These include reporting likelihood ratios instead of p-values, pre-registering experiments, and giving replication attempts similar prominence to the novel results they attempt to replicate. For any of these changes, the "mere mortals" doing the science will be the same trained scientists who now perform mainstream science. It is important to remember that though there is much that Bayes says science is doing wrong, there is also much that Bayes says science is doing right.
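To make the first suggestion concrete, here is a minimal sketch contrasting the two report styles on the same data; the coin-flip experiment and all numbers are hypothetical, and it assumes scipy is available.

```python
from scipy.stats import binom

n, k = 100, 60        # flips performed, heads observed
p0, p1 = 0.5, 0.6     # H0: fair coin; H1: a (hypothetical) bias toward heads

# Frequentist report: probability of data at least this extreme under H0.
p_value = binom.sf(k - 1, n, p0)                    # P(X >= 60) ~ 0.028

# Likelihood-ratio report: how much more probable the exact data are
# under H1 than under H0.
likelihood_ratio = binom.pmf(k, n, p1) / binom.pmf(k, n, p0)   # ~ 7.5

print(f"p-value: {p_value:.4f}, likelihood ratio: {likelihood_ratio:.2f}")
```

Unlike the p-value, the likelihood ratio requires naming an explicit alternative hypothesis, which is part of why its advocates find it more informative.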
8JoshuaZ12y
It is possibly noteworthy that all of these are fairly mainstream positions about what sort of changes should occur. Pre-registration is already practiced in some subfields.
0JGWeissman12y
I suspect that pre-registering experiments and giving replication attempts similar prominence to the novel results they attempt to replicate are things that scientists who take "traditional rationality" seriously would advocate for, while lamenting the social structures that prevent science from taking its ideal form. But is the use of likelihood ratios instead of p-values also fairly mainstream?
2JoshuaZ12y
Not as mainstream, but it has been discussed, particularly on the grounds that likelihood ratios are easier to interpret intuitively. I don't have a reference off-hand, but most versions of the argument I've seen advocate including both.
0Grognor12y
See http://lesswrong.com/lw/1gc/frequentist_statistics_are_frequently_subjective/ and http://lesswrong.com/lw/ajj/how_to_fix_science/
0Bugmaster12y
These are good articles, and I agree with most of what they say. However, if your thesis is "mainstream science works too slowly, let's throw it all out and replace it with Bayes' Rule", then your plan is deeply flawed. There's more to science than just p-values.

Uh... This "phyg" thing has not died yet?

I wonder if anyone actually thinks this is clever or something. I mean, inventing ways to communicate that are obscure to the public is one of the signs of a cult. And acting like you have something to hide just makes people think you do. So while this might give us a smaller signature on Google's radar, I cringe to think of actual human newcomers to the site, and their impression.

Also, a simple call to sanity: the blog owner himself has written posts with cult in the very title. I just put "cult"...

0wedrifid12y
Disagree and strongly oppose your influence. Obfuscating an unnecessary Google keyword is not a sign of a cult. That word has an actual real-world meaning, and it evidently isn't the one you seem to think it is. It means this. I hope your needless defection is suppressed beyond the threshold for visibility. Mind you, I've never used the term "phyg" (well, except in the quote here) and don't plan to. Putting it as an actual tag is ridiculous. If you don't want to associate with a concept that is already irrelevant, just ignore it. Ironically, this means that the only part of your comment I can agree with is the initial "Uh... This 'phyg' thing has not died yet?" The justification of the aversion is muddled thinking.
4MarkusRamikin12y
What's your unmuddled justification? (not sarcasm, interested what you think)
0wedrifid12y
If you'll pardon my literal (and so crude) description of my reaction: "Ick. That word sucks dried monkey balls. You guys sound like total dorks if you say that all the time." I.e., I probably share approximately your sentiment, just sans any "that means you're cultish" component.
2MarkusRamikin12y
Pardoned. ;) Sorry, I didn't mean "that means you're cultish", more like "that will look cultish/fishy to some newcomers". (And also "I find it embarrassingly silly".)

I stand by the sequences. When I first found them, I knew that they were/are the single most comprehensive treatment of epistemic thinking that I was going to encounter anywhere.

-1Grognor12y
I'm sorry you've gotten downvoted. I upvoted your comment, and it sits at -1, which means at least two people downvoted it. I guess this sort of comment sets off "ick" or "useless" filters, like the article itself does.

Dropping out of lurk-mode to express support and agreement with your tone and aim, though not necessarily all of your points separately.

ETA: Argh, that came out clunky. Hopefully you grok the intent. Regardless; I know to whom I owe much of my hope, kindness and courage.

0Dorikka12y
Emphasis mine. Isn't this completely orthogonal to the topic at hand?
1Leonhart12y
You're absolutely right; I was seduced by sentence rhythm. Edited to remove.

I agree with your first 2 bullet points. I agree with the third, with the caveat that doing so leads to a greater risk of error. I'm a consequentialist with utilitarianism as a subset of my values. I think "culpable" refers to how society should treat people, and treating people who fail to save lives as murderers is infeasible and unproductive. I choose TORTURE over SPECKS in the relevant thought experiment, if we stipulate that there are 3^^^3 distinct possible people who can be specked, which in reality there aren't. I want to sign up for cryonics ...

I enjoyed reading (most of) the Sequences, but I'm stunned by this level of agreement. I doubt even Eliezer agrees with them completely. My father always told me that if I agreed with him on everything, at least one of us was an idiot. It's not that I have a sense of "ick" at your public profession of faith, but it makes me feel that you were too easily persuaded. Unless, of course, you already agreed with the sequences before reading them.

Like Jayson_Virissimo, I lean towards the Sequences. Parts I reject include MWI and consequentialism. In som...

1Desrtopa12y
Could you explain why?
-2Salemicus12y
EY convinced me that consciousness is causally active within the physical universe, and I have yet to find any good argument against the notion - just equivocations about the word "meaning." At the same time, I do accept the argument that no amount of third-person description sums to first-person experience. Hence, substance dualism. I am aware that this is not a full response, but I don't want to sidetrack the thread with an off-topic conversation.

My sense is that the attitude presented in the article and in Yvain's linked comment is problematic in somewhat the same way that asexual reproduction is problematic as an evolutionary strategy.

Agreeing with something so big (but internally coherent) wholesale is a normal reaction if you consume it over a short time and it doesn't contain anything you feel is a red flag. It can persist for a long time, and it can easily be either an actual agreement or an agreement "with an object", without integrating and cross-linking the ideas into your bigger map.

Fortunately, it looks like the "taboo" technique is quite effective on a deeper level, too. After avoiding using the terms, notions, and arguments from some body of knowledge in your thinking ...

Any belief that I can be argued into by a blog post I can also probably be argued out of by a blog post.

That sounds like you're saying that what beliefs you can be argued into is uncorrelated with what's true. I don't think you mean that.

1JoshuaZ12y
That doesn't necessarily follow from RomeoSteven's remark. It may take more subtle or careful arguing to persuade someone into a belief that is false. So in practice, what beliefs one can be argued into will be correlated with being true.
1Normal_Anomaly12y
Oh, so you think RomeoSteven was allowing for varying argument skill in the blog posts in question. That makes sense. Anything I can be argued into by a blog post, I can be argued out of by a blog post, if Omega takes up contrarian blogging. :)

"This of course makes me a deranged, non-thinking, Eliezer-worshiping fanatic for whom the singularity is a substitute religion."

Yes, it does. Seek professional help.
