and don't disagree with anything I've ever seen written by Vladimir Nesov, Kaj Sotala, Luke Muelhauser, komponisto, or even Wei Dai
This confuses me since these people are not in agreement on some issues.
This confuses me as well, especially since I was a major contributor to "talk here lately about how we need better contrarians" which the OP specifically disagreed with.
Where you and those other fellows disagree is typically on policy questions, where I tend to not have any strong opinions at all. (Thus, "don't disagree".) If you will point to a specific example where you and one of those other fellows explicitly disagree on a factual question (or a metaethical question, if you don't consider that a subset of factual questions), I will edit my comment.
Addendum: I agree more with you than Eliezer about what to do, especially re: Some Thoughts and Wanted: Backup Plans.
There's a comparison in the back of my mind that may be pattern matching to religious behavior too closely: when one talks to some Orthodox Jews, some of them will claim that they believe everything that some set of major historical rabbis said (say Maimonides and Rashi) when one can easily point to issues where those rabbis disagreed. Moreover, they often use examples like Maimonides or Ibn Ezra who had positions that are in fact considered outright heretical by much of Orthodox Judaism today. I've seen a similar result with Catholics and their theologians. In both cases, the more educated members of the religion seem less inclined to do so, but even the educated members sometimes make such claims albeit with interesting doublethink to justify how the positions really aren't contradictory.
It may be that in those cases, saying "I agree with this list of people and have never seen them as wrong" is a statement of tribal affiliation, declaring agreement with various high-status people in the tribe. It is possible that something similar is happening in this context. Alternatively, it may just indicate that Grognor hasn't read that much by you or by some of the other people on the list.
"Think for yourself" sounds vaguely reasonable only because of the abominable incompetence of those tasked with thinking for us.
-- Steven Kaas
I write this because I'm feeling more and more lonely, in this regard. If you also stand by the sequences, feel free to say that. If you don't, feel free to say that too, but please don't substantiate it. I don't want this thread to be a low-level rehash of tired debates, though it will surely have some of that in spite of my sincerest wishes.
Why must we stand-by or stand-away-from? Personally, I lean towards the Sequences. Do you really need to feel lonely unless others affirm every single doctrine?
I think the many-worlds interpretation of quantum mechanics is the closest to correct and you're dreaming if you think the true answer will have no splitting (or I simply do not know enough physics to know why Eliezer is wrong, which I think is pretty unlikely but not totally discountable).
I accept the MWI of QM as "empirically adequate"; no more, no less.
I think cryonics is a swell idea and an obvious thing to sign up for if you value staying alive and have enough money and can tolerate the social costs.
Cryonics is interesting and worth considering, but the probabilities involved are so low that it is not at all obvious it is a net win after factoring in signallin...
You seem to mostly disagree in spirit with all Grognor's points but the last, though on that point you didn't share your impression of the H&B literature.
I'll chime in and say that at some point about two years ago I would have more or less agreed with all six points. These days I disagree in spirit with all six points and with the approach to rationality that they represent. I've learned a lot in the meantime, and various people, including Anna Salamon, have said that I seem like I've gained fifteen or twenty IQ points. I've read all of Eliezer's posts maybe three times over and I've read many of the cited papers and a few books, so my disagreement likely doesn't stem from not having sufficiently appreciated Eliezer's sundry cases. Many times when I studied the issues myself and looked at a broader set of opinions in the literature, or looked for justifications of the unstated assumptions I found, I came away feeling stupid for having been confident of Eliezer's position: often Eliezer had very much overstated the case for his positions, and very much ignored or fought straw men of alternative positions.
His arguments and their distorted echoes lead one to think that various p...
Disagree, utilitarianism is retarded.
When making claims like this, you need to do something to distinguish yourself from most people who make such claims, who tend to harbor basic misunderstandings, such as an assumption that preference utilitarianism is the only utilitarianism.
Utilitarianism has a number of different features, and a helpful comment would spell out which of the features, specifically, is retarded. Is it retarded to attach value to people's welfare? Is it retarded to quantify people's welfare? Is it retarded to add people's welfare linearly once quantified? Is it retarded to assume that the value of structures containing more than one person depends on no features other than the welfare of those persons? And so on.
he doesn't address the most credible alternatives to MWI
I don't think you need to explicitly address the alternatives to MWI to decide in favor of MWI. You can simply note that every interpretation of quantum mechanics does one of three things: 1) fails to specify which worlds exist, 2) specifies which worlds exist but does so through a burdensomely detailed mechanism, or 3) admits that all the worlds exist, noting that worlds splitting via decoherence is implied by the rest of the physics. Am I missing something?
I think the Sequences got everything right
That is quite a bit of conjunction you've got going on there. It would be rather extraordinary if it were true, and I've yet to see appropriately compelling evidence of this. Based on what evidence I do see, I think the sequences, at least the ones I've read so far, are probably "mostly right", interesting and perhaps marginally useful to very peculiar kinds of people for ordering their lives.
I also think the sequences are badly-organized and you should just read them chronologically instead of trying to lump them into categories and sub-categories, but I digress.
I think I agree with this.
Many-worlds is a clearly explicable interpretation of quantum mechanics and dramatically simpler than the Copenhagen interpretation revered in the mainstream. It rules out a lot of the abnormal conclusions that people draw from Copenhagen, e.g. ascribing mystical powers to consciousness, senses, or instruments. It is true enough to use as a model for what goes on in the world; but it is not true enough to lead us to any abnormal beliefs about, e.g., morality, "quantum immortality", or "possible worlds" in the philosophers' sense.
Cryonics is worth developing. The whole technology does not exist yet; and ambitions to create it should not be mistaken for an existing technology. That said, as far as I can tell, people who advocate cryonic preservation aren't deluded about this.
Mainstream science is a social institution commonly mistaken for an epistemology. (We need both. Epistemologies, being abstractions, have a notorious inability to provide funding.) It is an imperfect social institution; reforms to it are likely to come from within, not by abolishing it and replacing it with some unspecified Bayesian upgrade. Reforms worth supporting include performing and publ...
the Copenhagen interpretation revered in the mainstream
Is it? I think that the most widely accepted interpretation among physicists is the shut-up-and-calculate interpretation.
Furthermore, I agree with every essay I've ever read by Yvain, I use "believe whatever gwern believes" as a heuristic/algorithm for generating true beliefs, and don't disagree with anything I've ever seen written by Vladimir Nesov, Kaj Sotala, Luke Muelhauser, komponisto, or even Wei Dai;
Wow. One of these is not like the others! (Hint: all but one have karma > 10,000.)
In all seriousness, being placed in that group has to count as one of the greatest honors of my internet life.
So I suppose I can't be totally objective when I sing the praises of this post. Nonetheless, it is a fact that I was planning to voice my agreement well before I reached the passage quoted above. So, let me confirm that I, too, "stand by" the Sequences (excepting various quibbles which are of scant relevance in this context).
I'll go further and note that I am significantly less impressed than most of LW by Holden Karnofsky's critique of SI, and suspect that the extraordinary affective glow being showered upon it is mostly the result of Holden's affiliation with GiveWell. Of course, that affective glow is so luminous (the post is at, what, like 200 now?) that to say I'm less impresse...
Once we get into the habit of openly professing our beliefs for the sake of our micropolitics, we are losing a large portion of our bandwidth to pure noise. I realise some amount of group allegiance signalling is always inevitable when an opinion is expressed, even when questions are asked, but the recent post by Dmytry together with this post are too explicit in this manner. The latter is at least honest, but I have voted both down nevertheless.
My "ick" sense is being set off pretty heavily by the idea of people publicly professing their faith in a shared set of beliefs, so this post makes me deeply uncomfortable. At best something like this is a self congratulatory applause light which doesn't add anything, at worst it makes us less critical and leads us further towards the dreaded c word.
I disagree. From checking "recent comments" a couple times a day as is my habit, I feel like the past few days have seen an outpouring of criticism of Eliezer and the sequences by a small handful of people who don't appear to have put in the effort to actually read and understand them, and I am thankful to the OP for providing a counterbalance to that.
I'm pretty sad a post that consists entirely of tribal signaling has been voted up like this. This post is literally asking people to publicly endorse the Sequences because they're somehow under attack (by whom?!) and no one agrees with them any more. Because some of us think having more smart people who disagree with us would improve Less Wrong? I find glib notions of a "singularity religion" obnoxious but what exactly is the point of asking people here to announce their allegiance to the site founder's collection of blog posts?
This post is literally asking people to publicly endorse the Sequences because they're somehow under attack (by whom?!) and no one agrees with them any more.
To me it seems more like a spreading attitude of -- "The Sequences are not that important. Honestly, almost nobody reads all of them; they are just too long (and their value does not correspond to their length). Telling someone to 'read the Sequences' is just a polite way to say 'fuck you'; obviously nobody could mean that literally."
I have read most of the Sequences. In my opinion it is material really worth reading; not only better than 99% of internet, but if it became a book, it would be also better than 99% of books. (Making it a book would be a significant improvement, because there would be an unambiguous ordering of the chapters.) It contains a lot of information, and the information makes sense. And unlike many interesting books, it is not "repeating the basic idea over and over, using different words, adding a few details". It talks about truth, then about mind, then about biases, then about language, then about quantum physics, etc. For me reading the Sequences was very much worth my time,...
There's such a thing as a false dissent effect. (In fact I think it's mentioned in the Sequences, although not by that name. Right now I'm too lazy to go find it.) If only those who disagree with some part or another of the Sequences post on the subject, we will get the apparent effect that "Nobody takes Yudkowsky seriously - not even his own cultists!" This in spite of the fact that everyone here (I assume) agrees with some part of the Sequences, perhaps as much as 80%. Our differences are in which 20% we disagree with. I don't think there's anything wrong with making that clear.
For myself, I agree that MWI seems obvious in retrospect, that morality is defined by the algorithm of human thought (whatever it is), and that building programs whose output affects anything but computer screens without understanding all the details of the output is a Bad Idea.
I think mainstream science is too slow and we mere mortals can do better with Bayes.
Then why aren't we doing better already?
The institutional advantages of the current scientific community are so apparent that it seems more feasible to reform it than to supplant it. It's worth thinking about how we could achieve either.
There's a problem here in that Bayesian reasoning has become quite common in the last 20 years in many sciences, so it isn't clear who "we" should be in this sort of context.
Indeed. For anyone who has worked at all in oil & gas exploration, the LW treatment of Bayesian inference and decision theories as secret superpowers will seem perplexing. Oil companies have been basing billion dollar decisions on these methods for years, maybe decades.
I am also confused about what exactly we are supposed to be doing. If we had the choice of simply becoming ideal Bayesian reasoners then we would do that, but we don't have that option. "Debiasing" is really just "installing a new, imperfect heuristic as a patch for existing and even more imperfect hardware-based heuristics."
I know a lot of scientists - I am a scientist - and I guess if we were capable of choosing to be Bayesian superintelligences we might be progressing a bit faster, but as it stands I think we're doing okay with the cognitive resources at our disposal.
Not to say we shouldn't try to be more rational. It's just that you can't actually decide to be Einstein.
I can't understand this. How could the sequences not be relevant? Half of them were created when Eliezer was thinking about AI problems.
That doesn't mean anything said in them is responsive to Holden's arguments.
I don't see what this adds beyond making LW more political. Let's discuss ideas, not affiliations!
If you agree with everything Eliezer wrote, you remember him writing about how every cause wants to be a cult. This post looks exactly like the sort of cultish entropy that he advised guarding against to me. Can you imagine a similar post on any run-of-the-mill, non-cultish online forum?
It worries me a lot that you relate ideas so strongly to the people who say them, especially since most of the people you refer to are so high status. Perhaps you could experimentally start using the less wrong anti-kibitzer feature to see if your perception of LW changes?
I stand by the sequences too. Except I'm agnostic about MWI, not because it makes no new predictions or another silly reason like that, but because I'm not smart and conscientious enough to read that particular sequence + a textbook about QM. And unlike you I'm sure that Eliezer's answer to the metaethics problem is correct, to the point that I can't imagine how it could be otherwise.
It's true that one possible reason why good contrarians are hard to find is that the group is starting to be cult-like, but another such reason is that the contrarians are just wrong.
I think the Sequences got everything right and I agree with them completely.
Even Eliezer considers them first drafts (which is, after all, what they were: a two-year exercise in writing raw material for the forthcoming book), not things to be treated as received wisdom.
But, hey, maybe he's wrong about that.
I think it was cool of Grognor to make a meta+ contrarian post like this one, and it's a good reminder that our kind have trouble expressing assent. As for me, I see some of the sequence posts as lies-to-children, but that's different from disagreeing.
Well, I see it more like teaching someone about addition, and only covering the whole numbers, with an offhand mention that there are more numbers out there that can be added.
Drastic simplification, yes. Lie, no.
My opinions:
MWI seems obvious in hindsight.
Cryonics today has epsilon probability of success, so maybe it is not worth its costs. If we disregard the costs, it is a good idea. (We could still subscribe to cryonics as an altruist act -- even if our chances of being successfully revived are epsilon, our contributions and example might support development of cryonics, and the next generations may have chances higher than epsilon.)
Mainstream science is slow, but I doubt people will generally be able to do better with Bayes. Pressures to publish, dishonesty, cognitive biases, politics etc. will make people choose wrong priors, filter evidence, etc. and then use Bayes to support their bottom line. But it could be a good idea for a group of x-rationalists to use scientific results and improve them by Bayes.
I think our intuitions about morality don't scale well. Above some scale, it is all far mode, so I am not even sure there is a right answer. I think consequentialism is right, but computing all the consequences of a single act is almost impossible, so we have to use heuristics.
People in general are crazy, that's for sure. But maybe rationality does not have enough benefit to be better...
This is only tangentially related, but:
Having Eliezer's large corpus of posts as the foundation for LW may have been helpful back in 2009, but at this point it's starting to become a hindrance. One problem is that the content is not being updated with the (often correct) critiques of Sequence posts that are continually being made in the comments and in the comments of the Sequence reruns. As a result, newcomers often bring up previously-mentioned arguments, which in turn get downvoted because of LW's policy against rehashing discussions. Additionally, the fact that important posts by other authors aren't added to the Sequences is part of what gives people the (incorrect) impression that the community revolves around Eliezer Yudkowsky. Having a static body of knowledge as introductory material also makes it look like the community consensus is tied to Eliezer's beliefs circa 2007-2009, which also promotes "phyg"-ish misconceptions about LW.
An alternative would be for LW to expect newcomers to read a variety of LW posts that the community thinks are important rather than just the Sequences. This would show newcomers the diversity of opinion on LW and allow the community's introductory material to be dynamic rather than static.
"I think mainstream science is too slow and we mere mortals can do better with Bayes."
I never understood this particular article of LW faith. It reminds me of that old saying "It's too bad all the people who know how to run the country are too busy driving taxicabs and cutting hair."
I agree that there is quite a bit of useful material in stuff Eliezer wrote.
I reject MWI, reject consequentialism/utilitarianism, reject reductionism, reject computationalism, reject Eliezer's metaethics. There's probably more. I think most of the core sequences are wrong/wrongheaded. Large parts of it trade in nonsense.
I appreciate the scope of Eliezer's ambition though, and enjoy Less Wrong.
"There's been a lot of talk here lately about how we need better contrarians. I don't agree. I think the Sequences got everything right and I agree with them completely." But presumably theres room for you to be incorrect? Surely good contrarians would help clarify your views?
Also, when people talk about contrarians they talk of specific conclusions present in the sequences and in less wrong in general- MWI, cryonics, specific ethical conclusions, and the necessity of FAI. I suspect most visitors to this website would agree with much of the sequences- the best posts (to my mind) are clearly expressed arguments for rational thinking.
You are stating 7 points. That gives at least 128 different worldviews, depending on which ones one agrees with.
I am a member of school number 25, since I agree with the last point and two more.
I doubt that there are many school number 127 members here; you may be the only one.
(This isn't addressed at you, Thomas.)
For those who might not understand, Thomas is treating agreement-or-not on each bullet point as a 1 or 0, and stringing them together as a binary number to create a bitmask.
(I'm using the 0b prefix to represent a number written in its binary format.)
This means that 127 = 0b1111111 corresponds to agreeing with all seven bullet points, and 25 = 0b0011001 corresponds to agreeing with only the 3rd, 4th and 7th bullet points.
(Note that the binary number is read from left-to-right in this case, so bullet point 1 corresponds to the "most-significant" (left-most) bit.)
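For concreteness, here is a minimal sketch of that encoding in Python (mine, not from the thread; the function name and the list-of-booleans input are just illustrative assumptions):

```python
def school_number(agreements):
    """Encode seven agree/disagree booleans as a single integer,
    reading left to right (bullet point 1 is the most significant bit)."""
    n = 0
    for agrees in agreements:
        n = (n << 1) | int(agrees)  # shift previous bits left, append this one
    return n

# Agreeing with only the 3rd, 4th, and 7th bullet points -> school 25:
print(school_number([False, False, True, True, False, False, True]))  # 25

# Agreeing with all seven points -> school 127:
print(school_number([True] * 7))  # 127
```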
I largely agree with the Sequences, and I also don't care for "low-level rehashes of tired debates", but I am wary of dismissing all disagreement as "low-level rehashes of tired debates".
I think people participating on LW should be familiar with the arguments presented in the Sequences, and if they disagree, they should demonstrate that they disagree despite knowing those arguments. When people fail to do this, we should point it out, and people who repeatedly fail to do this should not be taken seriously.
Thank you for saying this, Grognor! As you say, being willing to come out and say such is an important antidote to that phyg-ish nervous expression.
That is pretty much my view as well. The only substantial disagreements I have with Eliezer are the imminence of AGI (I think it's not imminent at all) and the concept of a "Bayesian" superintelligence (Bayesian reasoning being nothing more than the latest model of thinking to be taken as the key to the whole thing, the latest in a long line of fizzles).
I think criticism of the OP on the grounds of conjunction improbability is unfounded. The components are not independent, and no-one, including the OP, is saying it is all correct in every d...
I regard the Sequences as a huge great slab of pretty decent popular philosophy. They don't really tread much ground that isn't covered in modern philosophy somewhere: for me personally, I don't think the sequences influenced my thinking much on anything except MWI and the import of Bayes' theorem; I had already independently come to/found most of the ideas Eliezer puts forward. But then again, I hadn't ever studied philosophy of physics or probability before, so I don't know whether the same would have happened in those areas as well.
The novel stuff in th...
I agree with everything in this article, except for one thing where I am undecided:
Furthermore, I agree with every essay I've ever read by Yvain, I use "believe whatever gwern believes" as a heuristic/algorithm for generating true beliefs, and don't disagree with anything I've ever seen written by Vladimir Nesov, Kaj Sotala, Luke Muelhauser, komponisto, or even Wei Dai; policy debates should not appear one-sided, so it's good that they don't.
But that is only because I have yet to read anything big not by EY on this site.
I think the sequences are amazing, entertaining, helpful and accurate. The only downside is that EY's writing style can seem condescending to some.
(or I simply do not know enough physics to know why Eliezer is wrong, which I think is pretty unlikely but not totally discountable).
How much physics do you know?
I tentatively accept Eliezer's metaethics, considering how unlikely it is that there will be a better one (maybe morality is in the gluons?)
What about this thread?
I agree. I think the reason it seems as though people who think the sequences are awesome don't exist is partially selection bias -- those who disagree with the sequences are most likely to comment about disagreeing, thus lending disproportionate weight to people who disagree with certain parts of the sequences.
I found the sequences to be a life changing piece of literature, and I usually consider myself fairly well read, especially in regards to the general population. The sequences changed my worldview, and forced me to reevalua...
I write this because I'm feeling more and more lonely, in this regard. If you also stand by the sequences, feel free to say that.
Most of your post describes my position as well, i.e. I consider the sequences (plus a lot of posts by Yvain, Lukeprog etc.) to be closer to truth than what I could work out by myself; though that only applies to stuff I'm sure I understand (i.e. not the Metaethics and QM).
I don't technically believe the literal word of all the sequences, but I do follow your general policy that if you find yourself disagreeing with someone smarter than you, you should just believe what they do. I also agree that this community way overvalues contrarians, and that the sequences are mostly right about everything that matters.
To me personally, the most useful part of the Sequences is learning to ask the question "How does it feel from the inside?" This dissolves a whole whack of questions, like the proverbial [lack of] free will, reality of belief and many other contentious issues. This ties into the essential skill of constructing a mental model of the person asking a question or advancing an argument. I find that most participants on this forum fail miserably in this regard (I am no exception). I wonder if this is on the mini-camp's agenda?
(In the spirit of making forceful declarations of stances.)
I am a utilitarian consequentialist
Is this prescribed by the sequences? If so (and, I suppose, if not) I wholeheartedly reject it. All utilitarian value systems are both crazy and abhorrent to me.
and think that if you allow someone to die through inaction, you're just as culpable as a murderer.
Your judgement is both distasteful to me as a way of evaluating the desirability of aspects of the universal wave function and highly suspect as a way of interacting practically with reality. Further, th...
I mostly agree with this, with one reservation about the "mainstream science is too slow" argument.
If we understood as much about a field as scientists, then we could do better using Bayes. But, I think it's very rare that a layman could out-predict a scientist just using Bayes without studying the field to the point where they could write a paper in that field.
I think mainstream science is too slow and we mere mortals can do better with Bayes.
I don't understand what this means at all. Who are these "mere mortals"? What are they doing, exactly, and how can they do it "better with Bayes"? If mainstream science is too slow, what will you replace it with?
Uh... This "phyg" thing has not died yet?
I wonder if anyone actually thinks this is clever or something. I mean, inventing ways to communicate that are obscure to the public is one of the signs of a cult. And acting like you have something to hide just makes people think you do. So while this might give us a smaller signature on Google's radar, I cringe to think of actual human newcomers to the site, and their impression.
Also, a simple call to sanity: the blog owner himself has written posts with cult in the very title. I just put "cult"...
I stand by the sequences. When I first found them, I knew that they were/are the single most comprehensive treatment of epistemic thinking that I was going to encounter anywhere.
Dropping out of lurk-mode to express support and agreement with your tone and aim, though not necessarily all of your points separately.
ETA: Argh, that came out clunky. Hopefully you grok the intent. Regardless; I know to whom I owe much of my hope, kindness and courage.
I agree with your first 2 bullet points. I agree with the third, with the caveat that doing so leads to a greater risk of error. I'm a consequentialist with utilitarianism as a subset of my values. I think "culpable" refers to how society should treat people, and treating people who fail to save lives as murderers is infeasible and unproductive. I choose TORTURE over SPECKS in the relevant thought experiment, if we stipulate that there are 3^^^3 distinct possible people who can be specked, which in reality there aren't. I want to sign up for cry...
I enjoyed reading (most of) the Sequences, but I'm stunned by this level of agreement. I doubt even Eliezer agrees with them completely. My father always told me that if I agreed with him on everything, at least one of us was an idiot. It's not that I have a sense of "ick" at your public profession of faith, but it makes me feel that you were too easily persuaded. Unless, of course, you already agreed with the sequences before reading them.
Like Jayson_Virissimo, I lean towards the Sequences. Parts I reject include MWI and consequentialism. In som...
My sense is that the attitude presented in the article and in Yvain's linked comment is problematic in somewhat the same way as asexual reproduction is problematic as an evolutionary strategy.
Agreeing with something so big (but coherent inside itself) wholesale is a normal reaction if you consume it over a short time and it doesn't contain anything you feel is a red flag. It can persist for a long time, but it can easily be either an actual agreement or an agreement "with an object" without integrating and cross-linking the ideas into your bigger map.
Fortunately, it looks like the "taboo" technique is quite effective on a deeper level, too. After avoiding using terms, notions, arguments from some body of knowledge in your thinking...
Any belief that I can be argued into by a blog post I can also probably be argued out of by a blog post.
That sounds like you're saying that what beliefs you can be argued into is uncorrelated with what's true. I don't think you mean that.
"This of course makes me a deranged, non-thinking, Eliezer-worshiping fanatic for whom the singularity is a substitute religion."
Yes, it does. Seek professional help.
Edit, May 21, 2012: Read this comment by Yvain.
- Peter de Blanc
There's been a lot of talk here lately about how we need better contrarians. I don't agree. I think the Sequences got everything right and I agree with them completely. (This of course makes me a deranged, non-thinking, Eliezer-worshiping fanatic for whom the singularity is a substitute religion. Now that I have admitted this, you don't have to point it out a dozen times in the comments.) Even the controversial things, like:
There are two tiny notes of discord on which I disagree with Eliezer Yudkowsky. One is that I'm not so sure as he is that a rationalist is only made when a person breaks with the world and starts seeing everybody else as crazy, and two is that I don't share his objection to creating conscious entities in the form of an FAI or within an FAI. I could explain, but no one ever discusses these things, and they don't affect any important conclusions. I also think the sequences are badly-organized and you should just read them chronologically instead of trying to lump them into categories and sub-categories, but I digress.
Furthermore, I agree with every essay I've ever read by Yvain, I use "believe whatever gwern believes" as a heuristic/algorithm for generating true beliefs, and don't disagree with anything I've ever seen written by Vladimir Nesov, Kaj Sotala, Luke Muelhauser, komponisto, or even Wei Dai; policy debates should not appear one-sided, so it's good that they don't.
I write this because I'm feeling more and more lonely, in this regard. If you also stand by the sequences, feel free to say that. If you don't, feel free to say that too, but please don't substantiate it. I don't want this thread to be a low-level rehash of tired debates, though it will surely have some of that in spite of my sincerest wishes.
Holden Karnofsky said:
I can't understand this. How could the sequences not be relevant? Half of them were created when Eliezer was thinking about AI problems.
So I say this, hoping others will as well:
I stand by the sequences.
And with that, I tap out. I have found the answer, so I am leaving the conversation.
Even though I am not important here, I don't want you to interpret my silence from now on as indicating compliance.
After some degree of thought and nearly 200 comment replies on this article, I regret writing it. I was insufficiently careful, didn't think enough about how it might alter the social dynamics here, and didn't spend enough time clarifying, especially regarding the third bullet point. I also dearly hope that I have not entrenched anyone's positions, turning them into allied soldiers to be defended, especially not my own. I'm sorry.