None of your arguments land, and I think that's the reason you're getting downvoted: they are mere outlines of arguments that don't actually make their case by getting into the details. You seem to hope we'll intuit the rest of your arguments, but you've posted this in a place that's maximally unlikely to share an intuition that would lead you to think that Eliezer is deeply irrational rather than merely mistaken on some technical points.
I think the average LessWrong reader would love to know if Eliezer is wrong about something he wrote in the Sequences, but that requires both that he actually be wrong and that you clearly argue your case that he's wrong. Otherwise it's just noise.
The problems and irrationalities in Eliezer's writings
Changing Your Metaethics
If your metaethic tells you to kill people, you can probably pinpoint the exact reasoning for why, for example killing in self-defense. Generalizing over all cases of metaethics, to say that "murder is wrong" + "metaethics tells you to kill someone" = "you shouldn't care about metaethics", is deeply irrational.
That is irrational. So what if life is "empty"? How does something making you feel bad or empty make it untrue? That is an appeal to emotion.
Completely irrational. The absence of a "philosophy student of perfect emptiness" does not mean you should keep believing what you always believed, keep doing what you always did, and let yourself be guided by your animal and human instincts.
Another important thing. In "No Universally Compelling Arguments", Eliezer says that not all minds agree. Fine. But rationality is a tool for predicting the future and steering it into a desired region. If we live in the same universe, following the same laws of physics, then the tools that work best are more similar to each other than the tools that don't. So convergence is possible. We have some evidence of that, and no evidence to the contrary. So rationality can possibly be singular, and the smartest minds can agree on everything. I am not saying they do, and I am not saying all minds; I am saying the SMARTEST minds. Even if the smartest minds have different utility functions, they can still fully understand each other's point of view and validate that what the others are doing is rational, even while pursuing different goals.
Something to Protect
In this essay, Eliezer advocates that there should be something more important, more valuable, than rationality itself. But what if rationality conflicts with that value and shows it to be worthless? It seems to me Eliezer is advocating holding onto something such that, if it conflicts with rationality, you should discard rationality in that specific case in favor of the more valuable thing. Which is deeply irrational.
Comment by Caledonian, explaining the irrationality here.
This is completely true. Using rationality, we have discarded the existence of gods. Yet if you use rationality only as a means toward an end, then you will disregard rationality when it conflicts with your religion. Therefore, rationality cannot be just a means toward an end, but must also be something that can shape what your ends should be.
Using rationality in one case and disregarding it in others is irrational.
I suggest reading more of Caledonian's comments there, like this one.
I think this is a great comment. Why would rationality be a special case compared to any other science, any other truth-finding method? Why would rationality need something else, when other sciences don't?
The Gift We Give To Tomorrow
Completely irrational. He assumes love has value, yet presents no argument for it. Love is simply a chemical reaction, a drug. You could create a concentrated drug 100x better than real love, and give that "love" to those who desperately need it. His argument would then logically support administering that drug. If he is against this, then why? If he is against it, then love is not the most valuable thing to him, which proves that it is not.
Or imagine someone who has a loving family but also a brain defect that doesn't let him produce enough of the love chemicals, and he suffers because of it. If you would artificially increase that chemical to help that person, then why not go one step further? And then another? And then another? Where is the moral stopping line between helping people with a love-chemical deficiency and helping people wirehead themselves?
Good comment showing another problem:
If you replace "love" in this article with "theistic spirituality" -- another aspect of the human psychology which many, if not most, humans consider deeply important and beautiful -- and likewise replace mutatis mutandis other parts of the dialog, would it not just as well argue for the propagation of religion to our descendants?
I think that is spot on. Intuitive, instinctive morality and emotions, feelings, are very similar to religions. They are irrational, with no proof beyond themselves, beyond self-recursion. Yet you bend all the notions of rationality, clear thinking, and intelligence to suit those ideas.
Comment by Caledonian:
More comments:
Another comment:
It 100% can!
Simulation problem
One HUGE counterargument to everything Eliezer says: there is a non-zero chance we are living in a simulation. If we live in a simulation, that changes everything. Not finding out what that simulation is like, and instead dabbling in earthly pleasures, is the most irrational thing to do. If your goal is pleasure, then the outside of the simulation might offer FAR more pleasure than we could ever imagine.
If he says that fake simulations can be as valuable as the real universe, then drugging yourself inside a fake simulation is also good. That would be an argument in favor of wireheading: constantly drugging yourself into pleasure and happiness.
Human intelligence limit problem
The BIGGEST fallacy Eliezer makes: he is making a choice that will affect the future forever, and he assumes he is smart enough to make the right choice, when, as I have demonstrated, he is not that smart and is very much irrational.
He himself knows how irrational and limited human intelligence is within the grand space of possible intelligences. It would be analogous to letting the future be decided by a drug addict. Yes, seriously: a drug addict is not that far from Eliezer. A drug addict wishes to be high forever. So does Eliezer, just in a more dignified way: instead of meth, it's interaction with other people, love.
You cannot be sure you made the correct decision until you become as smart as you could be. Increased intelligence WILL make better decisions. Maybe not every increased intelligence, but certainly some, far better than we currently can. Assuming you already know the correct thing, the correct decision, the correct values, is assuming you know EVERYTHING about the universe, without any evidence, just with feelings. Very arrogant.
You wouldn't trust the future of Earth to a monkey, nor to a drug addict, nor to a person from the Middle Ages. Yet compared to smarter humans we are at their level of dumbness, and compared to a superintelligence we are at the level of ants. So why would WE trust the future of the universe to OUR judgment? I am not saying we should leave the universe in the hands of a paperclip maximizer. Rather, we should use AI to increase rationality and good judgment in general, so that we make better and better decisions.
The Genetic Fallacy
I can demonstrate that Eliezer is breaking the same principles of rationality that he himself advocates. Do as I say, not as I do.
This applies perfectly to Eliezer.
An untrustworthy source. Eliezer advocates notions of morality and value that have been with humans from the start: the value of love, happiness, pleasure; the idea that death and torture are bad and cruel. The values he advocates coincidentally don't conflict with human nature, like people's inherent dislike of seeing suffering, instilled by evolution.
We already know that humans, human culture, emotions, evolution, and animal instincts are a very untrustworthy source of truth. I hope you are smart enough to understand this.
Coincidentally, all those beliefs from that untrustworthy source turned out to be right. The beliefs ingrained into us since childhood by evolution, culture, instincts, and emotions coincidentally turned out to be rational and true, and Eliezer advocates for them. Coincidentally, rationality and deeply irrational human animal instincts turned out to be in agreement. We are so lucky! Nothing going on here! /s
Comment:
Completely true. A true crisis of faith still awaits Eliezer. He is doing everything he advises against.
Rebelling Within Nature
I am just baffled.
His argument is: sure, love is a product of evolution; there is no rational basis for it beyond itself; but it makes me feel good, therefore we should value it. Love being a product of evolution doesn't make it meaningless; it being irrational doesn't mean we shouldn't do it.
You do see the problems with this, right?
Just because something feels right or pleasant doesn't mean it's meaningful or true. Truth has no obligation to be comforting.
This kind of logic can be used to justify things that would seem terrible to Eliezer too: for example, rape, drugs, wireheading, etc. It feels good, therefore it's correct and has value.
I hope I have demonstrated why Eliezer is deeply wrong now, and why his original position on the meaning of life and on value, from his writing "FAQ about Meaning of life", is the correct vision, from which he regrettably turned away.
Eliezer Yudkowsky - The Meaning of Life
FAQ about Meaning of life
The original position Eliezer had, which I think is correct, is this:
We don't know what matters, what has value. A greater intelligence has a better chance of figuring out what has value. Therefore, we need to improve our intelligence, and create superintelligence, for exactly that purpose.
I think that is the correct answer. It is an honest admission that we don't know what has value, and that smarter minds are better equipped to tackle this problem.
You can send me your feedback, and we can chat about it.
My email: zhumaotebay@gmail.com
Discord: percybay
Twitter: Radlib4