Your friend disagreed with you about a question of value. Why would that make them confused? Is there a single objective value system that all non-confused people should adhere to?
I think you could at least argue that it's not about values but about a lack of imagination. If the problem is really that
[...] you'll eventually learn all you can learn
this seems to ignore all sorts of ways to solve that problem, such as
It's conceivable that the person would change their view if they thought about it for long enough & immortality were on the table.
Your three examples are all kind of wireheady. If the Friend has preferences for constant learning *and* against fakeness, that would be a consistent set of preferences that would imply rejection of immortality.
To put it another way, you may be assuming the Friend is actually a dullness minimiser, who mistakenly thinks that only learning reduces dullness. But maybe learning is their actual terminal value.
What indicated to me that he is deeply confused is that he believes (a) in an afterlife, and (b) that extending this life interrupts subsequent lives, which is related to (a). If it were simply a matter of not valuing a longer life, I wouldn't have the same response.
Rafael mentions the issue of lack of imagination, where my friend is worried that you'll eventually learn all you can learn and life will become dull. To me, this indicates confusion, but not the type of deep confusion that would make me sort someone into the stupid bucket.
I can see the contradiction between one afterlife and multiple ones. I don't see the issue of "interruption".
Assuming hypothetically that you do cycle from one life to the next, why also assume that unnaturally extending your current life will negatively interfere with/interrupt the subsequent life?
The standard New Age theory of reincarnation is that each lifespan is intended to teach you one specific thing. So it works like courses or units at a university: if you don't complete the unit, you don't get the credit. And once you have the credit, there is no point in hanging around. I don't suppose that many rationalists would regard that as true, but it is consistent.
Even if both things are consistent with a broader theory, they still seem like distinct errors. As a different example, "I'll go to hell if I sin" and "Homosexuality is a sin" are both consistent with the broader theory of Christianity, but I think they're still distinct errors.
I don't think it matters too much though. The purpose of establishing them to be distinct errors is to establish that he is deeply confused, but either one of them alone (well, (b) wouldn't make sense without (a)) would be more than sufficient, right?
Again, this boils down to using "confused" to mean "has an opinion I disagree with".
Edit: if you are in a context where you can achieve correctness beyond mere consistency, by all means do so. But transhumanism and life extension are not that context, because they are so entangled with values and preferences.
Imagine that we take someone who has a bunch of ridiculous beliefs and we remove them one by one. And suppose that we say that this person is becoming Less Wrong. I don't think this is what the name "Less Wrong" was really intended to mean, but let's run with it anyway.
For what it's worth, I think that is what it was intended to mean. Rather than having the yardstick of other people, and asking "am I more right than other people?", one has the yardstick of one's past self, and asks "am I less wrong than I was before?". This specifically focuses your attention on ways that you might have a bunch of ridiculous beliefs, and be just as ignorant of them as the people around you whose ridiculous beliefs you find it easy to spot.
His response was that too long of a life is a curse, because you'll eventually learn all you can learn
But how many years would it take to get there? I think that 100 is not enough, even if I could somehow retain perfect health. And I am not even thinking about "all" I could learn, but only about math and computer science.
Imagine that we take someone who has a bunch of ridiculous beliefs and we remove them one by one. And suppose that we say that this person is becoming Less Wrong. ... With this idea in mind, what would it mean to become More Right?
You can only be right about something, so the answer is acquiring correct domain knowledge. Preferably without having the ridiculous beliefs distort your process of evaluating evidence. Whether it is evidence about quantum physics, baking cookies, curing cancer, or understanding how society works.
Becoming supremely rational per se would not give you knowledge about things. But it would make learning faster, because you would notice confusion, recognize fake explanations, generalize better, notice alternatives, recover from your mistakes upon getting further evidence, etc. So it would take you less time to learn things and to discover new things. It would also allow you to notice more opportunities for improvement.
ridiculous beliefs.
Is there a longer list?
But most people who have ridiculous beliefs aren't humble about them.
It might be worth noting that people who are humble about a belief are less likely to bring it up.
I was asking a different question: beyond 'religion' and 'life will get boring', what other silly things do people believe (that are incorrect), or is that it?
There are certainly lots of other silly things that are incorrect that people believe. Wikipedia's list of fallacies is a good place to start.
Additive and subtractive manufacturing processes describe a much wider range of possible things to build than just one of the two.
I think I understand what you are talking about, and I have the same feeling. Somehow it feels like, when I am thinking or talking about something, I explore the ramifications, the consequences of those ramifications, and how I think they will interact with and change the other concepts. It requires imagination and a will to let the ideas take you somewhere, instead of trying to control the outcomes based on what you want to believe.
About less wrong/more right: it makes sense. The doctor trying to be "more right" maybe means that he is trying to optimize what makes him successful in his practical life, and understanding the evidence problem simply won't make him more successful. The optimization process, "via positiva", means he just takes what works and optimizes it. Trying to be less wrong means that, instead of just optimizing what works, you keep trying new ideas; and instead of optimizing what you already know, you cut away what you know to be false, "via negativa".
I'm not sure what to do to be humble or support it in others. I try to adhere to a number of haphazard rules, like "don't spread Facebook posts that are supposed to be news but also ask the readers to express their feelings on the subject". Maybe it sums to something.
But being "positively" humble, actively trying to not just not-sidestep but go forward... feels weird. As if the truth is a blade to be looked at only as it descends for the kill and never before; at the moment when whatever you do, it won't matter.
(It happens in fiction, even in good fiction trying to improve people's thinking. Because it's just too satisfying.)
I can be relatively humble when I remember the blade, but it's neither a permanent solution nor particularly safe.
I don't understand your blade analogy. In what way is positive humility analogous to ignoring a blade while it's sheathed/merely in motion?
(Not the commenter above.)
Getting good at something isn't just about doing it right when you do it, it's also about doing it for practice/looking for opportunities.
Sidestep-humility:
Don't be foolish/ignorant.
Positive humility:
Seek out knowledge.
In what way is positive humility analogous to ignoring a blade while it's sheathed/merely in motion?
Not having/using 'positive humility' (while having the other kind) is like ignoring a blade and trying to cut vegetables up with your hands.
You are definitely not understanding the analogy made above, nor my confusion. The analogy was meant to illustrate a failure mode of having only positive humility. EDIT: This reply apparently correctly understood what was meant, but not what was said.
Maybe some way exists, but that's not what I mean.
I like to think about the blade as being always in motion, since even when it's sheathed it's moving along to being drawn. PH is analogous to not ignoring it. It's just easier when you can see it coming.
That is the opposite of what you said.
But being "positively" humble, actively trying to not just not-sidestep but go forward… feels weird. As if the truth is a blade to be looked at only as it descends for the kill and never before; at the moment when whatever you do, it won’t matter.
Do you see how this phrasing is specifically claiming that positive humility is analogous to ignoring the blade?
Sorry, English is not my native language. I meant that X feels like Y, feels as if Y is Z. I'll try to be more careful.
Well, you've already gone off-target from that goal, because what you said here makes even less sense. I can't tell what you mean by "X feels like Y, feels as if Y is Z", even so far as to judge it true or false.
Do you mean that "X feels like a situation in which Y is Z"?
Do you mean "The way X feels like Y is similar to the way Y feels like Z"?
Do you mean "X makes me feel like Y is like Z"?
None of these interpretations actually makes your comment make sense, because none of them are analogous to the original positive-humility/truth-blade simile, but I can't even tell which of these you intended.
Before I came across LessWrong, I felt like I was the only sane person in the world.[1] This excerpt from HPMOR illustrates those feelings well:
Here's an example. I have a friend who I study poker with. When we analyze hands, we dive deep into probabilities and expected values and game theory. We've done this hundreds and hundreds of times, and he's really good at doing it. He also seems quite sane overall.
The other day I tried something different with him. I told him that I estimate that the risk of dying from the coronavirus costs $6/hr in EV for someone playing live poker and asked what his estimate is. It's just probabilities and expected values, not radically different from when we analyze a poker hand together. I also said that as you value life more (yours and other people's), it becomes more and more costly, and that I personally am hoping for some sort of life extension (anti-aging, cryonics, AI takeoff, etc.).
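(For concreteness, here's a minimal sketch of the kind of arithmetic behind an estimate like that. Every number in it is a hypothetical placeholder, not a figure from our actual conversation.)

```python
# A minimal sketch of a $/hr EV estimate for COVID risk at a live poker table.
# All three inputs are hypothetical placeholders, not real figures.
p_infection_per_hour = 1e-4      # assumed chance of getting infected per hour played
infection_fatality_rate = 5e-3   # assumed chance that an infection is fatal
value_of_life = 10_000_000       # assumed dollar value you place on your own life

p_death_per_hour = p_infection_per_hour * infection_fatality_rate
ev_cost_per_hour = p_death_per_hour * value_of_life

print(f"EV cost: ${ev_cost_per_hour:.2f}/hr")  # -> EV cost: $5.00/hr
```

Note that the cost scales linearly with `value_of_life`, which is why I said that the more you value life, the more costly playing becomes.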
His response was that too long of a life is a curse, because you'll eventually learn all you can learn, and because you stop the evolution of your consciousness when you unnaturally extend this life. Instead, you should be moving or cycling to whatever is next.
It caught me off guard. I genuinely wasn't expecting that from him. But alas:
This has been discussed many times before.
Outside The Laboratory:
Changing the Definition of Science:
Doctor, There are Two Kinds of "No Evidence":
I've always been really frustrated by this. And this frustration has led me to sort people into two different buckets: smart and stupid. You're smart if you don't have any of these ridiculous beliefs. Otherwise, you're stupid.[2]
Well, I guess I can give you a pass if you are truly open-minded about your ridiculous belief and genuinely demonstrate virtues like humility.[3] For example, if the doctor in the above example said, "Huh, maybe I'm wrong about the way I'm thinking about this. Let's discuss it," I'd give that a pass and put him in the smart bucket. But most people who have ridiculous beliefs aren't humble about them. They might be humble about other things, but not about the ridiculous beliefs. They're usually not interested in the idea that these beliefs may be wrong.
This is a pretty low bar in my opinion. You don't have to be knowledgeable, creative, insightful, skilled, etc. As long as you avoid ridiculous beliefs, or are at least open-minded about them, I categorize that as being in the smart bucket. But still, even with this low bar, too few people are sane enough to reach it.
Imagine that we take someone who has a bunch of ridiculous beliefs and we remove them one by one. And suppose that we say that this person is becoming Less Wrong. I don't think this is what the name "Less Wrong" was really intended to mean[4], but let's run with it anyway. With this idea in mind, what would it mean to become More Right?
Well, what is it that made Robert Aumann successful? What about the doctor? What made him successful, despite his not understanding what evidence is? Maybe the answer to these questions is the process of becoming More Right.
I'm not sure how to get more precise about this distinction, or about what exactly the implications are, but it seems useful to distinguish between becoming Less Wrong and becoming More Right.
Footnotes:
[1] I suspected that there were others and that I just hadn't come across them yet. Particularly some of the Nobel Prize intellectual types, but I didn't know enough about them to be able to tell whether they'd eventually "demonstrate that something deep inside their brain was confused".
[2] Of course I don't mean to imply that this is the way to draw the line between smart and stupid. Part of it is that it helps me quell my frustration, but the bigger part of it is that I think it's a useful way to draw the line. People in one bucket behave in importantly different ways from the people in the other bucket. Again, there are certainly other useful ways to draw lines.
[3] I was going to make a stronger statement and require that other virtues be present as well, but maybe they aren't needed. Imagine someone who has the virtue of humility but not empiricism. You could explain empiricism to them, and then they'll have that virtue (to some extent anyway), almost like a character adding a badge in a video game. Even if they don't have the virtue of lightness or curiosity, if they are truly humble, maybe you can explain lightness and curiosity as well. Maybe humility is the seed that everything else can grow from.
[4] I agree with Ruby's interpretation that the name is trying to point at the fact that no human is anywhere close to being right about everything. That we all have a long way to climb on the intelligence staircase.