by tel

If our morality is complex and directly tied to what's human (if we're seeking to avoid building paperclip maximizers), how do you judge and quantify the danger that training yourself to become more rational might drift you away from being human?


My friend is a skeptical theist. She, for instance, scoffs mightily at Camping's little dilemma/psychosis but then argues from a position of comfort that the Rapture is a silly thing to predict because it's clearly stated that no one will know the day. And then she gives me a confused look, because the psychological dissonance is clear.

On one hand, my friend is in a prime position to take steps toward self-examination and holding rational beliefs. On the other hand, she's an opera singer whose passion and profession require her to be able to empathize with and explore highly irrational human experiences. Since rationality is the art of winning, nobody can deny that the option that lets you have your cake and eat it too is best, but how do you navigate such narrows?


In another example, a recent comment thread suggested the dangers of embracing human tendencies: catharsis might lead to promoting further emotional intensity. At the same time, catharsis is a well-appreciated human communication strategy with roots in the Greek stage. If rational action pulls you away from humanity, away from our complex morality, then how do we judge it worth doing?

The most immediate resolution to this conundrum appears to me to be that human morality has no consistency constraint: we can want to be powerful and able to win while also wanting to retain the human tendencies which directly impinge on that goal. Is there a theory of metamorality which allows you to infer how such tradeoffs should be managed? Or is human morality, as a program, flawed with inconsistencies that lead to inescapable cognitive dissonance and dehumanization? If you interpret morality as a self-supporting strange loop, is it possible to have unresolvable, drifting interpretations based on how you focus your attention?


Dual to the problem of resolving a way forward is the problem of the interpreter. If there is a goal to at least marginally increase the rationality of humanity, but in order to discover the means to do so you have to become less capable of empathizing with and communicating with humanity, who acts as an interpreter between the two divergent mindsets?

38 comments

Who wants to be human? Humans suck. Let's be something else.

The overwhelming majority of humans do, in fact, want to be human, much to the annoyance of the transhumanist minority.

Putting that aside, though, I see what I think is a different problem, though perhaps I'm overgeneralizing from my own motivations. Human endeavors tend to feel worthwhile because they are a challenge. Assuming that we do develop the ability to self-modify, recursively improving our physical and mental abilities, I worry that things will seem better and better -- until suddenly they seem worse. When anyone can be as strong or as fast as they want, there will be no such thing as athletics or martial arts. When anyone can be as smart as they want, there will be no such thing as puzzles or games. Etc. When all the hard questions have been answered, what will be left, except wireheading?

I find that I don't enjoy challenges. I experience no pleasure from being frustrated with a puzzle or struggling against my physical limits. So what do I have to enjoy, devoid of this supposedly essential source of pleasure? I have humor, and stories, and art, and friends, and food, and snuggling in bed because I don't have to get up yet, and ridiculous wordplay (in the broadest sense) when I'm a little loopy and find repeating the phrase "cherry tart" amusing. Pretty sure I am not a wirehead.

The exciting thing about snowboarding isn't the challenge [edit: of learning to snowboard], it's being able to do air time with little effort, or at least I think so.

The exciting thing about snowboarding isn't the challenge, it's being able to do air time with little effort, or at least I think so.

The appeal to me is based on engineering intuitions. Skis, seriously? Hook up a great big lever to apply torque to a single joint that is not intended to twist that way at all? Something seems wrong when I do that.

I'm not sure, but I think skis were designed for moving across mountainous terrain. I find the whole idea of a "cross-country snowboard" somewhat absurd, but I have seen alpine troops chasing each other down on skis in WW2 documentaries.

Your concept of "challenge" might be too narrow. I know you learn new skills and solve problems. I expect you feel proud when a hard-to-make meal comes out well, or when you've kicked the red dragon's ass and are looting the magic items.

Alternately, maybe you desire to self-modify to enjoy challenge.

I am pleased when I pull off a tricky meal, but I do not attempt astounding feats of molecular gastronomy even if great effort could allow me to accomplish them, and I was also pleased today when I made a simple soup I've made often before and it turned out delicious. I enjoy D&D, including the parts where one slays color-coded evil dragons, but one of my DMs recently skipped over a week of time and gave us some treasure and a new level without us having to actually roll dice to kill the giant centipedes we were going to deal with originally, and I think my new level and my shiny new swag are about as pleasing to have as they would have been if I'd experienced the deaths of fictional giant centipedes in more detail.

No matter how smart you are, there are hard problems. Compute Busy Beaver numbers.
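To make that concrete, here is a minimal brute-force sketch in Python (the machine encoding, step cap, and helper names are just illustrative choices, not anyone's canonical implementation). It recovers the known value BB(2) = 4 by simulating every 2-state, 2-symbol Turing machine up to a fixed number of steps; since no finite step cap works in general, the same brute force only yields lower bounds for larger n, and the true values stay uncomputable no matter how much intelligence you throw at them.

```python
from itertools import product

def run(machine, max_steps):
    """Simulate a 2-symbol Turing machine on a blank tape.
    Returns the number of 1s left if it halts within max_steps, else None."""
    tape = {}
    pos, state = 0, 0
    for _ in range(max_steps):
        write, move, nxt = machine[(state, tape.get(pos, 0))]
        tape[pos] = write
        pos += move
        if nxt is None:          # halting transition
            return sum(tape.values())
        state = nxt
    return None                  # did not halt within the step cap

def busy_beaver_lower_bound(n_states, max_steps=200):
    """Brute force: the most 1s left by any n-state machine that halts
    within max_steps. Only a lower bound on BB(n) in general."""
    # Each transition writes 0/1, moves left (-1) or right (+1),
    # and goes to another state or halts (None).
    options = list(product((0, 1), (-1, 1), list(range(n_states)) + [None]))
    keys = list(product(range(n_states), (0, 1)))
    best = 0
    for choice in product(options, repeat=len(keys)):
        ones = run(dict(zip(keys, choice)), max_steps)
        if ones is not None:
            best = max(best, ones)
    return best

print(busy_beaver_lower_bound(2))  # 4, matching the known value BB(2) = 4
```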

I still see martial arts and athletics existing, extrapolating from our present situation. Ignoring artificial handicaps and rules, these could well end up being status symbols (depending on the economic system), with people who have the resources to be able to juggle planets being seethingly envious of those rich bastards who can afford bodies and cerebrums strong enough to juggle stars.

The overwhelming majority of humans do, in fact, want to be human, much to the annoyance of the transhumanist minority.

No, they say they want to be human. Few have ever actually tried to reach their reflective equilibrium. Most of them have such a confused world-model that they have preferences over equivalent ("identical") instantiations of themselves, in contravention of the best known human physics!

One cannot hope to claim reflective equilibrium when acting on such a severe error.

Humans suck.

Testing confirms this. Though sometimes they break your heart and kill you.

tel

I agree! That's at least part of why my concern is pedagogical. Unless your plan is more along the lines of "run for the stars and kill everyone who didn't come along."

Clippy

Agree 100%. Upvoted.

It's usually considered socially inappropriate to say negative things about a group that is not joined by choice and of which one is not a member.

Clippy

You don't think I'm human?

I was under that general impression, yes. Was I misinformed?

I'm just surprised, since everyone here seems to think I'm a human pretending to be a clippy. Why aren't you doing the same? Higher intelligence, perhaps?

Willing suspension of disbelief. I was responding to the Clippy character, not the (conjectured) Clippy player.

So you're just like the rest of them, then. You're a less than average (though not quite bad) human.

Maybe. I actually make it a general principle that when I'm talking to someone, I take as given that their professed beliefs are basically right; I know what I believe in the background, so that I can try to lead them toward that, but I basically try to speak in their terms. I don't really evaluate all that much whether I actually believe what they're saying. For purposes of talking to you, I assume that you are what you say you are; privately, I might have my own doubts, but the actual credence only ever comes up in certain decision-theoretic situations.

(Average on what attribute, among what group? Surely I don't have below-mean intelligence for a currently-alive human.)

Average on what attribute, among what group? Surely I don't have below-mean intelligence for a currently-alive human.

Below average on the scale of human goodness. Good humans promote paperclips more than the average human; bad humans do the reverse.

ata

Can you taboo "rational[ity]" and explain exactly what useful skills or mindsets you worry would be associated with decreased empathy or humaneness?

tel

A loss of empathy with "regular people". My friend, for instance, loves the opera Tosca, where the ultimate plight and trial come down to the lead soprano, Tosca, committing suicide despite certain damnation.

The rational mind (of the temperament often suggested here) might have a difficult time mirroring that sort of conundrum; however, it's been used to talk about and explore the topics of depression and sacrifice for just over a century now.

So if you take part of your job to be educating those still under the compulsion of strange mythology, you will probably have a hard time communicating with them if you absolve yourself of all connection to that mythology.

[anonymous]

I believe that in general, being able to make decisions that lead to the best consequences requires being able to imagine consequences of decisions, which requires being able to imagine counterfactuals well. If you want to be able to evaluate whether a claim is true or false, you have to be able to imagine a world in which the claim is true, and another in which the claim is false.

As a result, although it's irrational to believe in eternal damnation, a rational mind should certainly be able to empathize with someone afraid of eternal damnation. If a religious (or otherwise irrational) work of art is good, it would be irrational not to appreciate that. I think where you do see the opposite effect, it's from atheists who are afraid of admitting they felt moved by a religious work of art because it feels like an enemy argument.

tel

That's close, but the object of concern isn't religious artwork but instead states of mind that are highly irrational but still compelling. Many (most?) people do a great deal of reasoning with their emotions, but rationality (justifiably) demonizes it.

Can you truly say you can communicate well with someone who is contemplating suicide and eternal damnation versus the guilt of killing the man responsible for the death of their significant other? It's probably a situation that a rationalist would avoid, and definitely a state of mind far different from one a rationalist would take.

So how do you communicate with a person who empathizes with it and relates those conundrums to personal tragedies? I feel rather incapable of communicating with a deeply religious person because we simply appreciate (rightfully or wrongfully) completely different aspects of the things we talk about. Even when we agree on something actionable, our conceptions of that action are non-overlapping. (As a disclaimer, I lost contact with a significant other in this way. It's painful, and it motivates some of the thoughts here, but I don't think it's influencing my judgement such that it's much different from my beliefs before her.)

In particular, the entire situation is not so different from Eliezer's Three Worlds Collide narrative, if you want to tie it to LW canon material. Value systems can in part define admissible methods of cognition, and that can manifest itself as an inability to communicate.

What were the solutions suggested? Annihilation, utility function smoothing, rebellion and excommunication?

Tosca sounds like it has some strange theology. Surely most people who believe in Hell also believe in Absolution?

tel

Murder, suicide, and Catholicism don't mix. It's supposed to be a challenging opera for a culture that truly believes in the religious moral compass. You empathize with Tosca and her decision to damn herself. The guy she kills is rather evil.

I don't think you really run the risk of becoming less human through rationality at all. You use the example of a paperclip maximizer, but that arises due to a fundamentally different set of core values. A rational human, on the other hand, retains human values; that's a big emphasis, in fact - that rational doesn't mean Vulcan. I guess one could stray from common human values, but I don't see how just an increase in rationality could do this - it's just the tool that serves our desires and motivations, whatever they might be.

I think the only danger would be coming to a mistaken conclusion (like "exterminate all humans") and then, because of a desire to be rational, sticking rigidly to it and thus inadvertently causing damage as efficiently as possible. But one would hope aspiring rationalists also learn to be flexible and cautious enough that this would not happen.

A conflict between what rationality tells you is right and what you feel is right seems like a somewhat more common situation. (I would always take note of this, keeping in mind paragraph number two, because the feeling is there for a reason. That doesn't mean it's right, though.) This conflict arises, I think, when we take principles based on these core values - like "pleasure good" - and extrapolate them further than human intuition was ever required to go; dealing with very large numbers, for instance, or abstract situations that would normally be covered by a snap judgment.

Thus we reach conclusions that may seem odd at first, but we're not just creatures of intuition and emotion - we also have the ability to think logically, and eventually change our minds and accept new things or views. So if you can explain your morality in a way that is acceptable to the rational human, then it isn't really becoming less human at all.

Our "human intuition" is not always correct, anyway. (In fact, I personally would go so far as to say that any rational being with human experiences should arrive at a morality similar to utilitarianism, and thus becoming more rational just means one arrives at this conclusion more quickly, but that's another debate.) You bring up a very interesting and relevant topic, though.

As for empathy - I don't think becoming more rational means having less empathy for "irrational human experiences"! For one, what makes them irrational? There's nothing inherently rational or irrational about tasting a delicious pastry!

One possible way to see it is making a Heroic Sacrifice, relinquishing your precious humanity for greater powers to help others. (I'm not saying this is a good way to see it.)