I. Humans are emotion-feeling machines.
I don’t mean that humans are machines that happen to feel emotions. I mean that humans are machines whose output is the feeling of emotions—“emotion-feeling” is the thing of value that we produce.
Not just “being happy.” If that were the goal, wireheading would be the ultimate good rather than the go-to utopia-horror example. But emotions must be involved, because everything else one can do is merely a means to an end. Producing things, propagating life, even thinking – they all seem like useful endeavors, but a life of maximizing those things would suck. And the implication would be that if we can create a machine that does those things better than we can, it would be good to replace ourselves with that machine and set it to reproduce itself infinitely.
I recently saw a statement to the effect of “Art exists to produce feelings in us that we want, but do not get enough of in the course of normal life.” That’s what makes art valuable – supplementing emotional malnutrition. Such a thing exists because “to feel emotions” is the core function of humanity, and not fulfilling that function hurts like not eating does.
This is why (for many people) the optimal level of psychosis is non-zero. This is why intelligence is important – a greater level of intelligence allows a species to experience far more complex and nuanced emotional states. And the ability to experience more varieties of emotion is why it’s better to become more complex rather than simply dialing up happiness. It’s why disorders that prevent us from experiencing certain emotions are so awful (with the worst, obviously, being the ones that prevent us from feeling the “best” emotions).
It’s why we like funny things, and tragic things, and scary things. Who wants to feel the way they feel after watching all of Evangelion?? Turns out – everyone, at some point, for at least a little bit of time!
It is why all human life has value. You do not matter based on what you can produce, or how smart you are, or how useful you are to others. You matter because you are a human who feels things.
My utility function is to feel a certain elastic web of emotions; it differs from other people’s utility functions in which emotions are desired, and in what amounts. My personality determines which actions produce which emotions.
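To make that picture concrete, here is a toy sketch in Python. Every name and number below is invented for illustration – the “web” is just a weighting over emotions, and “personality” is just a mapping from actions to the emotions they produce.

```python
# Toy model (all names and numbers invented): a utility function as a
# weighted web of desired emotions, with "personality" as the mapping
# from actions to the emotions those actions produce in this person.

# How much of each emotion this particular person wants to feel.
desired_web = {"joy": 0.4, "awe": 0.3, "melancholy": 0.2, "fear": 0.1}

# Personality: which actions produce which emotions, for this person.
personality = {
    "watch_tragedy": {"melancholy": 0.8, "awe": 0.2},
    "play_doom":     {"fear": 0.6, "joy": 0.4},
}

def utility(action: str) -> float:
    """Score an action by how well its emotional output matches the web."""
    produced = personality.get(action, {})
    return sum(desired_web.get(e, 0.0) * amount for e, amount in produced.items())

print(round(utility("play_doom"), 2))  # fear: 0.1*0.6 + joy: 0.4*0.4 = 0.22
```

Two people with the same `desired_web` but different `personality` mappings would rank the same actions differently – which is the point.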
And a machine that could feel things even better than humans can could be a wonderful thing. Greg Egan’s "Diaspora" features an entire society of uploaded humans, living rich, complex lives of substance. Loving, striving, crying, etc. The society can support far more humans than is physically possible in meat-bodies, running far faster than is possible in realspace. Since all these humans are running on computer chips, one could argue that this is not “A society of uploaded humans” but “A machine that feels human emotions better than meat-humans do.” And it’s a glorious thing. I would be happy to live in such a society.
II. God Mode is Super Lame
Why not just wirehead with a large and complex set of emotions?
I’m old enough to have played the original Doom when it came out (sooo old!). It had a cheat-code that made you invincible, commonly called god-mode. The first thing you notice is that it’s super cool to be invincible and just mow down all those monsters with impunity! The next thing you notice is that after a while (maybe ten minutes?) it loses all appeal. It becomes boring. There is no game anymore, once you no longer have to worry about taking damage. It becomes a task. You start enabling other cheats to get through it faster. Full-ammo cheats, to just use the biggest, fastest gun nonstop and get those monsters out of your way. Then walk-through-wall cheats, so you can just go straight to the level exit without wandering around looking for keys. Over, and over, and over again, level after level. It becomes a Kafka-esque grotesquery. Why am I doing this? Why am I here? Is my purpose just to keep walking endlessly from Spawn Point to Exit, the world passing around me in a blur, green and blue explosions obscuring all vision? When will this end?
It was a relief to be finished with the game.
That was my generation’s first brush with the difference between goal-oriented objectives, and process-oriented objectives. We learned that the point of a game isn’t to get to the end, the point is to play the game. It used to be that if you wanted to be an awesome guitarist, you had to go through the process of playing guitar a LOT. There was no shortcut. So one could be excused for confusing “I want to be a rock star” with “I want to be playing awesome music.” Before cheat codes, getting to the end of the game was fun, so we thought that was our objective. After cheat-codes we could go straight to the end any time we wanted, and now we had to choose – is your objective really just to get to the end? Or is it to go through the process of playing the game?
Some things are goal-oriented, of course. Very few people clean their toilets because they enjoy the process of cleaning their toilet. They want their toilet to be clean. If they could push a button and have a clean toilet without having to do the cleaning, they would.
Process-oriented objectives still have a goal. You want to beat the game. But you do not want first-order control over the bit “Game Won? Y/N”. You want first-order control over the actions that can get you there – strafing, shooting, jumping – resulting in second-order control over whether the bit finally gets flipped.
First-order control is god mode. Your goal is completed with full efficiency. Second-order control is indirect. You can take actions, and those actions will, if executed well, get you closer to your goal. They are fuzzier, you can be wrong about their effects, their effects can be inconsistent over time, and you can get better at using them. You can tell if you’d prefer god-mode for a task by considering if you’d like to have it completed without going through the steps.
Do you want to:
Have Not Played The Game, And Have It Completed? or Be Playing The Game?
Have A Clean Toilet, Without Cleaning It Yourself? or Be Cleaning The Toilet?
Be At The End of a Movie? or Be Watching The Movie?
If the answer is the first option, you want first-order control. If it is the second option, you want second-order control.
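The distinction can be sketched in a few lines of Python. This is an invented illustration, not anything from Doom itself: god mode is first-order control (the goal bit is set directly), while ordinary play is second-order control (actions nudge you toward the goal, fallibly).

```python
# Illustrative sketch (invented names): first-order vs second-order control.
import random

class Game:
    def __init__(self) -> None:
        self.won = False
        self.progress = 0

    def god_mode(self) -> None:
        """First-order control: flip 'Game Won?' directly. No play happens."""
        self.won = True

    def take_action(self, action: str) -> None:
        """Second-order control: strafe/shoot/jump nudge you toward the goal,
        but fuzzily -- an action can fail, and skill changes the odds."""
        if action in ("strafe", "shoot", "jump") and random.random() < 0.8:
            self.progress += 1
        if self.progress >= 3:
            self.won = True

g1 = Game()
g1.god_mode()          # done instantly; nothing was experienced

g2 = Game()
while not g2.won:      # the point is the loop itself, not the exit condition
    g2.take_action("shoot")
```

Both games end with `won == True`; everything that matters in the second one happened inside the loop.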
Wireheading, even variable multi-emotional wireheading, assumes that emotions are a goal-oriented objective, and thus takes first-order control of one’s emotional state. I contest that emotions are a process-oriented objective. The purpose is to evoke those emotions by using second-order control – taking actions that will lead to those emotions being felt. To eliminate that step and go straight to the credits is to lose the whole point of being human.
III. Removing The Person From The Output
How is the process of playing Doom without cheat codes distinguished from the process of repeatedly pushing a button connected to certain electrodes in your head that produce the emotions associated with playing Doom without cheat codes? (Or just lying there while the computer chooses which electrodes to stimulate on your behalf?)
If it’s just the emotions without the experiences that would cause those emotions, I think that’s a huge difference. That is once again just jumping right to the end-state, rather than experiencing the process that brings it about. It’s first-order control, and that efficiency and directness strips out all the complexity and nuance of a second-order experience.
See Incoming Fireball -> Startled, Fear
Strafe Right -> Anticipation, Dread
Fireball Dodged -> Relief
Return Fire -> Vengeance!!
Is strictly more complicated than just: Startled, Fear -> Anticipation, Dread -> Relief -> Vengeance!!
The key difference is that in the first case, the player is entangled in the process. While these games are designed to produce a specific and very similar experience for everyone (which is why they’re popular with a wide player base), the game takes a pre-existing person and combines them with a series of elements intended to lead to an emotional response. The exact situation is unique(ish) for each person, because the person is a vital input. The output (a person feeling X emotions) is unique and personalized, because the input is different in every case.
When simply conjuring the emotions directly via wire, the individual is removed as an input. The emotions are implanted directly and do not depend on the person. The output (a person feeling X emotions) is identical, and of far less complexity and value. Even if the emotions are hooked up to a random number generator or in some other way made to produce non-identical outputs, the situation is not improved. Because the problem isn’t so much “identical output” as it is that the person was not an input, was not entangled in the process, and therefore doesn’t matter.
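The same contrast, as a hypothetical sketch (all function names and values invented): with a real experience the emotion is a function of both the stimulus and the person, while with wireheading the person drops out of the computation entirely.

```python
# Sketch of the distinction (invented names): person-as-input vs wireheading.

def experience(stimulus: str, person: dict) -> str:
    """The person is entangled in the process: same fireball, different feeling."""
    if stimulus == "incoming_fireball":
        return "dread" if person.get("doom_hours", 0) < 10 else "thrill"
    return "boredom"

def wirehead(_person: dict) -> str:
    """The person is not an input: everyone gets the identical implant."""
    return "thrill"

novice, veteran = {"doom_hours": 2}, {"doom_hours": 500}
assert experience("incoming_fireball", novice) != experience("incoming_fireball", veteran)
assert wirehead(novice) == wirehead(veteran)  # output no longer depends on the person
```

Hooking `wirehead` up to a random number generator would vary its output, but the person still wouldn’t appear anywhere in the function’s signature – which is the objection.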
I actually don’t have much of a problem with simulated realities. Already, a large percentage of the emotions felt by middle-class people in the first world are due to simulated realities. We induce feelings via music, television/movies, video games, novels, and other art. I think this has had some positive effects on society – it’s nice when people can get their Thrill needs met without actually risking their lives and/or committing crimes. In fact, the sorts of people who still try to get all their emotional needs met in the real world tend to be destructive and dramatic – I’m sure everyone knows at least one person like that, and tries to avoid them.
I think a complete retreat to isolation would be sad, because other human minds are the most complex things that exist, and to cut that out of one’s life entirely would be an impoverishment. But a community of people interacting in a cyberworld, with access to physical reality? Shit, that sounds amazing!
Of course a “Total Recall” style system has the potential to become nightmarish. Right now when someone watches a movie, they bring their whole life with them. The movie is interpreted in light of one’s life experience. Every viewer has a different experience (some people have radically different experiences, as my SO and I recently discovered when we watched Birdman together – in fact, comparing those differences is the most fun part of my bi-weekly book club meetings; it’s kinda the whole point). The person is an input in the process, and they’re mashed up into the product. If your proposed system would simply impose a memory or an experience onto someone else wholesale*, without them being involved in the process, then it would be just as bad as the “series of emotions” process.
I have a vision of billions of people spending all of eternity simply reliving the most intense emotional experiences ever recorded, in perfect carbon copy, over and over again, and I shudder in horror. That’s not even being a person anymore. That’s overwriting your own existence with the recorded existence of someone(s) else. :(
But a good piece of art, that respects the person-as-input, and uses the artwork to cause them to create/feel more of their own emotions? That seems like a good thing.
(*this was adapted from a series of posts on my blog)