What is the reason you don't want him to swear? Maybe you could tell him that.
A few thoughts:
It sounds like he's being rebellious. Separate the rebelliousness from the question of profanity, and discuss them separately. You might say something like "Asking questions because you genuinely want to understand something better is great, but asking questions to frustrate or annoy me is not. I'm getting the sense that you're doing the latter."
If he persists, put your foot down -- but be really clear that it's because of his intent to annoy you, not because he's asking questions in an honest attempt to understand something.
It may also help to ask him, openly and gently, why he's being rebellious. Sometimes rebelliousness comes from a perception that the rules are arbitrary or unfair. If you understand what he's feeling, you'll be in a better position to address those underlying causes. For example, per Larks's suggestion, sharing the reasons you don't want him to swear may help. It might also help to explain to him that this is a subjective issue, highly dependent on things like tone and social context, and that perfectly clear rules are unfortunately impossible.
I'm not saying it's inevitable, but it's a failure of imagination if you can't think of any way the future could go horribly wrong like that.
My biggest concern is an AI or civilization that decides to create a real hell to punish people for their sins. Humans have pretty strong urges to punish those who did wrong, and our morality and our views on punishment are constantly changing.
E.g. if a slaveholder were alive today, some people might want to see them tortured. In the future, perhaps people will want to punish, hypothetically, meat eaters, or people who weren't as altruistic as possible, or something we can't even conceive of.
Yeah, there are plenty of examples of dictators who go to great lengths to inflict tremendous amounts of pain on many people. It's terrifying to think of someone like that in control of an AGI.
Granted, people like that are probably less likely than the average head of state to find themselves in control of an AGI, since brutal dictators often have unhealthy economies and are therefore unlikely to win an AGI race. But it's not like they have a monopoly on revenge or psychopathy either.
1) Do you think that an opt-out clause is a useful-in-principle way to address your concerns?
Yes. In principle, more precise instructions should let you better achieve your intended outcomes; that seems fairly obvious.
In practice, I see two problems:
1) The agency might misinterpret your instructions.
2) Ambiguous instructions could facilitate corruption, in the same way that ambiguous laws do.
I'm not sure whether the upsides (more precise instructions allow you to better achieve your outcome) outweigh the downsides (1 and 2). My intuition is that I'm ~65% sure that the upsides outweigh the downsides, but that doesn't reflect much thought.
Practical problem #3: The agency successfully understands your intentions and is willing to implement them, but is unable to do so.
For example, a fast intelligence explosion could remove their ability to pull the plug before they manage to do so. Or a change in their legal environment could make pulling the plug illegal (and they aren't willing to put themselves at legal risk to do it).
Over the past few years, a few people have said they reject cryonics out of concern that they might be revived into a world they prefer less than being dead or not existing.
Uh, plenty of people are born into worse-than-death situations already, at least by our standards, yet they generally make a go of their lives instead of committing suicide. We call many of them our "ancestors."
I get a chuckle out of all the contrived excuses people come up with for not having their brains preserved. I really have to laugh at "But I won't know anyone in Future World!" We go through our lives meeting people every day whom we've never met before, and humans have good heuristics for deciding which strangers we should get to know better and add to our social circles. I had that experience the other day when I met a married couple involved in anti-aging research, and I got the sense that they felt that way about me, despite my social inadequacies in some areas.
As for revival in a sucky Future World, well, John Milton put it pretty well:
"The mind is its own place, and in it self/ Can make a Heav'n of Hell, a Hell of Heav'n "
Besides, if you have radical life extension and some freedom of action, you'll have the time and resources to find situations more to your liking. For example, suppose you wake up in Neoreactionary Future World, and you long for the Enlightenment sort of world you remembered in the 21st Century. Well, find your place in the current hierarchy and wait a few centuries. The Enlightenment might come around for a second go.
Uh, plenty of people are born into worse-than-death situations already, at least by our standards, yet they generally make a go of their lives instead of committing suicide. We call many of them our "ancestors."
Can you elaborate? Your statement seems self-contradictory. By definition, situations "worse than death" would be the ones in which people prefer to kill themselves rather than continue living.
In the context of the original post, I take "worse-than-death" to mean (1) enough misery that a typical person would rather not continue living, and (2) an inability to commit suicide. While I agree many of our ancestors have had a rough time, relatively few of them have had it that hard.
Easy, if you are worried about worse-than-death life after revival, don't get preserved. It's not like there are too few people in the world and no way to create more. I'll take my chances, if I can. I don't expect it to be a problem to self-terminate later, should I want to. I don't put any stock in the scary scenarios where an evil Omega tortures a gazillion of my revived clones for eternity.
I don't put any stock in the scary scenarios where an evil Omega tortures a gazillion of my revived clones for eternity.
Could you elaborate on this? I'd be curious to hear your reasoning.
Does "don't put any stock" mean P(x) = 0? 0.01? 1e-10?
Telling someone who tries to wage status conflicts with you that you want him to stop fighting for more status is pretty pointless.
I'm not sure "status conflict" is the only possibility here; for example, the terminal value might be something like autonomy, or feeling genuinely listened to.