I can conceive of the following three main types of meaning we can pursue in life:
1. Exploring existing complexity: the natural complexity of the universe, or complexities that others created for us to explore.
2. Creating new complexity for others and ourselves to explore.
3. Hedonic pleasure: more or less direct stimulation of our pleasure centers, with wire-heading as the ultimate form.
What I'm observing in the various FAI debates is a tendency for people to shy away from wire-heading as something the FAI should do. This reluctance is generally not substantiated or clarified with anything other than "clearly, this isn't what we want". That, however, is not at all clear to me.
The utility we get from exploration and creation comes from the enjoyable mental processes that accompany these activities. Once an FAI can rewire our brains at will, we no longer need to perform actual exploration or creation to experience that enjoyment. Instead, the enjoyment we get from exploration and creation becomes just another form of pleasure that can be stimulated directly.
If you are a utilitarian, and you believe in shut-up-and-multiply, then the correct thing for the FAI to do is to use up all available resources so as to maximize the number of beings, and then induce a state of permanent and ultimate enjoyment in every one of them. This enjoyment could be of any type - it could be explorative or creative or hedonic enjoyment as we know it. The most energy efficient way to create any kind of enjoyment, however, is to stimulate the brain-equivalent directly. Therefore, the greatest utility will be achieved by wire-heading. Everything else falls short of that.
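To make the shut-up-and-multiply arithmetic explicit (a rough formalization in my own notation, nothing more than the argument above restated): let R be the total resource budget, c the per-being cost of producing enjoyment by a given method, and u the enjoyment each being experiences. Total utility is then

$$U = N \cdot u = \frac{R}{c} \cdot u$$

Direct stimulation minimizes c (and arguably maximizes u as well), so under this accounting wire-heading maximizes U; any richer environment raises c, lowers N, and therefore lowers U.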
What I don't quite understand is why everyone thinks that this would be such a horrible outcome. As far as I can tell, these seem to be cached emotions that are suitable for our world, but not for the world of FAI. In our world, we truly do need to constantly explore and create, or else we will suffer the consequences of not mastering our environment. In a world where FAI exists, there is no longer a point, nor even a possibility, of mastering our environment. The FAI masters our environment for us, and there is no longer a reason to avoid hedonic pleasure. It is no longer a trap.
Since the FAI can sustain us in safety until the universe goes poof, there is no reason for anyone not to experience ultimate enjoyment in the meanwhile. In fact, I can hardly tell this apart from the concept of a Christian Heaven, which appears to be a place where Christians very much want to go.
If you don't want to be "reduced" to an eternal state of bliss, that's tough luck. The alternative would be for the FAI to create an environment for you to play in, consuming precious resources that could sustain more creatures in a permanently blissful state. But don't worry; you won't need to feel bad for long. The FAI can simply modify your preferences so you want an eternally blissful state.
Welcome to Heaven.
I may be answering an unasked question, since I haven't been following this conversation, but the following solution to the issue of clones occurs to me:
Leave it up to the clone.
Make suicide fully legal and easily available (possibly only 'suicide of any copy of a person in cases where more than one copy exists', though that could allow twins greater leeway depending on how you define 'person'; perhaps also add a time limit: the split must have occurred within the last N years). When a clone is created, it is automatically given the rights to half of the original's wealth. If the clone suicides, the original 'inherits' that wealth back. If the clone decides not to suicide, it keeps the wealth it has the rights to.
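To make the mechanics concrete, here is a minimal sketch of the rule; the names (Person, create_clone, clone_suicides) and the exact bookkeeping are just one reading of the proposal, not a spec:

```python
# Minimal sketch of the wealth-split rule described above. The class and
# function names are made up for illustration; only the 50/50 split and the
# revert-on-suicide rule come from the proposal itself.
from dataclasses import dataclass


@dataclass
class Person:
    name: str
    wealth: float
    original: "Person | None" = None  # None means this copy is the original
    alive: bool = True


def create_clone(original: Person, clone_name: str) -> Person:
    """Creating a clone immediately transfers rights to half the original's wealth."""
    share = original.wealth / 2
    original.wealth -= share
    return Person(name=clone_name, wealth=share, original=original)


def clone_suicides(clone: Person) -> None:
    """If the clone opts out, the original 'inherits' the clone's share back."""
    clone.alive = False
    if clone.original is not None:
        clone.original.wealth += clone.wealth
        clone.wealth = 0.0


# Example: create a copy, then have it opt out, restoring the original's wealth.
alice = Person("Alice", 100.0)
alice_2 = create_clone(alice, "Alice-2")
assert alice.wealth == 50.0 and alice_2.wealth == 50.0

clone_suicides(alice_2)
assert alice.wealth == 100.0 and not alice_2.alive
```

A real implementation would also need a registry of copies and the N-year window from the parenthetical above; this sketch ignores both.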
Given that a clone is functionally the same person as the original, this should be an ethical solution (assuming that you consider suicide ethical at all) - someone would have to be very sure that they'd be able to go through with suicide, or very comfortable with the idea of splitting their wealth in half, in order to be willing to take the risk of creating a clone. The only problem that I see is with unsplittable things like careers and relationships. (Flip a coin? Let the other people involved decide?)
This seems like a good solution. If I cloned myself, I'd want it to be established beforehand which copy would stay around and which would go away. For instance, if you're going to make a copy to watch a movie and see whether it's worth your time, the copy that watches the movie should be the one that goes away, because if the movie is good, the surviving version of you will watch it anyway.
I (and thus my clones) don't see it as suicide, more like a...