I can conceive of three main types of meaning we can pursue in life:
1. Exploring existing complexity: the natural complexity of the universe, or complexities that others created for us to explore.
2. Creating new complexity for others and ourselves to explore.
3. Hedonic pleasure: more or less direct stimulation of our pleasure centers, with wire-heading as the ultimate form.
What I'm observing in the various FAI debates is a tendency for people to shy away from wire-heading as something the FAI should do. This reluctance is generally not substantiated or clarified with anything beyond "clearly, this isn't what we want". It is not, however, clear to me at all.
The utility we get from exploration and creation lies in the enjoyable mental processes that accompany these activities. Once an FAI can rewire our brains at will, we no longer need to perform actual exploration or creation to experience that enjoyment. Instead, the enjoyment we get from exploration and creation becomes just another form of pleasure that can be stimulated directly.
If you are a utilitarian, and you believe in shut-up-and-multiply, then the correct thing for the FAI to do is to use up all available resources to maximize the number of beings, and then induce a state of permanent and ultimate enjoyment in every one of them. This enjoyment could be of any type: explorative, creative, or hedonic enjoyment as we know it. The most energy-efficient way to create any kind of enjoyment, however, is to stimulate the brain-equivalent directly. Therefore, the greatest utility will be achieved by wire-heading. Everything else falls short of that.
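To make the shut-up-and-multiply arithmetic explicit (a minimal sketch; the symbols E, c, and u are my own illustration, not anything pinned down in the FAI debates): with a fixed resource budget E and a per-being cost c(u) of sustaining enjoyment level u, total utility is

$$U = N \cdot u = \frac{E}{c(u)} \cdot u = E \cdot \frac{u}{c(u)},$$

so the FAI maximizes U by choosing whichever method of producing enjoyment yields the highest enjoyment-per-resource ratio u/c(u). If direct stimulation dominates that ratio, wire-heading wins on sheer multiplication, whatever our aesthetic objections.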
What I don't quite understand is why everyone thinks that this would be such a horrible outcome. As far as I can tell, these seem to be cached emotions that are suitable for our world, but not for the world of FAI. In our world, we truly do need to constantly explore and create, or else we will suffer the consequences of not mastering our environment. In a world where FAI exists, there is no longer any point in, or even any possibility of, mastering our environment. The FAI masters our environment for us, and there is no longer a reason to avoid hedonic pleasure. It is no longer a trap.
Since the FAI can sustain us in safety until the universe goes poof, there is no reason for anyone not to experience ultimate enjoyment in the meantime. In fact, I can hardly tell this apart from the concept of a Christian Heaven, which appears to be a place Christians very much want to go.
If you don't want to be "reduced" to an eternal state of bliss, that's tough luck. The alternative would be for the FAI to create an environment for you to play in, consuming precious resources that could sustain more creatures in a permanently blissful state. But don't worry; you won't need to feel bad for long. The FAI can simply modify your preferences so you want an eternally blissful state.
Welcome to Heaven.
There isn't an "original". After the copying, there's Copy A and Copy B. Both are me. I'm fine with randomly selecting whether Copy A or Copy B goes to see the movie, but it doesn't matter, since they're identical (until one sees the movie). In fact, there is no way to avoid randomly selecting which copy sees the movie.
From the point of view of the clone who sees the movie (say it's bad), "suiciding" is the same as going back in time and not seeing the movie. So I'd always stick to a prior agreement in a case like that.
I don't really have any wealth to speak of. But they're all me. If I won't defect, then they won't. The question is just whether we might disagree on what's best for me. In which case, we can either go by prior agreement, or just let them all live. If the other mes really wanted to live, I'd let them. For instance, say I made five copies and all five of us went out to try different approaches to a career, agreeing that the best one would survive. If a year later more than one claimed to have the best result for Blueberry, I might as well let more than one live.
ETA: However, there might be situations where I can only have one copy survive. For instance, I'm in a grad program now that I'd like to finish, and only one of me can be enrolled for administrative reasons. So if I really need only one of me, I guess we could decide randomly which one would survive. I'm all right with forcing a copy to suicide if he changes his mind, since I'm making that decision for all the clones ahead of time to lead to the best outcome for Blueberry.
Response to ETA:
If one of the clones developed enough individuality to change his mind and disagree with the others, I definitely don't see how you could consider that one anything other than an individual.
Likewise, if all of the clones decided to change their minds and go their separate ways, that would be functionally the same as you-as-a-single-person-with-a-single-body changing your mind about something, and the general rule there is that humans are allowed to do that without being interfered with. I don't see any reason to change that rule.