It's not clear to me what you mean by value. To say that something has value is to say that it is more valuable than other things. This is why, at the end of your progression, valuing everything becomes equivalent to valuing nothing.
Yes, that's another attractor, to my mind. Stuart 7 doesn't value everything, though; he values objects/beings, and dislikes the destruction of these. That's why he still has preferences.
But the example was purely illustrative of the general idea.
I'm still not clear what constitutes an object/being and what does not. Is a proton an object?
Fundamentally, I think you're having an understandable difficulty applying a binary classification system (value/not value) to a real continuous system. The continuity of value, where things are valuable based on their degree of sentience, or degree of life, which I outlined above, resolves this to some extent.
I still don't see how this is fundamentally about altruism. Altruism, loosely defined, is a value system that does not privilege the self over similar beings,...
My main objection to Coherent Extrapolated Volition (CEV) is the "Extrapolated" part. I don't see any reason to trust the extrapolated volition of humanity - but this isn't just for self centred reasons. I don't see any reason to trust my own extrapolated volition. I think it's perfectly possible that my extrapolated volition would follow some scenario like this:
There are many other ways this could go, maybe ending up as a negative utilitarian or completely indifferent, but that's enough to give the flavour. You might trust the person you want to be, to do the right things. But you can't trust them to want to be the right person - especially several levels in (compare with the argument in this post, and my very old chaining god idea). I'm not claiming that such a value drift is inevitable, just that it's possible - and so I'd want my initial values to dominate when there is a large conflict.
Nor do I give Armstrong 7's values any credit for having originated from mine. Under torture, I'm pretty sure I could be made to accept any system of values whatsoever; there are other ways that would provably alter my values, so I don't see any reason to privilege Armstrong 7's values in this respect.
"But," says the objecting strawman, "this is completely different! Armstrong 7's values are the ones that you would reach by following the path you would want to follow anyway! That's where you would get to, if you started out wanting to be more altruistic, had control over you own motivational structure, and grew and learnt and knew more!"
"Thanks for pointing that out," I respond, "now that I know where that ends up, I must make sure to change the path I would want to follow! I'm not sure whether I shouldn't be more altruistic, or avoid touching my motivational structure, or not want to grow or learn or know more. Those all sound pretty good, but if they end up at Armstrong 7, something's going to have to give."