In this context, what does "more altruistic" (as in 1) mean? Does it mean that you want to change your beliefs about what it is right to do, or that, given your current beliefs, you want to give more to charity (but, for whatever reason, find it difficult to do so)? If it's the former, it seems contradictory - it amounts to saying "it's right for me to be more altruistic than I currently am, but I don't believe it". If it's the latter, the transition between 1 and 2 wouldn't happen, because your belief about the optimal level of altruism either wouldn't change (if you are currently correct about what the optimal amount of altruism is for you) or would change in a way that is appropriate given new information (maybe giving more to charity is easier once you get started). I can see your estimate of your optimal level of altruism changing based on new information, but I don't see how that would lead to a transition like the one between 1 and 2. Even if charity is very easy and very enjoyable, it doesn't follow that you should value all humans equally.
My main objection to Coherent Extrapolated Volition (CEV) is the "Extrapolated" part. I don't see any reason to trust the extrapolated volition of humanity - and not just for self-centred reasons: I don't see any reason to trust my own extrapolated volition either. I think it's perfectly possible that my extrapolated volition would follow some scenario like this:
There are many other ways this could go - maybe ending up as a negative utilitarian, or completely indifferent - but that's enough to give the flavour. You might trust the person you want to be to do the right things, but you can't trust them to want to be the right person - especially several levels in (compare with the argument in this post, and my very old chaining god idea). I'm not claiming that such a value drift is inevitable, just that it's possible - and so I'd want my initial values to dominate when there is a large conflict.
Nor do I give Armstrong 7's values any credit for having originated from mine. Under torture, I'm pretty sure I could be made to accept any system of values whatsoever, and there are plenty of other processes that would just as surely alter my values; so I don't see any reason to privilege Armstrong 7's values simply because of where they came from.
"But," says the objecting strawman, "this is completely different! Armstrong 7's values are the ones that you would reach by following the path you would want to follow anyway! That's where you would get to, if you started out wanting to be more altruistic, had control over you own motivational structure, and grew and learnt and knew more!"
"Thanks for pointing that out," I respond, "now that I know where that ends up, I must make sure to change the path I would want to follow! I'm not sure whether I shouldn't be more altruistic, or avoid touching my motivational structure, or not want to grow or learn or know more. Those all sound pretty good, but if they end up at Armstrong 7, something's going to have to give."