Yes, I'm not claiming anything new here.
There would be selection pressures on ems as well; in fact, they would be stronger than for present-day people. Someone would need to create the ems, and they would probably prefer ems with the psychological traits required to be efficient workers.
Future people will probably experience a somewhat balanced mix of good and bad feelings, just as we do. If they were either always happy or always unhappy, they would probably be less effective at working, surviving or reproducing.
If conditions in the future are such that modern humans would be very unhappy (or very happy), we will change to become more so, or less so.
I am referring to a copy that contains exactly the same information as the current "me".
This doesn't seem particularly useful to me. Even if the written copy could be identical to me in every way, I would place a much lower value on the creation of such a copy than on the extension of my current life. You're right that this might be slightly preferable to death, but I certainly wouldn't position it as a real alternative even to cryonics.
That's a really good analysis of the problems with MORSucks. Unfortunately, people who only slightly dislike a work, or acknowledge that it has some flaws but enjoy it anyway, seldom start blogs devoted to deconstructing it. In general, you have to choose between overwhelming praise and overwhelming hate.
I strongly disagree that Will McAskill or EA generally are responsible for "misaligning" Sam Bankman-Fried. I don't see much evidence that either effective altruism or "Effective Altruism" did much to cause his life to play out as it did.
Sam is, as you can see, someone who is able and willing to lie and mislead others. You should approach his comments regarding his motivations with the same skepticism you apply to the other things he says.