Like any educated denizen of the 21st century, you may have heard of World War II. You may remember that Hitler and the Nazis planned to carry forward a romanticized process of evolution, to breed a new master race, supermen, stronger and smarter than anything that had existed before.
Actually this is a common misconception. Hitler believed that the Aryan superman had previously existed—the Nordic stereotype, the blond blue-eyed beast of prey—but had been polluted by mingling with impure races. There had been a racial Fall from Grace.
It says something about the degree to which the concept of progress permeates Western civilization, that one is told about Nazi eugenics and hears "They tried to breed a superhuman." You, dear reader—if you failed hard enough to endorse coercive eugenics, you would try to create a superhuman. Because you locate your ideals in your future, not in your past. Because you are creative. The thought of breeding back to some Nordic archetype from a thousand years earlier would not even occur to you as a possibility—what, just the Vikings? That's all? If you were going to fail hard enough to kill for it, you would damn well try to reach heights never before reached, or what a waste it would all be, eh? Well, that's one reason you're not a Nazi, dear reader.
It says something about how difficult it is for the relatively healthy to envision themselves in the shoes of the relatively sick, that we are told of the Nazis, and distort the tale to make them defective transhumanists.
It's the Communists who were the defective transhumanists. "New Soviet Man" and all that. The Nazis were quite definitely the bioconservatives of the tale.
Agreed that apparent contradictions are often interesting areas of inquiry.
The only ways I know of to sidestep having to decide which norms best align with my values are to adopt values such that either no community's norms are superior to any other's, or such that whatever norms happen to emerge victorious from the interaction of social groups are superior to all the norms they displace. Neither of those tempts me at all, though I know people who endorse both.
If I reject both of those options, I'm left with the possibility that two communities C1 and C2 might exist such that C1's norms are superior to C2's, but the interaction of C1 and C2 results in C1's norms being displaced by C2's.
I don't see a fourth option. Do you?
For example... you say I have a moral responsibility to seek truth, which suggests that if I'm in a community whose values oppose truthseeking in certain areas, I have a moral responsibility to violate my community's norms. No?
This has interesting parallels to the Friendly AI problem. For example, one could posit that material wealth might somehow be a suitable arbiter, but I can imagine plenty of situations where C2 displaces C1 (corporate lobbying?) and global ecological catastrophe follows. Here, dollars take the place of smiley faces strewn across the solar system. Maybe the problem of a sustainably benevolent truth-seeking group is somehow the same problem as FAI on some level?