There is something I am missing here.
Get rid of enough constraints, and you’ll get the equivalent of a Spiegelman’s monster, no longer even remotely human.
And this is bad how?
Human value is definitely the something to protect, and business as usual will destroy us.
What do you mean by "destroy us"? Change 21st-century human animals into something better adapted to survive in the new universe?
EDIT: I guess I should articulate my confusion better: what's wrong with gradually becoming an Egan's jewelhead (sounds like an equivalent of uploading to me) or growing an earring-based prosthetic neocortex?
I don't think those outcomes would be particularly bad: they still keep most constraints in place. But if all that remained of humanity were replicators that only cared about making more copies of themselves, and might not even be conscious, that sounds much worse.
Related to: Kaj Sotala's Posts, Blogs by LWers
By fellow LessWronger Kaj_Sotala on his blog.