Clay_H.

While we're on the subject of the Golden Ecumene: I've only read the first book, but the major problem I had with it was the egregious violation of your Rule 31 in its twist ending (which frankly seemed tacked on, as if the publisher had asked, "Do you think you could turn this book into a trilogy somehow?").

Clay_H.

Your deontological ethics are tiresome. Why not just be a utilitarian and lie your way to a better tomorrow?

Put more seriously: I would think that being believed to put the welfare of humanity ahead of concerns about personal integrity could itself have significant advantages.

Or, put another way: when it's time to shut up and do the impossible (save humanity, say), that doesn't seem like a good time to defer to pre-established theories, of ethics or anything else. Refer, yes; defer, no. You say to beware of cleverness, to be wary of thinking you're smarter than your ethics (meaning your deontological beliefs and intuitions). That discussion sounded like a Hofstadter's Law for ethics ("It always takes longer than you expect, even when you take Hofstadter's Law into account").

Yet when the chips are down, when you've debugged your hardware as best you can, isn't our cleverness what we have left to trust? What else could there be? After all, as you yourself said, rationality is ultimately about winning, and however much you hedge against black swans and corrupt hardware, the hedging can't be infinite. There must come a point where you stop and do what your brain computes is the right thing to do.

If my ethics don't tell me to save the world, I have no ethics at all.