jacob_cannell comments on David Chalmers' "The Singularity: A Philosophical Analysis" - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (202)
Are there any LW-rationalist-vetted philosophical papers on this theme in modern times? I'm somewhat skeptical of the claim that there is no universal morality (relative to some generalized Occamian prior-like thing) that even a paperclip maximizer would converge to, if it were given the right decision-theoretic (not necessarily moral per se) tools for philosophical reasoning. Since that convergence is by no means guaranteed, we should of course still be careful when designing AGIs.
Any universal morality has to have long-term fitness; i.e., it must somehow win at the end of time.
Otherwise, aliens may have a more universal morality.
EDIT: why the downvote?
This does not require as much optimization as it sounds. As Wei Dai points out, computing power is proportional to the square of the amount of mass obtained, as long as that mass can be physically collected together, so a civilization that collects mass probably gets more observers than one that spreads out and colonizes, depending on the specifics of cosmology. This kind of civilization is also much easier to control centrally, so a wide range of values has the potential to dominate, depending on which ones happen to come into being.
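As a quick sanity check of that scaling claim (assuming the relevant bound is black-hole, i.e. Bekenstein-Hawking, entropy, which goes as horizon area and hence as mass squared), merging two equal masses into one hole doubles the entropy per unit mass. The constants and formulas below are standard physics; the specific masses are just illustrative:

```python
import math

# Physical constants (SI units)
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J s
k_B = 1.381e-23    # Boltzmann constant, J/K

def bh_entropy(mass_kg):
    """Bekenstein-Hawking entropy S = k_B * A * c^3 / (4 * G * hbar),
    with horizon area A = 16 * pi * G^2 * M^2 / c^4, so S scales as M^2."""
    area = 16 * math.pi * (G * mass_kg) ** 2 / c ** 4
    return k_B * area * c ** 3 / (4 * G * hbar)

M = 1.0e30  # illustrative mass, roughly half a solar mass

# Two separate holes hold 2 * S(M); one merged hole holds S(2M) = 4 * S(M).
ratio = bh_entropy(2 * M) / (2 * bh_entropy(M))
print(ratio)  # → 2.0: collecting the mass together doubles the entropy budget
```

This is why physically collecting the mass together matters in the argument: kept separate, the entropy budget grows only linearly in mass.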
I'm not sure where he got the math that available energy is proportional to the square of the mass. Wouldn't this come from the mass-energy equivalence, E = mc^2, and thus be linear in mass?
Wei Dai's conjecture about black holes being useful as improved entropy dumps is interesting. Black holes or similarly dense entities also maximize speed potential and interconnect efficiency, but they are poor for information storage.
It's also possible that by the time a civilization reaches this point of development, it figures out how to do something more interesting such as create new physical universes. John Smart has some interesting speculation on that and how singularity civilizations may eventually compete/cooperate.
I still have issues wrapping my head around the time dilation.
Energy is proportional to mass. Computing ability is proportional to (max entropy - current entropy), and max entropy is proportional to the square of mass. That was the whole point of his argument.
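For reference, here is where the quadratic scaling comes from, assuming the maximum entropy of a region of mass M is set by the black hole of that mass (Bekenstein-Hawking entropy, proportional to horizon area):

```latex
S_{\mathrm{BH}} \;=\; \frac{k_B\, c^3 A}{4 G \hbar},
\qquad
A \;=\; \frac{16\pi\, G^2 M^2}{c^4}
\;\;\Longrightarrow\;\;
S_{\mathrm{BH}} \;=\; \frac{4\pi\, k_B\, G}{\hbar c}\, M^2 \;\propto\; M^2 .
```

So the computation budget (max entropy minus current entropy) grows like M^2, while the available energy E = Mc^2 grows only linearly in M, which is why the two quantities scale differently.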