Hollerith, you are now officially as weird as a Yudkowskian alien. If I ever write this species I'll name it after you.
Eliezer, to which of the following possibilities would you accord significant probability mass? (1) Richard Hollerith would change his stated preferences if he knew more and thought faster, for all reasonable meanings of "knew more and thought faster"; (2) There's a reasonable notion of extrapolation under which all normal humans would agree with a goal in the vicinity of Richard Hollerith's stated goal; (3) There exist relat...
Phil, your analysis depends a lot on what the probabilities are without Eliezer.
If Eliezer vanished, what probabilities would you assign to: (A) someone creating a singularity that removes most/all value from this part of the universe; (B) someone creating a positive singularity; (C) something else (e.g., humanity staying around indefinitely without a technological singularity)? Why?
Unknown, how certain are you that you would retain that preference if you "knew more, thought faster"? How certain are you that Eliezer would retain the opposite preference, and that we are looking at real divergence? I have little faith in my initial impressions concerning Babyeaters vs. black holes; it's hard for me to grasp the Babyeaters' suffering, or the richness of their lives compared with that of black holes, as more than a statistic.
Eliezer, regarding (2), it seems plausible to me (I'd assign perhaps 10% probability mass) that if there is a well...