Complex challenges? Novelty? Individualism? Self-awareness? Experienced happiness? A paperclip maximizer cares not about these things.
But advanced evolved organisms probably will.
The paperclip maximizer is a straw man: it is relevant only if some well-meaning person tries to replace evolution with their own optimization or control system. (It may also be relevant in the case of a singleton, but that would be non-trivial to demonstrate.)
Or perhaps someone else will at least explain what "having more shaping influence than a simple binary filter on utility functions" means. It sounds like it's supposed to mean that all evolution can do is eliminate some utility functions. If that's what it means, I don't see how it's relevant.
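To make the two readings concrete, here is a minimal toy sketch of how I understand the distinction. This is purely illustrative, not anyone's actual formalism: utility functions are modeled as simple weight vectors, and the `viable` criterion is hypothetical.

```python
import random

# A toy "utility function" is just a weight vector over traits,
# e.g. [novelty_weight, sociality_weight, ...].
UtilityFn = list

def viable(u: UtilityFn) -> bool:
    """Hypothetical survival criterion: a utility function either
    passes selection or it doesn't -- a yes/no judgment."""
    return sum(u) > 0

def binary_filter(population: list[UtilityFn]) -> list[UtilityFn]:
    # The "binary filter" reading: evolution only eliminates some
    # utility functions; the survivors are never modified.
    return [u for u in population if viable(u)]

def shaping_influence(population: list[UtilityFn]) -> list[UtilityFn]:
    # The "shaping influence" reading: selection plus variation
    # gradually moves utility functions through the space,
    # rather than merely culling them.
    survivors = [u for u in population if viable(u)]
    return [[w + random.gauss(0, 0.1) for w in u] for u in survivors]
```

On that reading, the filter version leaves surviving utility functions untouched, while the shaping version actually moves them over time, which seems to be exactly the distinction the quoted phrase turns on. If that is not what was meant, I would welcome a correction.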