True, but it would have to be an exceedingly black swan to result in evolutionary-like mutations rather than simple annihilation.
First, annihilation is good enough -- a destroyed nanobot fails at making "error-free clones of itself until the heat death of the universe".
Second, all you need to do is screw up the error-correction mechanism; the rest will take care of itself naturally.