You're definitely missing the point of the whole thing. Suppose that the optimal design for gaining knowledge is something like this: a vast supercomputer without the slightest bit of awareness or emotion.
I think it is very unlikely. Even in the worst-case scenarios, I can't imagine that a superintelligence wouldn't inherit some sort of values.
I don't see the problem with that being the eventual outcome. Death of the state of the world as we know it, yes; but also the existence of a new entity. That's the way the cookie crumbles.
If it's worth saying, but not worth its own post (even in Discussion), then it goes here.
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
3. Open Threads should be posted in Discussion, and not Main.
4. Open Threads should start on Monday, and end on Sunday.