Eliezer Yudkowsky, along with (I’m guessing) a lot of people in the rationalist community, seems to be essentially a Humean about morality (more specifically, a Humean consequentialist). Now, Humean views of morality are highly compatible with a very broad statement of the Orthogonality Thesis, one applied to all rational entities.
Humean views about morality, though, are somewhat controversial. Plenty of people think we can rationally derive moral laws (Kantians); plenty of people think there are certain objective ends to human life (virtue ethicists).
This is basically just a more explicitly AGI-related version of the Fermi Paradox:
1. If AGI is created, it is obviously very unlikely that we are the first in the universe to create it, and it is likely that it was already created a long time ago.
2. If AGI is created, aligned or unaligned, there seems to be consensus that some kind of ongoing, widespread galactic conquest/control would end up constituting an instrumental goal of the AGI.
3. If AGI is created, there seems to be consensus that its capabilities would be so great as to enable widespread galactic conquest/control. And yet we observe no sign of any such conquest or control anywhere.
I’m going to need an entire separate book, one which would be mostly just John Ray quotes.