After reading lots of debates on these topics, I'm no longer sure what the terms mean. Is a paperclip maximizer a "moral nihilist"? If yes, then so am I; if no, then neither am I.
A paperclip maximizer is something that, as you know, requires an incredible amount of work put into defining what a paperclip even is (if that is possible at all without fixing a model). It consequently has an incredibly complex moral system, a very stupid one, but incredibly complex nonetheless. Try something like an equation-solver instead.
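The contrast above can be sketched in a few lines of Python. This is only a toy illustration; the choice of bisection, the example equation, and the tolerance are my own arbitrary picks, not anything from the original discussion:

```python
# An "equation-solver" agent's entire goal can be stated exactly in a
# few lines, with no world-model needed -- unlike "is this a paperclip?",
# which has no comparably short, model-free definition.

def solved(f, x, tol=1e-9):
    """The agent's entire 'value system': is f(x) = 0 within tolerance?"""
    return abs(f(x)) < tol

def bisect(f, lo, hi, tol=1e-9):
    """The agent pursues its goal by simple bisection on a bracketed root."""
    assert f(lo) * f(hi) < 0, "root must be bracketed"
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Example: the agent "wants" a root of x^2 - 2 in [0, 2].
root = bisect(lambda x: x * x - 2, 0.0, 2.0)
```

The point is the asymmetry in goal specification: the solver's terminal value fits in one `return` statement, while a paperclip predicate would have to classify arbitrary physical configurations.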
Do you believe in an objective morality capable of being scientifically investigated (a la Sam Harris *or others*), or are you a moral nihilist/relativist? There seems to be some division on this point. I would have thought Less Wrong to be well in the former camp.
Edit: There seems to be some confusion. When I say "an objective morality capable of being scientifically investigated (a la Sam Harris *or others*)", I do NOT mean something like a "one true, universal, metaphysical morality for all mind-designs" such as the Socratic/Platonic Form of the Good or any such nonsense. I just mean something in reality that's mind-independent, in the sense that it is hard-wired, e.g. by evolution, and thus independent of and prior to any later knowledge or cognitive content, and so can be investigated scientifically. It is a definite "is" from which we can make true "ought" statements relative to that "is". See drethelin's comment and my analysis of Clippy.