This article is a deliberate meta-troll. To be successful I need your trolling cooperation. Now hear me out.
In The Strangest Thing An AI Could Tell You, Eliezer talks about anosognosics, who have one of their arms paralyzed and, most interestingly, are in absolute denial of this: in spite of overwhelming evidence that the arm is paralyzed, they keep coming up with new rationalizations proving it's not.
Doesn't that sound like someone else we know? Yes, religious people! In spite of heaps of empirical evidence against the existence of their particular flavour of the supernatural, the internal inconsistency of their beliefs, and perfectly plausible alternative explanations being well known, something between 90% and 98% of humans believe in a supernatural world, and they are in a state of absolute denial not too dissimilar to that of anosognosics. Throughout history, perhaps billions of people have even been willing to die for their absurd beliefs.
We are mostly atheists here - we happen not to share this particular delusion. But please take an outside view for a moment: how likely is it that, unlike almost everyone else, we have no other such delusions, ones whose truth we are in absolute denial of in spite of mounting heaps of evidence?
If a delusion is of the kind that all of us share, we won't be able to find it without building an AI. But we might well have some of the other kind - it's not too unlikely, as we're a small and self-selected group.
What I want you to do is try to trigger the absolute denial macro in your fellow rationalists! Is there anything that you consider proven beyond any possibility of doubt, by both empirical evidence and pure logic, and yet saying it triggers an automatic stream of rationalizations in other people? Yes, I am pretty much asking you to troll, but it's a good kind of trolling, and I cannot think of any other way to find our delusions.
...Thought crime? Really? That's what you get from me saying that it's unethical to think of people as suitable objects of manipulation? Yes, I used the word "think", but the emphasis was really on "suitable". I could have used the phrasing "it's inappropriate to be disposed to manipulate people", or "the opinion that people are suitable targets of manipulation will tend to lead to manipulation, which is wrong" or "the ethically relevant belief that people are suitable targets of manipulation is false", or "to speak of people as suitable objects of manipulation reflects an ethically abhorrent facet of the speaker's personality" - and meant more or less the same thing. Is that clearer?
I'd phrase it a little bit differently, but overall, yeah, I'd accept that position. That is, I basically agree with you here.
Alternatively (probably a bit more general, but, I think, capturing the main relevant offensive bits): "goal systems which do not assign inherent terminal value to persons, but see them only in terms of instrumental value, are immoral goal systems."