I'm really confused about why you're not understanding this. Authorities are reliable to different degrees about different things. If I tell you I'm wearing an orange shirt, that is clearly evidence that I am wearing an orange shirt. If a physicist tells you nothing can accelerate past the speed of light, that is evidence that nothing can accelerate past the speed of light. Now, because people can be untrustworthy, there are many circumstances in which witness testimony is less reliable than personal observation. But it would be rather bothersome to upload a picture of myself in my shirt, and it can be difficult to explain special relativity and the evidence for it in a short time span. In cases like these we must settle for the testimony of authorities. This does not make appeals to authority invalid.
Now, of course, you might have some evidence suggesting I am not a reliable reporter of the color of my shirt: perhaps I have lied to you many times before, or I have incentives to be dishonest. In such cases it is appropriate to discount my testimony to the degree that I am unreliable. But this is not a special problem with appeals to authority. If you have reason to think you are hallucinating (perhaps because of the LSD you took an hour ago), you should appropriately discount your eyes' testimony that the trees are waving at you.
Since appeals to authority, like other sources of information, are not 100% reliable, it makes sense to examine the judgment of authorities in detail. Even if Eliezer is a reliable authority on many things, it is a good idea to scrutinize his reasoning. In this regard you are correct to demand arguments beyond "Eliezer says so". But it is nonetheless not the case that "appeals to authority are always fallacious". On the contrary, modern science would be impossible without them, since no one can possibly make all the observations necessary to support the reliability of a modern scientific theory.
LessWrongers as a group are often accused of talking about rationality without putting it into practice (for an extended discussion, see Self-Improvement or Shiny Distraction: Why Less Wrong is anti-Instrumental Rationality). This behavior is particularly insidious because it is self-reinforcing: it attracts more armchair rationalists to LessWrong, who in turn reinforce the trend in an affective death spiral, until LessWrong becomes a community of utilitarian apologists akin to the internet communities of anorexics who congratulate each other on their weight loss. It will be a community where, instead of discussing practical ways to "overcome bias" (the original intent of the sequences), we discuss arcane decision theories, who gets to be in our CEV, and the most rational birthday presents (sound familiar?).
A recent attempt to counter this trend, or at least make us feel better about it, was a series of discussions on "leveling up": accomplishing a set of practical, well-defined goals to increment your rationalist "level". It is hard to see how these goals fit into a long-term plan to achieve anything besides self-improvement for its own sake. Indeed, the article begins by priming us with a Renaissance-man-inspired quote, and it stands in stark contrast to articles emphasizing practical altruism, such as "efficient charity".
So what's the solution? I don't know. However, I can tell you a few things about the solution, whatever it may be:
Whatever you decide to do, be sure it follows these principles. If none of your plans aligns with these guidelines, then construct a new one, on the spot, immediately. Just do something: every moment you sit idle, hundreds of thousands are dying and billions are suffering. Under your judgment, your plan can self-modify in the future to overcome its flaws. Become an optimization process; shut up and calculate.
I declare Crocker's rules on the writing style of this post.