I notice that expert opinion always tends to get short shrift in these. But for many fields I find myself hoping for a competent expert who has gone through all the evidence and can tell me which of the meta-analyses and RCTs are actually reliable, which hypotheses are basically zombie theories, et cetera.
And by what metric do you separate the competent experts from the non-competent experts? I also prefer listening to experts because they can explain vast amounts of things in "human" terms, tell me how different things interact, and answer my specific questions. It's just that for any single piece of information you'd rather have a meta-analysis backing you up than an expert opinion.
And by what metric do you separate the competent experts from the non-competent experts?
There's no hard-and-fast rule, obviously, just as there's no hard-and-fast rule for figuring out which meta-analyses you can trust (for problems with meta-analyses, see e.g. [1, 2, 3, 4]). But if the experts explicitly discuss the reasons behind their opinions and e.g. why they think that one particular meta-analysis is decent but another one is flawed, you can try to evaluate how reasonable their claims sound.
There have been many hierarchies of evidence made for various fields of science. I was looking for an image of a more general hierarchy that could easily be dropped into any online conversation to quickly improve the debate. I found none that had all the features I was looking for. So I took an old hierarchy, expanded it, made it more aesthetically pleasing, and turned it into a JPEG, a PDF, and a Pages file so people can easily share and modify it (e.g. translate it or convert it to other formats). Here it is: