It's a nice dream, and I would be excited if you could do it, but I don't think it is possible given the reality of the modern sociopolitical situation. What I think you don't appreciate is that, at the end of the day, most people really really don't care about building a better world. They care about promoting their own status, defeating their enemies, and justifying their hatreds.
To justify this claim, I'll cite a few Arthur Chu Facebook quotes on the subject of LessWrong and rationality:
Arthur Chu For the peanut gallery -- what people are deliberately dredging up is that I hate the Less Wrong/"rationalist" community precisely because of its "We are nerdy white guys, here to tell you why you are wrong" culture and its intense defensiveness and insecurity (sorry, "immune system") at being called on being a haven for bullshit, including stuff like "Stop saying nerdy white guy, that is its OWN FORM OF RACISM"!
Arthur Chu Oh, and if you don't even know what Less Wrong is, it's basically a nerdy white guy religion that started out as a bunch of people gathering donations to freeze themselves until a Computer AI Jesus can be built and create utopia. And if you've ever heard of it but don't give all your money to it Computer AI Jesus will rebuild you in the future and put you in Computer AI Hell. It's spun off into a whole bunch of other shit since but it's not a group of people that really has any business lecturing people on who is and isn't "sane".
Arthur Chu is especially willing to voice his prejudices, but my strong suspicion is that most other people think the same way. So even if you set up a completely objective and rational truth-finding web site, it would simply be attacked and destroyed by political actors for being a racist religion or for being run by nerdy white guys or whatever.
I don't feel like digging up the whole sordid backstory (though this would be a good starting point), but I get the impression he's upset that we're not a vector for his politics.
That whole "mindkiller" thing really rubs some people the wrong way; for such a person, politics are so bound up with ideals of rationality that staying away from them looks not just ignorant but willfully and maliciously so. (Compare the "reality-based community" on the left, or Eric Raymond's "anti-idiotarianism" on the right. Not that we're entirely innocent of this sort of thinking ourselves.) Combine that with the absurdity heuristic and our bad habit of parochialism in some areas, and you've got most of the ingredients for a hatchet job.
I don't feel like digging up the whole sordid backstory (though this would be a good starting point), but I get the impression he's upset that we're not a vector for his politics.
More specifically, he's upset that we're willing to tolerate people who point out that many of his ideology's claims are in fact falsifiable and false.
Argumentative cheaters thrive because their arguments aren't properly scrutinized.
This statement does not pass the "fact check".
People have been repeatedly shown to believe what they want to believe for various reasons, including status, affiliation, cognitive dissonance, convenience and many others. They happily overlook and downplay the "fallacies, half-lies, evasions" of the home team while emphasizing those of their opponents/enemies.
The factcheck.org site has hardly made a dent in the misrepresentations, and is rarely mentioned as an impartial fact-checking site (I do not know whether it is one).
A better question to ask is "how do we make people care about accuracy and impartiality?" Eliezer's approach was the Hanson/OB-inspired "raising the sanity waterline", eventually evolving into CFAR, with limited success so far. Maybe there are other options, who knows.
Argumentative cheaters thrive because their arguments aren't properly scrutinized.
This statement does not pass the "fact check".
Well if scrutiny didn't do any good then why do we have peer review in science? This is a sort of peer review (but hopefully more effective than the standard scientific peer review) on a massive scale.
It's just obvious that rational criticism generally does improve argumentative standards.
Well if scrutiny didn't do any good then why do we have peer review in science?
a) scientists are only slightly better than average at caring about accuracy
b) the evidence on whether peer review improves publication quality is contradictory
It's just obvious that rational criticism generally does improve argumentative standards.
Eh... "obvious" is not a good criterion for either impartiality or accuracy.
1) There are different definitions of a fallacy. What I am talking about here are clear cases of argumentative cheating. 2) I do think that factchecking helps, yes. Politicians would have lied much more if they hadn't known they could be caught out in those lies.
What I am talking about here are clear cases of argumentative cheating.
Most people would consider ad hominems cheating if they were pointed out to them.
I do think that factchecking does help, yes.
Based on...?
I do think that factchecking does help, yes.
Based on...?
The media do now and then reveal that politicians have lied on important topics (Watergate, Clinton on Lewinsky, etc.). This a) had negative political consequences for the lying politicians and b) arguably made all other politicians less likely to lie (since these incidents taught them what consequences lying could have), though this latter point is harder to prove.
See also my comment above.
So, your justification for the claim that factchecking improves politics is based on two anecdotes: a scandal from 40 years ago, and another scandal from 20 years ago which to many epitomizes the irrational and tribal nature of politics, in which partisan hacks look for any excuse to attack an enemy no matter how trivial or unrelated to the job of governing the country it is?
Obviously not. Please apply the principle of charity. These are some salient examples. Of course there are others.
You're a smart guy. I can't understand why you're being so nit-picky. It's not helpful.
gwern might be a smart guy, but he is below average at charitably interpreting opposing arguments; at least, this is my impression based on my interactions with him here and on IRC. It's not an uncommon failing: Eliezer comes across as uncharitable as well, especially when dealing with those perceived as lower status (he was very, very charitable to Karnofsky).
Of course, the impression of uncharitability (uncharitableness? is it even a word?) is often given off when the person is a few levels above you: they quickly go through the most charitable interpretations of your argument in their head, realize that those are all wrong as well, and reject the argument without explicitly discussing why the charitable versions are no better than the uncharitable ones. I don't know how to tell the difference.
Obviously not. Please apply the principle of charity. These are some salient examples. Of course there are others.
Of course there are others, but I am not interested in arguing by anecdote especially when the anecdotes don't seem to support your thesis. (Seriously, of all the scandals you had to pick the Lewinsky scandal?) What exactly am I supposed to be applying charity to here? Do you have any systematic, concrete, empirical data that supports your claim that factchecking improves politics?
Have you noticed it helping?
Maybe these were not well organized enough or didn't reach a critical mass.
There are related organizations like VroniPlag (explained on Wikipedia) which did have a very notable effect, at least in Germany. These specialize in pointing out very grave errors in doctoral theses - esp. plagiarism - and so can and do have significant consequences for the subject under scrutiny.
I think if you could reach a significant mass this could work.
Maybe these were not well organized enough or didn't reach a critical mass.
How were they not well-organized? Why do you think this sort of phenomenon has any sort of 'critical mass' effect to it? And why would any future effort not be doomed to fail to reach the critical mass, just as all the past ones apparently did?
These specialize in pointing out very grave errors in doctoral theses - esp. plagiarism - and so can and do have significant consequences for the subject under scrutiny.
If that's the best you can point to, that does not fill me with hope. When are political questions ever as clear as copy-paste plagiarization? That is not a success story, that's something that fills me with horror - things are even worse than I thought:
Most of these revocations have held up in court. However, some universities disagreed with VroniPlag's findings, even in cases of blatant plagiarism (between 40 and 70% of pages affected by plagiarism). The correct methods for dealing with plagiarism - and its prevention - remain an ongoing discussion in Germany.
And you hope factchecking can make a difference in real politics?!
Well. Politics is the mind-killer. Surely such a fact-checking site would be prone to all the hacks politics can muster to ''limit'' its effect. Wikipedia and VroniPlag are good (really: illustrative) examples of this.
Do I have ''hope''? My post wasn't about hope; it was intended to point out structures with 'critical mass' that did have an effect. One can learn from them: how to build on these structures and tweak their logic to maybe achieve a better result.
A critical mass is, in my opinion, always needed to have any noticeable effect, because local uncoordinated efforts are dealt with by the self-stabilizing effects of the existing norms (political powers can use e.g. regression toward the mean, coordinated salami tactics, fogging and noise).
Politics is the mind-killer. Surely such a fact-checking site would be prone to all the hacks politics can muster to ''limit'' its effect.
Not to mention the fact-checkers themselves are subject to being mind-killed.
This is kind of a tangent to the subject, but seeing someone bring up Critical Rationalism on Less Wrong still brings up some pretty massive negative associations for me. By far the majority of all the mentions of Critical Rationalism on this site, due to monomania and prolific rate of posting, come from the author of the most downvoted post ever to appear on Main, and possibly the most disruptive user ever to frequent the site.
I'm thinking that the website should be strongly devoted to neutrality or objectivity, as is Wikipedia.
On Wikipedia, part of being objective means accurately reporting what's generally known about a topic and not engaging in original research. I think you will have a hard time judging fallacies without people engaging in something like original research. Fox News has its own brand of "fair & balanced" which isn't exactly the same thing that most people think of when they hear the phrase.
What kind of objectivity do you want for your website?
The status quo bias of one person is another person's Chesterton's fence.
In a better coordinated world in which more people cared about truth there would exist certification organizations that, for a fee, would read an article and if the article met certain standards would issue an "honest argument" certification that could be displayed on the article. Having such a certification would, ideally, attract more viewers giving the author more advertising revenue which, also ideally, would more than pay for the cost of certification.
I've had the same idea. Such certification organizations could also certify e.g. ads. This could potentially bring in a lot of profit if the certification organization had a sufficiently good reputation, since companies have the money to pay for such certificates. (It would be important, though, that the certification organization weren't more lenient on companies that paid more, since that would ruin its reputation.)
Monolithic vs subjective: As pointed out, it's hard to gather everyone's input into a single result. Rather than have a single fallacy/not-fallacy rating, let each user express (and own) whether a statement is fallacious. In the usual case the result would be something like "95.4% of people think this is a false dichotomy". However, there is valuable information in cross-correlating which arguments pass which evaluator. You could have functionality to "ignore all evaluators who think this is a fair argument". People could also build reputations as quality evaluators. There is a problem/feature in that the standard of an evaluator need not be rigor: you could, for example, have a prominent evaluator for each major political leaning. Or you could aggregate the information by cross-referencing proclaimed political identity, i.e. "65% of self-identified democrats think this argument is fair".
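The aggregation described above could be sketched as follows. This is a minimal illustration, not a design: the `Evaluation` class, the `summarize` function, and the sample verdicts are all invented for the example.

```python
# Hypothetical sketch of the per-evaluator rating scheme described above.
# All names (Evaluation, summarize, the sample data) are invented.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Evaluation:
    evaluator: str    # who made the judgment
    affiliation: str  # self-identified leaning, e.g. "democrat"
    verdict: str      # e.g. "false dichotomy" or "fair"

def summarize(evaluations):
    """Aggregate verdicts overall and broken down by affiliation."""
    total = len(evaluations)
    overall = Counter(e.verdict for e in evaluations)
    by_group = {}
    for e in evaluations:
        by_group.setdefault(e.affiliation, Counter())[e.verdict] += 1
    return {
        "overall": {v: n / total for v, n in overall.items()},
        "by_group": {g: {v: n / sum(c.values()) for v, n in c.items()}
                     for g, c in by_group.items()},
    }

evals = [
    Evaluation("alice", "democrat", "false dichotomy"),
    Evaluation("bob", "republican", "fair"),
    Evaluation("carol", "democrat", "false dichotomy"),
    Evaluation("dave", "republican", "false dichotomy"),
]
result = summarize(evals)
# result["overall"]["false dichotomy"] == 0.75
# result["by_group"]["democrat"]["false dichotomy"] == 1.0
```

Because each verdict is stored per evaluator rather than merged into one score, the "ignore all evaluators who think X is fair" filter is just a selection over the same records.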
Applicability vs context: Being able to target already-produced texts means the tool would have wide applicability. However, I am a little concerned about selection effects on what makes it in as a "thing to scrutinize". This kind of thing would be effective on small isolated arguments. However, politicians who fit their arguments to the situation they are presented in could be wrongly represented by being judged outside of that speech situation. Maybe they know that there are better/more valid arguments for their position but choose to utter those they know their audience can relate to. Bringing those arguments under close scrutiny would be to partly miss the point. I guess part of the idea would be to apply pressure to always use arguments that could pass harsher standards? However, I can see many downsides to that. I would rather have all the arguments to be processed be explicitly (re)created in the context of the website. Then it would be clear that everybody involved respects the clean-play attitude and that the arguments are meant to be elaborate and precise. This could mean that only the core and essential points would be covered. That is, it would not be a witch hunt to harass other media but an internal matter.
Explicitness vs summary score: I would have each argument input in a special language/notation that forces every argument to be explicit and computer-readable. The arguments would not be prose but collections and networks of semantic tokens. This would provide natural-language independence: French and English users would render the tokens in their own language, but they would be manipulating the exact same tokens, so when someone makes a claim in French it would be accessible to the English user too. With the guarantee of computer-readability you could do things like compare the axioms of two users and point out where they contradict; at such a point a discussion is possible. You could then track how often those discussions shifted opinions and which arguments were effective on which populations/belief bases. (This could easily be turned into a tool for anti-knowledge seeking, testing which manipulations work best.) If such a reduction is not done, the meaning of any end result will be a bit nebulous: its meaning would depend on the process by which it was produced, and it would mask the approval of a group in the guise of inarguable numeric data. If there is a clear vision of what "clean play" consists of, it could be useful, but I doubt there is a single axis so critically important to track. I would rather have metrics that tell you something but don't give a conclusion than reach a conclusion I am not sure what it tells me.
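The axiom-comparison step above can be illustrated with a toy sketch. The token notation (a `~` prefix marking negation) and the `negate`/`contradictions` helpers are my own invented placeholders for whatever real argument language the site would define:

```python
# Toy sketch of comparing two users' axiom sets in a machine-readable
# argument notation. Tokens are opaque strings; "~" marks negation.
# The notation and helper names are invented for illustration.

def negate(token: str) -> str:
    """Flip a token's negation marker."""
    return token[1:] if token.startswith("~") else "~" + token

def contradictions(axioms_a: set, axioms_b: set) -> set:
    """Return the tokens on which the two users directly disagree,
    i.e. one asserts a token whose negation the other asserts."""
    return {t for t in axioms_a if negate(t) in axioms_b}

alice = {"tax_cuts_raise_revenue", "~deficit_matters"}
bob = {"~tax_cuts_raise_revenue", "~deficit_matters"}
points_of_dispute = contradictions(alice, bob)
# points_of_dispute == {"tax_cuts_raise_revenue"}
```

The site could then open a focused discussion on each disputed token, and, as suggested above, log how often those discussions shift either user's axiom set.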
I'm thinking that the website should be strongly devoted to neutrality or objectivity, as is Wikipedia.
The problem is that Wikipedia isn't that good at finding the truth about controversial topics.
The problem is that Wikipedia isn't that good at finding the truth about controversial topics.
What is?
Such a website would probably be a good idea -- you don't need to have everyone using it, you just need a good-sized audience. Snopes and PolitiFact are doing all right.
To my mind, the interesting question is choosing a manageable task that will make it easy for the site to grow.
This idea really isn't that original. Philosophers, journalists, and others have done this for quite some time, and this has, I'm confident, had significant positive effects on the public debate. Religious people were forced to improve their arguments as a result of philosophical criticism from Hume and others, for instance. Also, even if you might think that the political debate is bad, imagine how bad it would have been if there were no journalists reporting relatively objectively on politics.
My suggestion is thus just to make existing efforts at objective reporting and criticism more systematic and comprehensive: it's not a qualitative leap, just a quantitative shift. I don't see why that would be impossible in principle (though it may be hard in practice) or why it would have no effect, given how much our present institutions for objective criticism have actually achieved (in my view).
I have a very good friend who has taught collegiate-level debate for forty years. Just before he retired, he ran an experiment in which he and his students did what you are proposing here: they pointed out, listed, highlighted and rebutted the various forms of argumentative cheating on Facebook and Twitter and watched what happened. The result? Of the thirty students in one class, seventeen were banned from groups and had their friends lists drop to below 50 people. The remaining students found that their followers dropped, that they received fewer overall views, and that they became targets of more and more abusive comments, with threads that would begin as discussions and quickly devolve into that all-time internet favorite, the argumentum ad hominem. That pattern held across eight classes and 240 students (give or take). I grant that one professor's off-the-cuff experiment does not equal solid research, but it is perhaps a bit disheartening.
And that leaves out both cognitive dissonance and the backfire effect. No one wants to be proven wrong, and many will fight for their worldviews and the facts that fit them, even facts that are not facts.
And given that people also frequently admit to not using the (biased but) existing research tools that are already out there, why would they bother to search for and use a research tool designed for something like this? I can see such a site rapidly being used as a weapon by one side of an argument and being shouted down as "fake news" by the other.
Also, just as an aside, I am a student at Duke University School of Medicine, and we are absolutely FORBIDDEN to use Wikipedia or any other crowd-sourced knowledge base as a source in our own research. As one of the administrators pointed out, at least eleven of the professorial staff and hundreds of the students contributed to the wiki at any given time, and it was a common hobby to go rewrite a rival's work.
There have been many proposals like this before. My favorite idea (which I cannot recall the name of right now) was a browser plugin that would overlay annotations onto arbitrary webpages. People could make it highlight certain questionable bits of text, link to opposing viewpoints or data, and discuss with each other whether the thing was accurate. Imagine a wiki talk page, but for every conceivable site.
There have been several of these. Thiblo is one; it no longer exists. I remember using another one once, but I can't remember what it was now.
There is also that thing that allows for arbitrary pictures to be drawn over arbitrary webpages. I can't remember what it was called either but it was mostly used for low-quality Homestuck porn.
a browser plugin that would overlay annotations onto arbitrary webpages. People could make it highlight certain questionable bits of text, link to opposing viewpoints or data, and discuss with each other whether the thing was accurate.
And how would it deal with spam?
I doubt that a browser plugin which festoons each page with giant INCREASE YOUR MANHOOD "annotations" is going to be popular.
I dunno. CAPTCHAs, plus community policing? I don't remember whether it got off the ground so for all I know that might have killed it anyway.
The public debate is rife with fallacies, half-lies, evasions of counter-arguments, etc. Many of these are easy to spot for a careful and intelligent reader/viewer - particularly one who is acquainted with the most common logical fallacies and cognitive biases. However, most people arguably often fail to spot them (if they didn't, then these fallacies and half-lies wouldn't be as effective as they are). Blatant lies are often (but not always) recognized as such, but these more subtle forms of argumentative cheating (which I shall use as a catch-all phrase from now on) usually aren't (which is why they are more frequent).
The fact that these forms of argumentative cheating are a) very common and b) usually easy to point out suggests that impartial referees who painstakingly pointed out these errors could do a tremendous amount of good for the standards of the public debate. What I am envisioning is a website like factcheck.org but which would not focus primarily on fact-checking (since, like I said, most politicians are already wary of getting caught out with false statements of fact) but rather on subtler forms of argumentative cheating.
Ideally, the site would go through election debates, influential opinion pieces, etc., more or less line by line, pointing out fallacies, biases, evasions, etc. For the reader who wouldn't want to read all this detailed criticism, the site would also give an overall rating of the level of argumentative cheating (say from 0 to 10) in a particular article, televised debate, etc. Politicians and others could also be given an overall cheating rating, which would be a function of their cheating ratings in individual articles and debates. Like any rating system, this system would serve both to give citizens reliable information on which arguments, which articles, and which people are to be trusted, and to force politicians and other public figures to argue in a more honest fashion. In other words, it would have both an information-disseminating function and a socializing function.
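The per-person rating described above could be as simple as a weighted average of per-item scores. A minimal sketch, with the caveat that the weighting scheme (here, by recency or prominence) is my assumption; the proposal only says the overall rating is some function of the individual ratings:

```python
def cheating_rating(item_scores, weights=None):
    """Combine per-article/per-debate cheating scores (each 0-10)
    into one overall 0-10 rating. The weights are a hypothetical
    knob for recency or audience size; unweighted by default."""
    if weights is None:
        weights = [1.0] * len(item_scores)
    assert len(weights) == len(item_scores)
    total_weight = sum(weights)
    return sum(s * w for s, w in zip(item_scores, weights)) / total_weight

# A politician scored 8, 6 and 2 in three debates, with the most
# recent debate weighted twice as heavily:
rating = cheating_rating([8, 6, 2], weights=[1, 1, 2])
# rating == 4.5
```

A weighted average keeps the overall rating on the same 0-10 scale as the per-item scores, so readers can interpret both numbers the same way.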
How would such a website be set up? An obvious suggestion is to run it as a wiki, where anyone could contribute. Of course, this wiki would have to be very heavily moderated - probably more so than Wikipedia - since people are bound to disagree on whether controversial figures' arguments really are fallacious or not. Presumably you would be forced to banish trolls and political activists on a grand scale, but hopefully this wouldn't be an insurmountable problem.
I'm thinking that the website should be strongly devoted to neutrality or objectivity, as is Wikipedia. To further this end, it is probably better to give the arguer under evaluation the benefit of the doubt in borderline cases. This would be a way of avoiding endless edit wars and ensuring objectivity. Also, it's a way of making the contributors to the site concentrate their efforts on the more outrageous cases of cheating (of which there are many in most political debates and articles, in my view).
The hope is that a website like this would make the public debate transparent to an unprecedented degree. Argumentative cheaters thrive because their arguments aren't properly scrutinized. If light is shone on the public debate, it will become clear who cheats and who doesn't, which will give people strong incentives not to cheat. If people respected the site's neutrality, its objectivity and its integrity, and read what it said, it would in effect become impossible for politicians and others to bullshit the way they do today. This could mark the beginning of the realization of an old dream of philosophers: The End of Bullshit at the hands of systematic criticism. Important names in this venerable tradition include David Hume, Rudolf Carnap and the other logical positivists, and not least the guy whose statue stands outside my room, the "critical rationalist" (an apt name for this enterprise) Karl Popper.
Even though politics is an area where bullshit is perhaps especially common, and one where it does an exceptional degree of harm (e.g. vicious political movements such as Nazism are usually steeped in bullshit), it is also common and harmful in many other areas, such as science, religion, and advertising. Ideally, critical rationalists should go after bullshit in all areas (as far as possible). My hunch is, though, that it would be a good idea to start off with politics, since it's an area that gets lots of attention and where well-written criticism could have an immediate impact.