If it's worth saying, but not worth its own post, then it goes here.
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
3. Open Threads should start on Monday, and end on Sunday.
4. Unflag the two options "Notify me of new top level comments on this article" and "
Maybe this has been discussed ad nauseam, but what do people generally think about Facebook being an arbiter of truth?
Right now, Facebook does very little to vet content; it mostly just distributes it. The company has faced criticism for letting fake news spread on the site, it doesn't promote corrections to articles that have been retracted, and it has only just added a "contested" flag that's less informative than Wikipedia's.
So the questions are: does Facebook have any responsibility to label or monitor content, given how much of it the platform delivers? If so, how? If not, why doesn't this great power (showing you anything you want) come with great responsibility? Finally, if you were building a site from the ground up, how would you design around the problem of spreading false information?
Let's try to frame this with as little politics as possible...
You build a medium where people can exchange content. Your original goal is to make money, so you want to make it as popular as possible -- in the ideal case, the Schelling point for anyone debating anything.
But you notice that certain messages, optimized for virality, make up a disproportionate fraction of your content. You don't like this... either because you realize you actually have values beyond "making money"... or because you realize that in the long term this could have a negative impact...
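To make the "disproportionate fraction" point concrete, here is a toy simulation -- a minimal sketch in Python, where the reshare probabilities and the compounding model are made-up assumptions for illustration, not data about any real platform. The idea: give a small minority of posts a modestly higher per-impression reshare rate, and compounding alone hands them an outsized share of total impressions.

```python
# Toy model: why a feed that amplifies reshares over-represents
# virally optimized content. All numbers are illustrative assumptions.
import random

random.seed(0)

# 95 ordinary posts and 5 "optimized for virality" posts; the only
# difference is the per-impression probability of being reshared.
POSTS = [{"kind": "ordinary", "reshare_p": 0.05} for _ in range(95)] + \
        [{"kind": "viral", "reshare_p": 0.15} for _ in range(5)]

def simulate(posts, rounds=20):
    """Each round, every copy of a post in circulation may spawn new copies."""
    copies = [1] * len(posts)  # impressions in circulation per post
    for _ in range(rounds):
        for i, p in enumerate(posts):
            # Each existing copy independently gets reshared with probability reshare_p.
            copies[i] += sum(random.random() < p["reshare_p"] for _ in range(copies[i]))
    return copies

copies = simulate(POSTS)
total = sum(copies)
viral = sum(c for c, p in zip(copies, POSTS) if p["kind"] == "viral")
print(f"Viral posts: {5 / 100:.0%} of posts, {viral / total:.0%} of impressions.")
```

With these made-up numbers, the 5% of posts optimized for virality typically end up with something like a quarter of all impressions; the exact figure varies with the random seed, but the disproportion itself is robust, since even a small per-impression advantage compounds every round.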