Meta is experimenting with using AI to write Wikipedia articles: https://ai.facebook.com/research/publications/generating-full-length-wikipedia-biographies-the-impact-of-gender-bias-on-the-retrieval-based-generation-of-women-biographies
I personally have a very bad feeling about this. What I'm most afraid of is that it will make it easier to spam the encyclopedia with fake information that looks plausible on the surface and therefore never gets fully fact-checked. It could also create perverse incentives: SEO companies could put out false information online to bias Meta's algorithm and thereby sneak their way into the encyclopedia. The decision to make this open source seems incredibly foolish to me as well, considering how easily a service like this could be misused. (Edit: It has been pointed out to me that making it closed-source wouldn't be great either, since then we would have no idea what it was doing under the hood. Either way I wouldn't be happy, so I'm not sure their choice counts as a point against them.)
Am I overreacting? Is this actually a good thing? Is this actually way worse than I think it is? Who knows!
What are your thoughts on this?
I was thinking more about the humans running the AI, not the AI itself having an advantage in the edit wars. If the project gets special privileges and bypasses the normal (and sometimes painful) oversight by human volunteers, it can end up inserting incorrect or low-value information that is easier to create than it is to improve.
Agreed, if the AI is passing the "Wikipedia contributor" Turing test, then it's all over anyway.