AI Translations Are Adding 'Hallucinations' to Wikipedia Articles
Source: 404 Media
Wikipedia editors have implemented new policies and restricted a number of contributors who were paid to use AI to translate existing Wikipedia articles into other languages, after discovering that these AI translations introduced hallucinations, or errors, into the resulting articles.
The new restrictions show how Wikipedia editors continue to fight to keep the flood of generative AI across the internet from diminishing the reliability of the world's largest repository of knowledge. The incident also reveals how even well-intentioned efforts to expand Wikipedia are prone to errors when they rely on generative AI, and how they're remedied by Wikipedia's open governance model.
The issue in this case starts with an organization called the Open Knowledge Association (OKA), a non-profit organization dedicated to improving Wikipedia and other open platforms.
-snip-
Wikipedia editors investigated how OKA was operating and found that it was mostly relying on cheap labor from contractors in the Global South, and that these contractors were instructed to copy and paste articles into popular LLMs to produce translations.
-snip-
Read more: https://www.404media.co/ai-translations-are-adding-hallucinations-to-wikipedia-articles/
No surprise that this went badly.
The article mentions a job posting from OKA offering $397/mo for up to 40 hours of work a week, translating as many as 20 articles per week. OKA originally instructed its translators to use Grok, but has since changed that recommendation to other chatbots.
Any of which can also hallucinate at any time.