
Wikipedia's newest editors aren't even human

The new AI system will spot errors to help reduce the workload for its human editors.

WIKIPEDIA IS NOW relying on artificial intelligence (AI) to help it keep track of bad edits on its site.

More than half a million changes are made to Wikipedia’s articles every day, and they are usually monitored by a large number of volunteers who ensure that edits are correct and factual. Since each edit has to be reviewed manually, the organisation is looking to AI to help out.

The Objective Revision Evaluation Service (ORES) is said to be like “a pair of X-ray specs”, highlighting potentially damaging edits.

“By combining open data and open source machine learning algorithms, our goal is to make quality control in Wikipedia more transparent, auditable and easy to experiment with”, says the Wikimedia Foundation in a blog post.

“Our hope is that ORES will enable critical advancements in how we do quality control – changes that will both make quality control work more efficient and make Wikipedia a more welcoming place for new editors.”

The system is trained on edit and article quality assessments made by the site’s editors, and it assigns scores to both edits and articles. The AI judges the quality of an edit by looking at the context of the change and the language used.
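The Wikimedia Foundation doesn’t detail its features here, but the approach described – scoring an edit from the language and context of the change – can be sketched with a toy heuristic. Every feature name, word list, and weight below is an illustrative assumption for this sketch, not the real ORES model:

```python
import re

# Illustrative "problem word" list -- an assumption for this sketch,
# not the vocabulary ORES actually uses.
PROBLEM_WORDS = {"stupid", "dumb", "fake"}

def edit_features(old_text: str, new_text: str) -> dict:
    """Extract simple language/context features from a revision change."""
    old_words = set(re.findall(r"\w+", old_text.lower()))
    new_words = re.findall(r"\w+", new_text.lower())
    added = [w for w in new_words if w not in old_words]
    return {
        # Context: how much text the edit removed.
        "chars_removed": max(len(old_text) - len(new_text), 0),
        # Language: newly added words from the problem list.
        "problem_added": sum(w in PROBLEM_WORDS for w in added),
        # Language: shouting (high proportion of capital letters).
        "caps_ratio": (sum(c.isupper() for c in new_text) / len(new_text)
                       if new_text else 0.0),
    }

def damage_score(old_text: str, new_text: str) -> float:
    """Return a 0..1 'possibly damaging' score (hand-tuned toy weights)."""
    f = edit_features(old_text, new_text)
    score = (0.25 * f["problem_added"]
             + 0.002 * f["chars_removed"]
             + 1.0 * max(f["caps_ratio"] - 0.3, 0.0))
    return min(score, 1.0)

good = damage_score("The cat sat on the mat.",
                    "The cat sat on the soft mat.")
bad = damage_score("The cat sat on the mat.",
                   "THIS ARTICLE IS FAKE AND STUPID")
print(good, bad)  # the benign edit scores low, the vandalism scores high
```

In the real service the hand-tuned weights would instead be learned by a machine-learning model trained on editors’ past quality judgements, which is the training step the article describes.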

The idea is that by using AI to catch obvious errors and damaging changes, the service will take some of the workload off current volunteers and encourage new ones to sign up and contribute.

Alongside the introduction of ORES, the organisation is working on three further improvements to the service, including classifying edit types and detecting bias.

