In an attempt to improve the quality of its articles, Wikipedia has started using an artificial intelligence engine to automatically analyse edits made to the site.

The AI project is called the Objective Revision Evaluation Service (ORES), and according to the Wikimedia Foundation, the non-profit organisation that operates Wikipedia, ORES will highlight potentially damaging edits for editors, helping them separate vandalism from genuine contributions.

The service is built on scikit-learn, an open-source collection of machine learning algorithms, which it uses to identify likely vandalism.
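As a rough illustration only, the sketch below shows how an edit classifier of this general kind can be put together with scikit-learn. The per-edit features and training data are invented for this example and are far simpler than the signals the real service extracts from each revision.

from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical per-edit features, invented for illustration:
# [bytes added, bytes removed, profane words added, anonymous editor (1/0)]
X_train = [
    [350, 10, 0, 0],   # ordinary addition by a logged-in editor
    [5, 4200, 0, 1],   # large anonymous blanking of a section
    [12, 0, 3, 1],     # small anonymous edit adding profanity
    [800, 30, 0, 0],   # sourced expansion of an article
    [2, 15, 1, 1],     # minor anonymous edit with one flagged word
    [120, 80, 0, 0],   # routine copy-edit
]
y_train = [0, 1, 1, 0, 1, 0]  # 1 = likely damaging, 0 = good-faith

# Gradient-boosted decision trees, one of the ensemble methods scikit-learn provides
model = GradientBoostingClassifier(random_state=0)
model.fit(X_train, y_train)

# Score a new, unseen edit and print the estimated probability that it is damaging
new_edit = [[3, 2600, 2, 1]]
print(model.predict_proba(new_edit)[0][1])

In practice the interesting work is in the features rather than the classifier: turning a raw revision diff into signals such as those above is what lets a model of this sort flag suspicious edits for human review rather than reject them outright.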

The foundation expects the new tool to make life easier for the volunteer editors who look out for damaging changes, and it hopes the service will also help attract new editors.

Wikimedia said: "Our hope is that ORES will enable critical advancements in how we do quality control — changes that will both make quality control work more efficient and make Wikipedia a more welcoming place for new editors.

"We’ve been testing the service for a few months and more than a dozen editing tools and services are already using it.

"We’re beating the state of the art in the accuracy of our predictions. The service is online right now and it is ready for your experimentation."

Wikipedia crowdsources its articles, but it enforces strict rules on who can change its most high-profile pages in order to maintain their quality.