Unreliable Wikipedia Edits Lead to Color-Coded Text

By Dennis Faas

Wikipedia is planning to introduce a feature that will color-code parts of its articles to show how reliable the information is likely to be. It's a response to an ongoing problem that has also led to a planned freeze on edits to entries about living individuals.

The WikiTrust feature analyzes the text on a page and ranks each piece of text based on how recently it was edited. This is based on the idea that the longer a piece remains unchanged, the more likely it is that it is trustworthy -- otherwise (it is reasoned), Wikipedia's users would have overturned the edit.

The feature also takes into account the reputation of the author, based on how long their past edits survived before being undone, or whether those edits are still in place.

These two factors are combined into an automated ranking. The background of the text is then highlighted (if necessary): the most trustworthy text is left on a white background, while the rest is shaded orange, growing darker as the text becomes less trustworthy.
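The scheme described above can be sketched in code. This is a minimal, hypothetical illustration, not the actual WikiTrust algorithm: the function names, the 50/50 weighting, and the color interpolation are all assumptions made for the example.

```python
# Hypothetical sketch of a WikiTrust-style scoring scheme (not the real
# WikiTrust implementation): combine how long a fragment of text has
# survived with its author's reputation, then map the combined score to
# a background color between white and dark orange.

def trust_score(revisions_survived, author_reputation, max_revisions=100):
    """Return a trust score in [0, 1], where 1 is most trustworthy.

    revisions_survived: how many later edits left this text untouched.
    author_reputation: 0..1, e.g. the fraction of the author's past
                       edits that were kept rather than reverted
                       (an assumed measure, for illustration only).
    """
    age_factor = min(revisions_survived / max_revisions, 1.0)
    # Equal weighting of the two factors is an arbitrary choice here.
    return 0.5 * age_factor + 0.5 * author_reputation

def background_color(score):
    """White for fully trusted text, shading toward orange as trust falls."""
    # Linearly interpolate from white (255, 255, 255) toward
    # orange (255, 165, 0) as the score drops from 1.0 to 0.0.
    fade = 1.0 - score
    return (255, int(255 - 90 * fade), int(255 - 255 * fade))
```

For example, text that has survived many revisions by a well-regarded author scores near 1.0 and stays white, while a fresh edit from an often-reverted author scores near 0.0 and is shaded deep orange.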

WikiTrust: a Third-Party Effort

WikiTrust is not the work of Wikipedia itself. It's an independently produced tool that can be used on any wiki-based site, of which Wikipedia is simply the most prominent. It's been in existence since late last year.

The tool will be built into Wikipedia later this year. It will be available only to registered users, and visible only if a user deliberately switches it on.

WikiTrust: a Scalable Solution?

The big question with WikiTrust is whether the tool will scale to a site the size of Wikipedia. It may take too much processing power to cover the entire site on an ongoing basis, and though the tool has been tested, it remains uncertain whether the basic concept will hold up at such a scale. (Source: wired.com)

The tool is the latest attempt to strike a balance between maintaining Wikipedia's open, sandbox nature and preventing malicious or simply inaccurate edits by users. That dilemma was heightened earlier this month, when the site announced plans to require that edits to pages about living individuals first be approved by an official Wikipedia editor. (Source: nytimes.com)
