Wikipedia assignment: Is ‘WikiTrust’ the next step towards a more reliable online-encyclopaedia?

On: September 20, 2009
About Sander Leegwater
Hi, I'm Alexander (or Sander for friends) Leegwater – a Multimedia Designer with a Bachelor's in Interactive Media from the Hogeschool van Amsterdam – currently working on my Master's in (New) Media at the Universiteit van Amsterdam. Besides my studies, I work as a part-time front-end web-developer at


Once again the online encyclopaedia Wikipedia is in the spotlight; after years of praise, critique and academic studies – covering several distinct and overlapping aspects of Wikipedia – the encyclopaedia has once again decided to try to strengthen its authority on the internet. This should be achieved (somewhere this fall [1]) by enhancing Wikipedia’s reliability through the use of coloured text. The technique, called WikiTrust, is a project of professor Luca de Alfaro [2] of the University of California, Santa Cruz, and “is part of ongoing work [..] on reputation systems, online collaboration, and information trust.” The name ‘WikiTrust‘ can be a little misleading, though, since the system has no way of telling how trustworthy a text really is. “‘It can only measure user agreement,’ said de Alfaro. ‘That’s what it does [2].'”

Fig. 1: Cross' example of colorized text.

The idea of using coloured text in Wikipedia’s articles to signal reliability isn’t exactly new. Tom Cross, who holds “.. a B.S. in Computer Engineering from the Georgia Institute of Technology”, wrote the 2006 article ‘Puppy smoothies: Improving the reliability of open, collaborative wikis‘, published in the online, open, peer-reviewed journal First Monday. In it he argues that the text of Wikipedia’s articles should be coloured according to its venerability, that is, how long the text has survived in an article [Fig. 1]. His “.. proposal relies on the philosophy that bad information is less likely to survive a collaborative editing process over large numbers of edits (Cross, 2006).” This should raise users’ confidence in Wikipedia, because a visual cue shows how venerable a particular piece of text is, allowing readers to appraise the worth of an article. Cross’ thoughts were echoed by Virgil Griffith, “a graduate student at the California Institute of Technology“, who programmed the WikiScanner in 2007 to increase the reliability of Wikipedia’s more ‘controversial’ topics. In his original WikiScanner FAQ, Griffith wrote: “Overall–especially for non-controversial topics–Wikipedia seems to work [..] As for other approaches, I think colored text is a promising direction for combating disinformation in wikipedia (2008).”
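Cross’ venerability idea can be illustrated with a minimal sketch; note that the function name, the grey-scale rendering and the cut-off of twenty revisions are my own illustrative choices, not anything from Cross’ paper, which only argues that longer-surviving text deserves more visual confidence.

```python
def venerability_color(revisions_survived, max_revisions=20):
    """Map how many revisions a text fragment has survived to a grey level:
    freshly added text is shown light, long-lived text near-black."""
    age = min(revisions_survived, max_revisions) / max_revisions
    level = int(255 * (1.0 - age))  # 255 = brand new, 0 = very venerable
    return f"#{level:02x}{level:02x}{level:02x}"
```

A brand-new insertion would render as white (`venerability_color(0)` gives `#ffffff`), while text that has outlived twenty or more edits renders as black, mirroring the darker text in Cross’ example [Fig. 1].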

Now, in 2009, the basis of Cross’ brainchild will have a test run in the form of WikiTrust [Fig. 2], which is essentially an extension of MediaWiki – the content management system that is the ‘engine’ of Wikipedia and many other wikis – “that implements an author reputation system, and a text trust system, for wikis. WikiTrust adds to a wiki a check text tab that enables any visitor to check the author, origin, and reliability of wiki text. Thus, visitors can easily spot spam, surreptitious changes, and information tampering [3].” WikiTrust works in real time, meaning that edits are analysed as users are typing (suddenly beginning to feel a little paranoid while writing this…), and from them it calculates the reputation of the author and the trust and origin of the text. According to the Wiki it should have the following functionalities:

Fig. 2: WikiTrust coloured text example.

  • Text author: The author of every word in a text is computed by an algorithm which should be able to resist “cut-and-paste, delete-and-reinsert, and most types of attacks ([WikiTrust developers even] claim, all attacks — try your hand at it and let [them] know!).” By clicking the “check text tab, when the mouse pointer hovers over a word, [the original] author of the word [3]” is shown in a small pop-up on the screen.

  • Text origin: By keeping track of authors, WikiTrust is also (supposed to be) able to keep track of the revision history of every word – when it was introduced and, more importantly, by whom – which allows all edits to be examined (again, in the check text tab) and gives access to information on the author of each edit.
  • Text trust: Trust of parts of text will be calculated through the ‘reputation’ of the original author and all other users who have made revisions to it, the ‘trust outcome’ will then be displayed (again, in the check text tab) “via text background colors [..]: the background is white for high-trust text, and shades of orange that are the stronger, the lower the text trust [3].”
  • Author reputation: The reputation of the author will be calculated from the evolution of the content – “authors who provide lasting contributions gain reputation, while authors whose contributions are reverted in short order lose reputation” – in doing so, the hopes are that the system of reputation will provide “an incentive towards constructive behaviour [3].”
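The interplay of the last two bullets – reputation feeding trust, trust rendered as a background shade – can be sketched as follows. This is a toy model under my own assumptions (the numeric constants, function names and colour interpolation are illustrative), not WikiTrust’s actual algorithm:

```python
def update_reputation(reputation, edit_survived):
    """Authors gain reputation when a contribution lasts, and lose it
    (faster) when the contribution is reverted in short order."""
    if edit_survived:
        return reputation + 1.0
    return max(0.0, reputation - 2.0)  # reputation never drops below zero

def word_trust(author_reputation, reviser_reputations):
    """A word's trust grows with the reputation of its original author
    and of every user whose revision left the word in place."""
    total = author_reputation + sum(reviser_reputations)
    return min(1.0, total / 10.0)  # clamp trust to the range [0, 1]

def background_shade(trust):
    """White background for high-trust text; the lower the trust,
    the stronger the orange, as in the WikiTrust colouring [3]."""
    r = 255
    g = int(165 + (255 - 165) * trust)  # interpolate orange -> white
    b = int(255 * trust)
    return f"#{r:02x}{g:02x}{b:02x}"
```

In this sketch a word vetted by several high-reputation editors reaches trust 1.0 and is shown on a white background (`#ffffff`), while text no reputable user has confirmed sits at trust 0.0 and gets a full orange background (`#ffa500`).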

“The goal of the WikiTrust project is to facilitate online content creation and sharing. WikiTrust flags recent content changes that need scrutiny, and offers the text-tracking tools to investigate the context in which such changes were made. WikiTrust strives to benefit wiki authors, editors, and visitors [3].” The question, of course, remains: will these developments enhance Wikipedia’s reliability? In my opinion they will, for Wikipedia’s reliability and credibility have already been assessed as moderate to high (Chesney, 2006). And in “.. an expert-led investigation carried out by Nature — the first to use peer review to compare Wikipedia and [the famous English encyclopaedia Britannica’s] coverage of science (Giles, 2005) –” the experts found that Wikipedia’s articles came close to Britannica’s in terms of accuracy. Of course, technical extensions like WikiTrust will not automatically solve every issue within Wikipedia, but every improvement gets us one step closer to a better version. It seems to me that the academic community is still very ‘charmed’ by Wikipedia – several studies mention that it makes great background reading on many topics – but using it as an authoritative source remains questionable.

Sources & further reading:

[1] Leggett, H., 2009. ‘Wikipedia to Color Code Untrustworthy Text’, Wired, August 30.

[2] Claburn, T., 2009. ‘Wikipedia Considers Coloring Untested Text’, InformationWeek, August 31.

[3] WikiTrust, Wiki:Main_Page,
