De-botting Wikipedia

On: September 20, 2009
About Kimberley Spreeuwenberg
Currently I am a Master’s student in New Media at the UvA. In 2007 I graduated in graphic design from ArtEZ, Arnhem, where I was introduced to some ‘grande’ theorists like McLuhan and Manovich. After working as a graphic designer for a year, I decided to expand my knowledge of the media I use as a professional designer and of the way these media influence society. My interests in media are broad, but I am especially focused on the Internet and Internet culture. I still work as a graphic designer, and in my assignments I combine low- and high-technology tools (analogue and digital techniques). Visit my site!


De-botting Wikipedia. What would Wikipedia look like without the bots?

Wikipedia is praised for its open “everybody can contribute” system and its collaborative knowledge production. It is an environment seemingly built by human editors, but one in which bots in fact do much of the work. Since 2002, Wikipedia entries have been maintained not only by humans, but also by bots and by humans assisted by administrative and monitoring tools. [1]

Bots are automated or semi-automated tools that carry out repetitive and mundane tasks. [2] Users can create their own bots, but this requires some prior programming knowledge. [3] In the early days of Wikipedia, some bots imported whole entries: the so-called rambot, for example, created approximately 30,000 city articles. [4] There are now also bots that check spelling, layout and even interwiki links. Other bots, like ClueBot, make sure that certain forms of vandalism are reverted within a second. [5] The idea that only humans edit, and in some way control, the content of Wikipedia entries can therefore be questioned. Sabine Niederer points out, in her study of the technicity of Wikipedia’s content, that bots have more permissions than registered users. She argues that “this raises questions about their tasks, and the dependency of Wikipedia on these nonhuman content agents”. [1]
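To give a sense of what such a “repetitive and mundane task” looks like in practice, here is a minimal sketch of a spelling-fixing pass over wikitext. The typo list and sample text are invented for illustration; real bots are typically built on frameworks such as Pywikibot and work against the live MediaWiki API rather than on local strings.

```python
import re

# Illustrative only: a tiny dictionary of common misspellings.
# Real spell-checking bots use much larger, community-vetted lists.
COMMON_TYPOS = {
    "recieve": "receive",
    "occured": "occurred",
    "teh": "the",
}

def fix_typos(wikitext: str) -> str:
    """Replace whole-word occurrences of known misspellings."""
    for wrong, right in COMMON_TYPOS.items():
        wikitext = re.sub(r"\b%s\b" % re.escape(wrong), right, wikitext)
    return wikitext

sample = "Climate change has occured and teh effects are measurable."
print(fix_typos(sample))
```

The whole-word regex boundary (`\b`) matters: without it, a naive replace would mangle words that merely contain a misspelling as a substring.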

Until now, research has focused mainly on the reliability and the open editing system of Wikipedia. Since Wikipedia is dependent on bots, Niederer proposes another approach to Wikipedia research. She asks how dependent the various user groups and the Wikipedia content are on the (underlying) technology. She argues that “the technology helps shape the content, not only by a system of notifications and tools, but also by Wikipedia’s non-human content agents, the bots,” and proposes a study of Wikipedia according to the technicity of its content. [1]

Small Research
Building on these ideas, Anne Helmond and I researched how much of the editing within a particular entry (in this case Climate Change) is done by humans in relation to bots. An interesting finding was that 53% (180 out of 339) of the bot edits were in fact made by humans assisted by tools. [5] Our research also showed that anti-vandalism bots play a prominent role in the Climate Change entry. To get a sense of what Wikipedia would look like without these bots, we made a small animation (see bottom of post) of the Climate Change article, in which we reverted the anti-vandalism bot edits and highlighted the vandalism. [6]
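The kind of classification described above can be sketched roughly as follows. The revision records and usernames below are invented sample data; an actual study would fetch an entry’s revision history from the MediaWiki API (`action=query&prop=revisions`), and the heuristics here (bot accounts end in “Bot”, tool-assisted edits mention their tool in the edit summary) are simplifications of what we did by hand.

```python
# Invented sample revisions mimicking the fields a revision query returns.
revisions = [
    {"user": "ClueBot", "comment": "Reverting possible vandalism"},
    {"user": "AliceEditor", "comment": "added citation"},
    {"user": "SmackBot", "comment": "Date formats per manual of style"},
    {"user": "BobEditor", "comment": "Reverted edits using Twinkle"},
]

def classify(rev):
    """Crude heuristic: bot accounts by name, tool-assisted humans by summary."""
    if rev["user"].lower().endswith("bot"):
        return "bot"
    if "Twinkle" in rev["comment"]:
        return "human assisted by tool"
    return "human"

counts = {}
for rev in revisions:
    kind = classify(rev)
    counts[kind] = counts.get(kind, 0) + 1
print(counts)
```

Counting edits per category in this way is what allows statements like “53% of the bot edits were made by humans assisted by tools” to be made for a given entry.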

(This research took place during the Digital Methods Initiative Summer School of 2009.)

More Questions
Other questions that build on this line of thinking are formulated by the CPOV: Critical Point of View. “To what extent has bot politics triumphed over vernacular expertise or lead to an empowerment of the e-tech geeks in knowledge projects? Related to this is the question of the cultural history of Wikipedia as a platform. What is the relation between policy formation and technical protocols? Is Wikipedia knowledge Cybernetic?” [7]

[1] Niederer, Sabine (2009). “Wikipedia and the Composition of the Crowd,” unpublished ms.

The animation shows the edits that have been removed by (anti-vandalism) bots over time.
