The Guardian recently picked up on a piece of research about the behaviour of bots on Wikipedia in the first 10 years of its history. The research noted that as bots became more common, their rules sometimes came into conflict with each other, resulting in some bots changing or reverting thousands of edits by other bots.
Bots can be incredibly useful. They do a lot of the repetitive and mundane edits that would take humans ages and can be automated reasonably easily. One of the most notable examples is Lsjbot, which has created millions of Wikipedia stub pages in Swedish and Cebuano (a Philippine language and the mother tongue of the Swedish bot creator’s wife).
This has resulted in Swedish and Cebuano punching far above their weight in terms of the number of articles in their respective languages. The Welsh language Wicipedia has also expanded considerably using automated stub creation. Aside from this, one of the major uses of bots is to protect against vandalism, for example by reverting edits containing swear words.
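The swear-word example above amounts to simple rule-based filtering. As a rough illustration only: modern anti-vandalism bots such as ClueBot NG actually use machine learning, but a naive keyword check can be sketched in a few lines of Python. The `FLAGGED_WORDS` list here is a hypothetical placeholder, not any bot's real word list.

```python
import re

# Hypothetical placeholder word list -- real bots maintain far larger,
# carefully curated lists (or use statistical models instead).
FLAGGED_WORDS = {"vandalword", "spamword"}

def looks_like_vandalism(added_text: str) -> bool:
    """Return True if the text added by an edit contains a flagged word."""
    words = re.findall(r"[a-z']+", added_text.lower())
    return any(w in FLAGGED_WORDS for w in words)

print(looks_like_vandalism("This is vandalword nonsense"))  # True
print(looks_like_vandalism("A normal factual edit"))        # False
```

A bot built this way would run the check against each incoming edit and revert (or flag for human review) anything that matches, which is exactly why purely rule-based bots generate false positives and why policy oversight matters.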
Of course, the work of bots has not gone without criticism. The large-scale creation of articles (on the English language Wikipedia at least) has been restricted and a bot policy has been implemented to regulate the kinds of tasks that bots are allowed to perform. The Wikipedia page about bots states that:
‘Bots are able to make edits very rapidly and can disrupt Wikipedia if they are incorrectly designed or operated. For these reasons, a bot policy has been developed.’
Bots perform mundane but vital tasks such as:
- User:AAlertBot – delivering article alerts to WikiProjects about ongoing discussions.
- User:AnomieBOT – a large variety of tasks; best known for adding dates to amboxes (article message boxes).
- User:BracketBot – notifies users of mismatched brackets in recently edited articles.
- User:ClueBot NG – reverts vandalism.
- User:CorenSearchBot – checks for copyright violations on new pages.
- User:Cydebot – generally carries out tasks associated with deletion.
- User:DumbBOT – often removes protection templates from recently unprotected pages.
- User:ListeriaBot – an experimental bot by Magnus Manske that generates and updates lists on Wikipedia.
- User:Lowercase sigmabot – often adds protection templates to recently protected pages.
- User:Lowercase sigmabot III – archives talk pages.
- User:Mr.Z-bot – patrols biographies of living persons and the edit filters.
- User:ProcseeBot – automatically blocks open proxies, in line with the local policy against them.
- User:SineBot – signs comments left on talk pages.
- User:Yobot – syntax fixes and tagging.
- User:WP 1.0 bot – tracks article quality.
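All of these bots talk to Wikipedia through the MediaWiki Action API, which also lets anyone observe their work: edits made by registered bot accounts carry a bot flag in the recent-changes feed. As a small sketch, the snippet below builds the API query URL for listing recent bot edits (`list=recentchanges` and `rcshow=bot` are real API parameters); actually fetching and paginating the results is left out for brevity.

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def bot_edits_url(limit: int = 10) -> str:
    """Build an Action API URL listing recent edits flagged as bot edits."""
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcshow": "bot",  # only changes made with the bot flag set
        "rcprop": "title|user|comment|timestamp",
        "rclimit": str(limit),
        "format": "json",
    }
    return f"{API}?{urlencode(params)}"

print(bot_edits_url(5))
```

Pasting the printed URL into a browser returns a JSON list of recent bot edits, which is a quick way to see bots like those above in action.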
The way in which Wikipedia operates can seem labyrinthine; a trait which is somewhat unavoidable in large, open source communities like the Wikimedia projects. So perhaps it’s not surprising that journalists often write about the ‘murky and opaque’ underbelly of Wikipedia without capturing the full complexity of how the sites work. Once you get involved and see it from the inside, it’s usually not as complex or strange as it might seem at first glance.
Editing Wikipedia and its sister projects is great for anyone who wants to become more IT literate and understand how the complex series of relationships and data flows that we call the internet works. Curating the largest store of information and the biggest human project in the history of the world isn’t easy, and it takes a lot of people a lot of time to make Wikipedia as good as it is. We think it’s worth it, and it can bring a lot of personal development and satisfaction to the people who do it.
So why not join in? To get started, read our recent blog post on how to start editing Wikipedia.