Wikipedia Bots Argue With Each Other For Years Before Settling On A Subject



Wikipedia bots do a lot of work on our favourite and mostly reliable information source. Computer scientists from the University of Oxford and the Alan Turing Institute in the United Kingdom have analysed how both humans and bots interact on the online encyclopaedia.

Volunteers and bots alike work on Wikipedia to keep it as accurate as possible. But like humans, Wikipedia bots may go back and forth, undoing each other’s edits for years before they settle on a version.

As you may or may not know, most of the internet’s traffic isn’t generated by humans. According to the security firm Imperva, bots account for about 52 percent of all web traffic on average, including spamming comments, brute-forcing passwords, and injecting malicious content such as malware. But, of course, bots aren’t all bad, since they do a lot of tedious jobs that we humans don’t really want to do.

Wikipedia bots are no different, and the website couldn’t work without them. They edit millions of pages annually and handle tedious jobs like formatting sources and adding links. Some can even start new pages with minimal content – known as stubs – which can get a conversation going.

Thomas Steiner from Google Germany monitored Wikipedia bot activity across all 287 language versions. In 2014, he observed that about half of the edits were made by humans and the other half by bots. But there are stark differences between individual languages.

For instance, only about 5% of edits to English-language articles are made by bots, whereas on the Vietnamese Wikipedia, bots accounted for 94% of all activity. Despite this discrepancy, Wikipedia remains quite reliable, with an accuracy of about 95% – better than many textbooks.

Professor Taha Yasseri from the Oxford Internet Institute, alongside some of his colleagues, wanted to see how these Wikipedia bots interact with each other, given that their tasks overlap on occasion. The team analysed data across 13 language editions over a period spanning from 2001 to 2010, and observed that bots oftentimes interacted with each other with unpredictable consequences.

To their surprise, the team realised that the Wikipedia bots behaved more like humans, depending on the cultural context. For instance, bots on the German edition had the fewest conflicts with each other, undoing one another’s edits about 24 times on average. The Portuguese bots, on the other hand, had 185 such conflicts, and the English ones some 105. And this undoing could rage on for years, no less.
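The counts above come from detecting reverts in article edit histories. As a rough illustration (not the study's actual pipeline – the data layout, names, and revert test here are all assumptions for the sketch), one could treat an edit as a revert when it restores the article to the state it had two revisions earlier, and tally how often one bot undoes another:

```python
from collections import Counter

def count_bot_reverts(edits, bots):
    """Count reverts in which one bot undoes another bot's edit.

    `edits` is a chronological list of (editor, content_hash) pairs.
    An edit counts as a revert when it restores the content the page
    had two revisions earlier, i.e. it undoes the previous edit.
    This is a simplified heuristic, not the published methodology.
    """
    conflicts = Counter()
    for i in range(2, len(edits)):
        editor, content = edits[i]
        prev_editor, _ = edits[i - 1]
        if (content == edits[i - 2][1]      # page restored to earlier state
                and editor in bots
                and prev_editor in bots
                and editor != prev_editor): # bot undoing a *different* bot
            conflicts[(editor, prev_editor)] += 1
    return conflicts

# Toy history: two hypothetical bots undo each other once apiece
history = [
    ("BotA", "v1"),
    ("BotB", "v2"),
    ("BotA", "v1"),  # BotA reverts BotB
    ("BotB", "v2"),  # BotB reverts BotA
]
print(count_bot_reverts(history, {"BotA", "BotB"}))
# → Counter({('BotA', 'BotB'): 1, ('BotB', 'BotA'): 1})
```

Run over a decade of revisions per language edition, a tally like this is how per-language conflict averages such as the 24 (German) versus 185 (Portuguese) figures could be derived.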

“We find that bots behave differently in different cultural environments and their conflicts are also very different to the ones between human editors. This has implications not only for how we design artificial agents but also for how we study them. We need more research into the sociology of bots,” said Dr Milena Tsvetkova, from the Oxford Internet Institute.

It is interesting to note how these bots took on the cultural characteristics of the communities that designed them. The findings were published in the journal PLOS ONE and serve as a warning for future developers working on bots for cyber security, social media management, and similar jobs.

“The findings show that even the same technology leads to different outcomes depending on the cultural environment. An automated vehicle will drive differently on a German autobahn to how it will through the Tuscan hills of Italy. Similarly, the local online infrastructure that bots inhabit will have some bearing on how they behave and their performance.

Bots are designed by humans from different countries so when they encounter one another, this can lead to online clashes. We see differences in the technology used in the different Wikipedia language editions and the different cultures of the communities of Wikipedia editors involved create complicated interactions. This complexity is a fundamental feature that needs to be considered in any conversation related to automation and artificial intelligence,” Yasseri said.
