
Keeping Truth Alive In A World Of Anonymous Wikipedia Edits

Writing in the Huffington Post, our own Meghan Sali looks into the implications of allowing everyone -- including government officials -- to make real-time edits to Wikipedia.

This piece by our Meghan Sali was originally published by the Huffington Post.

In a context where "fake news" is catching the attention of everyone from U.S. President Trump to German Chancellor Angela Merkel, the pursuit of truth, and who defines it, seems to matter more than ever. 

One frontline in the battle over facts is playing out in the public arena of Wikipedia, where history is catalogued in real-time and where each of us has the opportunity to act as a historian, contributing to editorial decisions.

But what happens when government officials take to the web to edit this public resource? And what are the implications of allowing elected officials and bureaucrats to shape the narrative -- often without the knowledge of the public?

Although university professors may disagree, Wikipedia is viewed by many as a legitimate source of information -- it hosts more than 5.3 million articles and adds around 800 new ones every day. Wikipedia operates on the basis that anyone can edit its pages -- even unregistered users -- and edits happen on a mass scale. At a rate of more than 10 edits per second, it's hard to even grasp what that pace means for a public trying to keep truth on the table.

Wikipedia's model is a positive example of the power of crowdsourcing -- a place where, with proper sourcing, anyone can provide additional clarity or add new information to articles. It's what gives this medium a real advantage over legacy encyclopedias -- Wikipedia can be edited and updated on the fly, and changes with the changing world around us. 

Of course, this can be alarming when it comes to individuals or groups editing pages in which they have a personal, professional, or political interest -- including members of government departments as part of re-information campaigns.

It's enough of a concern that efforts to catalogue and publicize potentially controversial edits have arisen spontaneously. The best examples are Twitter bots that create a public inventory of anonymous edits from known government IP addresses -- since 2014, bots for the Government of Canada, UK Parliament, and U.S. Congress have appeared.
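For the technically curious, the detection step behind bots like these is simple enough to sketch. The following Python snippet is a minimal illustration of the general approach, not the code of any actual bot: it polls Wikipedia's public recent-changes API for anonymous edits (where the "user" field is the editor's IP address) and flags any that fall within a watched network range. It assumes the third-party requests library, and the IP ranges shown are documentation placeholders, not real government address blocks.

```python
import ipaddress
import requests

# Placeholder networks only -- a real monitor would use published
# government IP ranges here.
WATCHED_RANGES = [
    ipaddress.ip_network("192.0.2.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

API = "https://en.wikipedia.org/w/api.php"

def recent_anonymous_edits(limit=50):
    """Fetch recent anonymous edits; for unregistered editors, the
    API reports the editor's IP address in the 'user' field."""
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcshow": "anon",  # only edits by unregistered users
        "rcprop": "user|title|timestamp|comment",
        "rclimit": limit,
        "format": "json",
    }
    resp = requests.get(API, params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()["query"]["recentchanges"]

def flag_watched_edits(edits):
    """Yield edits whose source IP falls inside a watched range."""
    for edit in edits:
        try:
            ip = ipaddress.ip_address(edit["user"])
        except ValueError:
            continue  # 'user' wasn't an IP address; skip it
        if any(ip in net for net in WATCHED_RANGES):
            yield edit

if __name__ == "__main__":
    for edit in flag_watched_edits(recent_anonymous_edits()):
        print(f'{edit["timestamp"]} {edit["user"]} edited "{edit["title"]}"')
```

A production bot would run a loop like this on a timer and post matches publicly, but the heart of the technique is just an IP-range membership test against a list of known address blocks.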

Examples range from the perplexing -- recently, an anonymous edit was made from a Canadian Department of National Defence IP address on the article listing Pepsi variations -- to the concerning. One such example was highlighted by The Tyee journalist Jeremy Nuttall, who showed that someone at a government IP address had anonymously edited the page about the political magazine Blacklock's Reporter, which is currently in a legal dispute with the Canadian Department of Finance.

"How do we ensure that our digital book of knowledge maintains its standards for accuracy and neutrality?"

The Internet community isn't the only one to have noticed: Wikipedia itself has taken action to combat problematic editing -- at one point briefly blocking U.S. Congress IP addresses from making edits, implementing clear policies around "conflict of interest" editing, and regularly banning accounts that violate the rules.

When so many people view Wikipedia as a public record, the threat is obvious, especially when it comes to governments looking to "correct" that public record. And although we may have Twitter bots to shine a light on edits that happen from known government addresses, there is nothing to stop those same staffers from going home to their computers and typing up a storm. 

So how do we ensure that our digital book of knowledge maintains its standards for accuracy and neutrality? Like the evolving conversation on fake news, the answer isn't simple, and involves a lot of individual vigilance to keep us honest. 

Wikipedia is already home to thousands of admins and hundreds of thousands of active users. Admins and editors are not paid -- a policy intended to keep money from changing hands in exchange for favourable or biased articles. This team catches the majority of obviously problematic edits, and there are tools that let individuals flag bias or improperly sourced content. In this way, part of the solution comes down to trust and community.

But because we know that this process can be abused, it's critical that there be enforceable internal policy barring staffers, bureaucrats, and government officials from making edits to Wikipedia pages in which they have a clear conflict of interest.

This move would benefit both the public and the government, as the mere appearance of impropriety can undermine trust in our democratic institutions. This way, official communication on issues of public interest happens out in the open, where debate and discussion are welcome, and individuals can be assured the conversation isn't being steered by an invisible hand.

The final piece of the puzzle is good, old-fashioned sunlight. Initiatives like Twitter bots and investigative journalism fill the gap where good faith fails -- opening the door for citizens to be critical about actions taken by their governments.

