Civil Society: Gone to the Bots?

Civil society is defined by the London School of Economics as:

“Civil Society refers to the arena of uncoerced collective action around shared interests, purposes and values. In theory, its institutional forms are distinct from those of the state, family, and market, though in practice, the boundaries between state, civil society, family and market are often complex, blurred and negotiated. Civil society commonly embraces a diversity of spaces, actors and institutional forms, varying in their degree of formality, autonomy and power. Civil societies are often populated by organizations such as registered charities, development non-governmental organizations, community groups, women’s organizations, faith-based organizations, professional associations, trade unions, self-help groups, social movements, business associations, coalitions and advocacy groups”

I’d like to put the emphasis on “uncoerced collective action” in that definition, because I’d like to talk about how civil society might be challenged or at least threatened in an era where we organize using digital communication technologies.

[Image: Two Dalek robots in a museum. Meet your new Robot President! "Rise and Evolution of the Daleks" by Nelo Hotsuma, via Flickr: https://flic.kr/p/nY9z4c]

Today, The Hill reported that Twitter has revealed to US lawmakers that Russian bots retweeted Trump almost 50,000 times during the final months of the 2016 US election. Of course, this one fact doesn't tell the complete story, and some people feel it may even be a bit of a red herring. As Philip Mai from the Social Media Lab succinctly pointed out, it hardly matters if bots retweeted Trump 50,000 times if nobody was following or retweeting the bots. On the other hand, it is well known in both academic and PR circles that astroturfing, or creating fake grassroots movements online, can be an effective strategy for influencing public opinion as well as policy, and bots could add a further layer of complication to this problem.

Astroturfing occurs when fake online profiles or groups are set up to create the illusion of widespread support for a candidate, issue, or policy. Some studies have shown this kind of action to be successful in sowing, at the very least, uncertainty about an issue. For this purpose, bots could indeed be a very effective tool. Take Trump's tweets as an example: even if the bots retweeting Trump were not followed by anyone else, the fact that a Trump tweet on a particular policy position or issue received a (potentially) much higher number of retweets than a tweet by his rival, as a result of bot activity, is important. Studies have suggested that retweet counts can be considered one indication of a tweet's credibility. In this case, it doesn't matter whether or not bots directly influenced anyone. Instead, the number of retweets alone is what could be influential: it signals consensus or widespread agreement where it does not necessarily exist.

So this is where I return to civil society. If bots are used in astroturfing-like activity, then those being misled into thinking there is widespread support for a policy, issue, or public figure are, in a sense, being coerced into their collective action, or are at the very least acting on misleading information. If civil society has "gone to the bots," it is thus potentially compromised, even in situations where nobody listens to the bots directly. It's not clear cut, and it's certainly not all doom and gloom, but those of us interested in genuine public engagement would do well to remain skeptical about the online engagement we see, since it can be engineered in many different ways.

