When networks’ discourse, whether in the marketplace or online, runs counter to elites’ interests, a backlash can be expected. From the elites’ perspective, networks that pose a risk need to be fragmented, to forestall threatening citizen organisation.
In responding to online networks, some accountability-wary states have opted for heavy-handed measures: the ongoing internet shutdown in Kashmir, or the blocking of internet access before the recent Ugandan elections. Others have pursued subtler methods. Twitter, often looked to for mobilisation and relatively transparent to researchers, offers a useful window into the tug-of-war between activists and state agents.
A 2017 report by the Oxford Internet Institute (OII), Troops, Trolls and Troublemakers, found organised social media manipulation campaigns operating in 28 countries.
The OII’s 2020 follow-up announced the proliferation of these practices in its title: Industrialised Disinformation. 81 countries, it found, now engage in computational propaganda, and the figure is rising steadily year on year.
This takes different forms. One common tactic is the use of bots: swarms of algorithm-driven social media accounts posting pro-government talking points or attacking opposition voices. Because Twitter can identify these relatively easily, however, they have become less effective. Like a virus, disinformation actors have evolved, and ‘cyborgs’ have become increasingly prevalent: groups of bot accounts controlled by humans, who take the reins when the bots spark (usually angry) interactions. In Mexico, these came to be known as ‘Peñabots’ during the 2012 election of Enrique Peña Nieto. Despite requests from civil society organisations for an investigation into the possible use of government funds in running these programmes, they remain noticeably active.
Bot or cyborg brigades aren’t always run directly by states. Some states use in-house propaganda units, but increasingly these activities are sub-contracted to communications firms, or even to freelancers: for students, for instance, next month’s rent may take priority over keeping their online conduct ethical.
For a point to be expressed audibly on social media, a critical mass of unified ‘voices’ needs to be reached. At that point, the idea starts to appear on the feeds of those not initially interested, reaching them through the weak ties that connect the online communities originally pushing the idea with those more peripheral to them. Beyond a certain point, this becomes self-perpetuating: when people feel part of a larger movement, they are more likely to remain involved, and those not yet involved are more likely to join. That requires voices to stay raised for long enough, reinforcing the solidarity of protesting individuals and keeping the community tightly networked.
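The dynamic described above can be sketched in a few lines of code. This is a toy model with invented data, not a description of any real platform: two tight-knit communities joined by a single weak tie, with an idea spreading to anyone who has at least one adopting neighbour.

```python
# A minimal contagion sketch (hypothetical data): an idea saturates its home
# community A, then crosses the single weak tie (a3-b0) into community B.
GRAPH = {
    "a0": ["a1", "a2", "a3"], "a1": ["a0", "a2", "a3"],
    "a2": ["a0", "a1", "a3"], "a3": ["a0", "a1", "a2", "b0"],  # weak tie
    "b0": ["b1", "b2", "a3"], "b1": ["b0", "b2"], "b2": ["b0", "b1"],
}

def spread(seeds, rounds=5):
    """Each round, anyone with at least one adopting neighbour adopts."""
    adopted = set(seeds)
    for _ in range(rounds):
        adopted |= {n for n, nbrs in GRAPH.items()
                    if any(m in adopted for m in nbrs)}
    return adopted

# Starting from one voice in A, the idea eventually reaches everyone in B.
print(sorted(spread({"a0"})))  # → ['a0', 'a1', 'a2', 'a3', 'b0', 'b1', 'b2']
```

After one round the idea is still confined to community A; only once A is saturated does the weak tie carry it into B, which is the intuition behind needing a critical mass before peripheral audiences are reached.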
Troll networks disrupt this process in three ways. Firstly, they create competing trends: as members of centrally-coordinated networks, they can post content strategically, pushing subjects to greater prominence through Twitter’s trending algorithms. In Mexico, already one of the most dangerous countries in the world in which to be a journalist, these artificially-accelerated trends have in the past slandered free media actors, or threatened independent journalists with death.
Secondly, online troll mobilisation has real-world consequences: journalists barraged by tweets or accusations have less time to conduct research or talk with others interested in accountability. Trolls may also force journalists out of the game altogether, exhausting them with abuse and threats.
Thirdly, trolls sap the energy from protest networks. Each fruitless engagement with a troll is an engagement that didn’t take place with a fellow protesting voice, and an opportunity for growing dissenting solidarity that was missed.
Candles at a vigil for Javier Valdez Cárdenas, a journalist reporting on organised crime who was murdered in 2017. Image: Creative Commons.
It’s very hard to combat these approaches. Those campaigning for change are often those who are already marginalised, and activists can’t afford their own big-budget communications firms.
Civil society activists have a further disadvantage. Those willing to use bots and trolls typically don’t hold much regard for the truth. Pro-accountability actors, however, have to be truthful if they are to have any success. False information, unfortunately, spreads far faster than the truth: six times faster, according to an MIT study. It can be tailored to the audience’s interest, and having been boosted by networks of fake accounts online, can rush far and fast into networks of real people. The often-used tool of fact-checking clumps leaden-footedly in disinformation’s wake.
In combating coordinated disinformation within states, it is therefore key not simply to react, but to lead the discourse. In Mexico, this approach is exemplified by the work of Alberto Escorcia, a coordinator of online opposition voices. He monitors both troll traffic and citizens’ voices, unpicking what is on whose minds.
Analysis of social networks' interactions can allow activists to spot windows of opportunity forming from a distance. Image by Martin Grandjean/Creative Commons.
Competing with cyborg and bot actors, which can shape trends by generating thousands of posts a day, requires considerable nous. Escorcia and other Mexican activists have learned not to waste valuable time and energy by over-engaging with bot talking points. Instead, they take note of genuine discourse surfacing in actual conversations, and tailor their message to what they know is already sparking interest. Sometimes this allows considerable advance warning: talk of ‘police violence’ or ‘mayoral corruption’ may begin bubbling some time before it coalesces into potential on- and offline action.
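The early-warning idea here can be illustrated with a simple sketch. The mention counts and threshold below are entirely invented, and this is not Escorcia’s actual method: it just shows the logic of flagging a phrase whose recent mentions are accelerating against its baseline, before it becomes a trend.

```python
# Hypothetical early-warning check: is a topic "bubbling" in genuine
# conversation, i.e. growing fast against its own recent baseline?
def is_bubbling(daily_counts, ratio=1.5):
    """Flag a topic whose last three days outpace its earlier average."""
    baseline = sum(daily_counts[:-3]) / max(len(daily_counts) - 3, 1)
    recent = sum(daily_counts[-3:]) / 3
    return recent > ratio * max(baseline, 1)

# Invented daily mention counts for a phrase like 'police violence'.
mentions = [4, 5, 3, 6, 4, 5, 7, 12, 19, 31]
print(is_bubbling(mentions))  # → True
```

A flat series of counts would return False; the point is that acceleration, not volume, is the signal, which is why a human-run account can spot a window of opportunity days before the subject trends.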
Realising what people care about at a particular moment, and creating activist messages empathetically, helps Escorcia and colleagues to cut through the drone of troll army distractions. Savvy use of network analysis can give activists an unprecedented ability to spot a possible ‘window of opportunity’ from a distance, and to prepare themselves to lead the narrative once it arrives. Other actors, such as Operation Libero in Switzerland, have used similar approaches across multiple media. It is not enough to react; beating state-mobilised trolls requires proactive messaging which can reinforce discussion networks before paid voices can drown them out.
Of course, states’ use of bot armies still tilts the playing field unfairly. This is hugely important: for accountability to exist and be effective, there needs to be genuine citizen contestation. Online muffling of organised citizen voice can only harm the legitimacy of the state. In the long term, better monitoring of online action by network providers like Twitter, and as great a reduction in political bot use as possible, is crucial. (Transparency and stakeholder participation within this process will also be key.) In the shorter term, online activists everywhere can learn a lot from Escorcia.
This is one of a series of blogs supported by the IDS alumni office and written by current IDS students and PhD researchers during the Spring Term of academic year 2020-2021.