This issue of Robotrolling examines users suspended by Twitter. Contrary to expectation, most of the suspended accounts were human-controlled rather than bots. Since 2017, the speed at which Twitter suspends misbehaving users has, by two measures, almost doubled. However, Russian-language accounts have been removed considerably more slowly than English-language ones.
The speed of removal can be critical, for instance in the context of an election. The Latvian elections, conducted on 6 October 2018, passed with remarkably little Russian-language activity about the NATO presence in the country.
Our analyses show a shift over the past year away from automated manipulation and towards humans operating fake or disposable identities online. The figures published in this issue reflect the good work done to counter bots, but show that much work remains to tackle manipulation through fake human-controlled accounts.
Bots created 46% of Russian-language messaging about the NATO presence in the Baltics and Poland. More than 50% of Russian-language messaging about Estonia this quarter came from automated accounts.
Anonymous human-operated accounts posted 46% of all English-language messages about Poland, compared to 29% for the Baltic States. This discrepancy is both anomalous and persistent. Some of the messaging is probably artificial.
We continue to publish measures of fake social activity in the hope that quantifying the problem will focus minds on solving it.