Can bots influence elections with ‘megaphone effect’?

With speculation mounting about an election in Australia later this year, a new study has found Twitter bots can amplify tweets from candidates. But is anyone listening?

Dr Olga Boichak’s new research on computational propaganda – political bots manipulating public opinion on social media – has found bots are used to retweet candidates’ tweets over and over again to amplify the message. But this strategy does not always result in more people listening.


Key findings:

  • In the 2016 US sample, most of Donald Trump’s tweets and some of Hillary Clinton’s tweets were amplified using duplicate retweeting.
  • In the German sample, two leading politicians whose parties won seats in the German Bundestag, Alice Weidel and Sahra Wagenknecht, had their tweets amplified by duplicate retweeters.
  • In the US election, social bots were not only amplifying but also spreading candidates’ messages, helping them reach new audiences. This suggests some human users were following bots – a pattern another study has confirmed.
  • We did not find this to be true in the German election, as most social bots had zero followers.


“It’s a simple, heavy-handed amplification technique to create the ‘megaphone effect’,” said Dr Olga Boichak, a digital cultures expert in the Department of Media and Communications at the University of Sydney.

The study found this amplification-by-retweeting method had some influence in the 2016 US election, where bot accounts had very high follower counts. This could be because some bots were following each other, creating a botnet – something that looks like a network of human users.

Or it could be that American Twitter users believed they were following a human instead of a bot and were more likely to trust it and start following the candidate.

A third scenario: some bots are useful information aggregators – as shown in the researchers’ 2018 study – so human users have an incentive to follow them for the information they collect.

In the 2017 German election, however, the bots had far fewer followers, or none at all, so while they were capable of retweeting, fewer people saw the tweets.

Summing up the findings, Dr Boichak said: “Although social bots increase the scale of retweets, they rarely spread information to new audiences.”

The study, published in the International Journal of Communication, focused on patterns of bot behaviour on Twitter.


What is a bot?

A bot is an automated or semi-automated program that can mimic a human user. Some bots are helpful, such as weather updates or sports results. Others are malicious and can be used to spread disinformation or attempt to influence elections.

The researchers compared patterns of orchestrated (non-organic) tweet behaviour across two national elections: the 2016 Presidential Election in the United States, and the 2017 German federal election. “There have been claims of bot involvement in both of these elections, so we collected data to see what this involvement looked like. Were there bots involved in the German election, as well? (spoiler: they were). Yet, to what ends?” Dr Boichak said.

“Previous research has mostly looked at the scale of retweet events – the so-called ‘megaphone effect’. We know that bots can increase the prominence of a candidate’s message by retweeting it many times,” Dr Boichak said.

“But can social bots make a message go viral? Can they quickly spread the message to new audiences? In most cases, the answer is ‘no’,” she said. “However, we do know that once bots are followed and retweeted by humans, they can occupy more central positions within networks. This could allow them to distribute candidates’ messages among new audiences.”
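
The study itself does not include code, but the idea of a bot ‘occupying a more central position’ in a retweet network can be pictured with a short Python sketch. The accounts, edges and the choice of betweenness centrality below are illustrative assumptions, not the paper’s method.

# Illustrative sketch (not the study's code): accounts are nodes and an
# edge A -> B means account A retweeted account B. All data is made up.
import networkx as nx

retweets = [
    ("bot_1", "candidate"), ("bot_2", "candidate"), ("bot_3", "candidate"),
    ("human_1", "bot_1"),      # a human retweeting a bot pulls the bot inward
    ("human_2", "human_1"),
    ("human_3", "human_2"),
]

G = nx.DiGraph()
G.add_edges_from(retweets)

# Betweenness centrality measures how often an account sits on the shortest
# paths between other accounts; bots that humans follow and retweet score
# higher, which is one way they could pass messages on to new audiences.
for account, score in sorted(nx.betweenness_centrality(G).items(),
                             key=lambda item: -item[1]):
    print(f"{account:10s} {score:.3f}")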

Bots amplify information. Image: Pixabay

How a tweet gets noticed

Dr Boichak explained that after a political candidate posts a tweet, it may go unnoticed and get buried in the information flow, or it may get retweeted and thus become visible to new audiences. When this distribution happens, candidates often gain new followers.

“So, we asked: if a tweet gets retweeted only by bots, which usually have few or no followers, would it be visible to new audiences? We show that it’s not always the case,” Dr Boichak said.

“This way, bots only amplify information – picture a political candidate yelling into a megaphone at a rally. But this ‘yelling’ isn’t as impactful when the stadium is empty. It is more effective when new audiences pick it up and spread it among their networks – this is diffusion.”

The researchers found that in both the US and German elections, bots were involved in duplicate retweeting: when a bot retweets the same tweet by a candidate many times.
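
The paper does not spell out its detection code, but flagging duplicate retweeting in a log of retweets could, in rough outline, look like the Python sketch below; the record format and account names are assumptions for illustration.

# Illustrative sketch (not the study's pipeline): flag accounts that
# retweet the same tweet more than once. The records are made up.
from collections import Counter

retweet_log = [
    ("acct_a", "tweet_1"), ("acct_a", "tweet_1"), ("acct_a", "tweet_1"),
    ("acct_b", "tweet_1"),
    ("acct_c", "tweet_2"), ("acct_c", "tweet_2"),
]

counts = Counter(retweet_log)  # occurrences of each (account, tweet) pair

# Human users rarely retweet the same tweet repeatedly, so pairs with a
# count above 1 are candidates for automated duplicate retweeting.
duplicates = {pair: n for pair, n in counts.items() if n > 1}
print(duplicates)  # {('acct_a', 'tweet_1'): 3, ('acct_c', 'tweet_2'): 2}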

“We noticed that was a pattern in both elections – it doesn’t seem like humans would do it on such a scale, but with bots – one would need fewer bots to get the job done,” Dr Boichak said.

“This strategy was part of ‘manufacturing consensus’ – making it seem that the public approved of a candidate’s messages.”

“This disconnect between scale and range in information events is a crucial indicator of orchestrated (by algorithmic or otherwise inauthentic agents) activity.”
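
Read in data terms, that indicator can be approximated as the gap between a tweet’s scale (total retweets, duplicates included) and its range (distinct accounts doing the retweeting). The sketch below is an illustration under that assumption, not the measure used in the paper.

# Illustrative sketch (not the study's measure): compare scale (all
# retweets) with range (distinct retweeters) for each tweet. Made-up data.
from collections import defaultdict

retweet_log = [
    ("tweet_1", "bot_1"), ("tweet_1", "bot_1"), ("tweet_1", "bot_1"),
    ("tweet_1", "bot_2"), ("tweet_1", "bot_2"),
    ("tweet_2", "human_1"), ("tweet_2", "human_2"), ("tweet_2", "human_3"),
]

retweeters = defaultdict(list)
for tweet_id, account in retweet_log:
    retweeters[tweet_id].append(account)

for tweet_id, accounts in retweeters.items():
    scale = len(accounts)        # every retweet, duplicates included
    reach = len(set(accounts))   # distinct accounts doing the retweeting
    print(f"{tweet_id}: scale={scale}, range={reach}, ratio={scale / reach:.1f}")

# tweet_1 looks five times as loud but comes from only two accounts
# (ratio 2.5); tweet_2's three retweets come from three distinct accounts
# (ratio 1.0), a pattern more consistent with organic spread.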


Dr Olga Boichak is a sociologist of digital media and a Lecturer in Digital Cultures at the University of Sydney.

Want to learn more about Olga’s research? Read “Not the Bots You Are Looking For: Patterns and Effects of Orchestrated Interventions in the U.S. and German Elections”.


Declaration: This research received no funding and there is no conflict of interest to report.
