How did Trump win?

While there are many reasons behind his win, he was helped by thousands of automated social media accounts, known as bots, organised into networks called botnets. Many of these pro-Trump accounts were originally built for a different mission: they were once pro-Putin bots (Schreckinger et al. 2016).

By late 2015, many pro-Putin bot accounts had begun aggressively supporting Donald Trump. On 6 January 2017, a report by the US Director of National Intelligence, “Background to ‘Assessing Russian Activities and Intentions in Recent US Elections’: The Analytic Process and Cyber Incident Attribution”, concluded:

We assess Russian President Vladimir Putin ordered an influence campaign in 2016 aimed at the US presidential election. Russia’s goals were to undermine public faith in the US democratic process, denigrate Secretary Clinton, and harm her electability and potential presidency. We further assess Putin and the Russian Government developed a clear preference for President-elect Trump. We have high confidence in these judgments.

To understand what is happening with fake news, we need to trace the bots back to their origins: origins that lie in Russia, with the strategic doctrine of Maskirovka.

As Major General Alexander Vladimirov, vice-president of Russia’s Collegium of Military Experts, puts it: “When he [Man] began hunting, he had to paint himself different colours to avoid being eaten by a tiger. From that point on Maskirovka was a part of his life. All human history can be portrayed as the history of deception.” (Ash 2015). Maskirovka has been a Russian military strategy since at least 1380, when it was used at the Battle of Kulikovo, and later in the Jassy-Kishinev Offensive and Operation Bagration. More recently, Putin used Maskirovka as a tool in Crimea, initially denying that the troops occupying the region were Russian. In eastern Ukraine, Putin continues to deny Russian support for the rebels. Since at least 2011, Maskirovka has moved into the digital realm, where it remains a successful strategy.

Botnets, through the creation of fake social media accounts, can make unpopular positions look like they enjoy strong support, and can hinder social movements by sowing confusion. For $5, one can buy 4,000 bot followers for a Twitter account through follower-selling services, making one’s positions seem more popular than they are in reality (Bilton 2014). Bots can also make a product or issue seem popular by generating fake interest through fake profiles (Goldman 2013).

Through botnet Maskirovka, Russian botnets produce uncertainty and impede organisation by anti-government forces. During the 2011 protests, Russian government botnets drowned out the dissident hashtag #триумфальная (Triumfalnaya), limiting mobilisation around the term; the bots flooded the hashtag with anti-protestor and pro-Putin sentiment (Krebs 2011). In the same election cycle, botnets created the illusion of mass support behind the pro-Putin Nashi youth movement (Fredheim 2014). More recently, pro-Putin bots filled the comment sections of The Guardian with pro-Putin posts, creating the illusion of large-scale dissent from the paper’s stated positions (Himler 2014). Nor was this limited to The Guardian: botnets such as the euphemistically named, St Petersburg-based Internet Research Agency spread pro-Putin propaganda in the comment sections of The Blaze, Fox News, Huffington Post, Politico, and WorldNetDaily (Seddon 2014). Accused of bias by the bots, media sites attacked by Russian botnets lose credibility and the ability to unify public opinion. Using botnets, Putin continues to employ Maskirovka in pursuit of his strategic objectives.

His latest strategic objective?

A friendly US president. Already, Trump has promised closer relations, a lifting of the embargo, and recognition of Crimea as part of Russia.

Botnets and the Maskirovka doctrine have succeeded spectacularly.
