Welcome

Welcome to the official publication of the St Andrews Foreign Affairs Society. Feel free to reach out to the editors at fareview@st-andrews.ac.uk.

Acknowledging the Terms and Conditions: The Weaponization of Data

In an era when you can ask your “Alexa” virtually anything, the question arises of who Alexa actually is and how her answers are generated. Most people spend hours a day browsing social media and interacting with thousands of profiles on the internet. Many of these profiles, however, are not in fact people but autonomous programs known as “bots,” which run on algorithms and interact with regular users. While bots were originally simple computer scripts written to complete mundane tasks, they can now operate at scale and harvest large amounts of information. Increasingly, bots and artificial intelligence systems employ big data and automation to spread propaganda and distort information across the internet.
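To make the idea concrete, an early bot of the “simple script” sort was often little more than a keyword-to-reply lookup. The sketch below is purely illustrative (the rules and wording are invented, and real bots wrap logic like this around a platform’s messaging API):

```python
# Minimal sketch of a rule-based "bot": a script that maps incoming
# messages to canned replies. All keywords and responses here are
# illustrative assumptions, not any real platform's code.

RULES = {
    "hello": "Hi there! How can I help?",
    "price": "Our prices are listed on our website.",
    "hours": "We are open 9am-5pm, Monday to Friday.",
}

def reply(message: str) -> str:
    """Return the canned reply for the first keyword found, else a fallback."""
    text = message.lower()
    for keyword, response in RULES.items():
        if keyword in text:
            return response
    return "Sorry, I didn't understand that."

print(reply("Hello, is anyone there?"))   # keyword match on "hello"
print(reply("What are your hours?"))      # keyword match on "hours"
```

Scripts this simple are easy to spot; the article’s point is that modern bots have moved far beyond canned lookups toward automated, human-sounding conversation.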

In 2010, a computer programmer named Peter Warden was forced to delete years of research in which he had harvested data from 210 million Facebook profiles. Facebook subsequently rolled out a series of updates to its application programming interface (API) that progressively restricted third-party applications’ access to its users’ data, prohibiting such access altogether by 2014. Programming languages such as R and Python, however, still offer tools built specifically to access APIs like Facebook’s and, through them, user data.
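The mechanics of scripted harvesting are worth seeing: such tools typically page through an API’s JSON responses, following “next” links until the data runs out. The sketch below simulates that pattern against a fake, in-memory “API” (the URLs and fields are invented) rather than any live service:

```python
# Sketch of paginated API harvesting, run against a fake in-memory
# "API" so it is self-contained. The endpoints and response shapes
# are invented for illustration; real scripts issue HTTP requests
# and are bound by a platform's terms of service and rate limits.

FAKE_PAGES = {
    "/profiles?page=1": {"data": ["alice", "bob"], "next": "/profiles?page=2"},
    "/profiles?page=2": {"data": ["carol"], "next": None},
}

def fetch(url: str) -> dict:
    """Stand-in for an HTTP GET that returns parsed JSON."""
    return FAKE_PAGES[url]

def harvest(start_url: str) -> list:
    """Follow 'next' links, accumulating records until no pages remain."""
    results, url = [], start_url
    while url is not None:
        page = fetch(url)
        results.extend(page["data"])
        url = page["next"]
    return results

print(harvest("/profiles?page=1"))  # ['alice', 'bob', 'carol']
```

A loop like this, pointed at a permissive API and left running, is essentially how a single researcher could accumulate millions of profiles.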

Bots have also been found to harvest big data in order to operate large numbers of fake social media accounts. Research by the Computational Propaganda Project has determined that bots harvest data and broadcast extremist viewpoints by interacting with real people, like us, over the internet.

In early 2018, news broke of the Facebook-Cambridge Analytica scandal, in which Cambridge Analytica, a company backed by the billionaire Robert Mercer, had harvested personal information to build a system that could profile US voters and target them with personalized political advertisements. Just as you can tell a great deal about someone from the way they dress and behave in public, much can be inferred from their data and interactions in virtual public spaces. In one of the largest breaches of data in Facebook’s history, Cambridge Analytica built software that could predict voters’ choices and, in turn, serve personalized advertisements to manipulate their thinking through interactions with “bots” on the internet. Cambridge Analytica harvested the data of 87 million Facebook users and has been accused of exploiting rhetoric promoting colonialism, racial bias, toxic masculinity, and fat-shaming to undermine and manipulate people. Artificial intelligence, in fact, has commonly been linked to perpetuating racial and gender biases internalized from the people who created its algorithms, which makes it all the harder for AI-based bot detection to reliably identify hate speech and the spread of damaging political propaganda.

This use of political attack ads was linked to Donald Trump’s campaign in the 2016 election, which used constituents’ data to psychologically profile voters and launch mass campaigns of targeted attack ads. The RAND Corporation defines actions like these as “influence operations”: “the collection of tactical information about an adversary as well as the dissemination of propaganda in pursuit of a competitive advantage over an opponent.” In the wake of the scandal, Russian interference was also uncovered, with bots used to spread propaganda across a variety of internet platforms. Partly because of Russia’s involvement, such operations are now widely seen as a staple of statecraft and a radical departure from what many Westerners understand as democracy. Many have called for bot bans and stronger measures to protect cyberspace from fake profiles that psychologically manipulate social media users.

Looking ahead to the 2020 election, the news is not good for our ability to think freely as voters. Bots are becoming more human-like and increasingly capable of disguising themselves among ordinary social media users. They continue to evolve, adopting human-like characteristics that let them engage more convincingly with the communities they target. Conversational language and the ability to coordinate messages make it increasingly difficult to identify bots on platforms such as Twitter and Facebook. Developers and hackers have refined bot algorithms to automate conversation with users on the internet, which, as Emilio Ferrara points out, “further corroborates the idea that there is an arms race between bots and detection algorithms.”
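The arms race between bots and detection algorithms can be made concrete with a toy heuristic. The features and thresholds below are illustrative assumptions only; real detection systems (such as the academic Botometer project) use far richer feature sets and trained classifiers:

```python
# Toy bot-detection heuristic: score an account on a few behavioural
# features. Features, weights, and thresholds are invented for
# illustration and do not reflect any real platform's detector.

def bot_score(posts_per_day: float,
              followers: int,
              following: int,
              duplicate_post_ratio: float) -> float:
    """Return a score in [0, 1]; higher suggests more bot-like behaviour."""
    points = 0
    if posts_per_day > 50:                   # inhumanly high posting rate
        points += 4
    if following > 10 * max(followers, 1):   # follows far more than it is followed
        points += 3
    if duplicate_post_ratio > 0.5:           # mostly repeats the same content
        points += 3
    return points / 10

# A hyperactive, repetitive account scores high...
print(bot_score(posts_per_day=200, followers=10, following=5000,
                duplicate_post_ratio=0.9))
# ...while a typical human account scores low.
print(bot_score(posts_per_day=3, followers=300, following=280,
                duplicate_post_ratio=0.05))
```

The arms-race dynamic follows directly: once detectors key on features like these, bot developers tune their accounts to post less often, balance follower ratios, and vary their wording, forcing detectors to find new signals.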

What does this mean for the future of democracy? In an era of ever-greater use of artificial intelligence, cybersecurity experts have become invaluable in government. Social bots still commonly spread fake news, which makes it more important than ever not to believe everything we read on the internet. Media is now deeply embedded in social life and in shaping public opinion; in the absence of privacy protections and regulation, it is vital to seek out credible sources when deciding whom to vote for. Though lying and deceit have always run through the political process, as educated voters we must keep protecting ourselves from the manipulation of our thoughts in an era of rapid technological advancement.

Cover Image source: https://en.wikipedia.org/wiki/Facebook#/media/File:Original-facebook.jpg

Nuclear weapons in Turkey: The ultimate bargaining chip

The Clothes On Our Backs And Their Impact
