AI, Democracy, and Truth: From WhatsApp Aunties to Synthetic Realities in the Digital Age
An exploration of our relationship with truth and trust in the information age
Yoruba aunties at the 2023 Ojude Oba festival. Piece created by the talented Mayowa Alabi.
The WhatsApp Auntie
Any child of African parents is acutely aware of the phenomenon we call "the WhatsApp auntie". The WhatsApp auntie, often a woman from your parents' generation (not necessarily biologically related), gathers and spreads (mis)information primarily via WhatsApp. Our focus lies not solely on the medium itself but rather on the content being disseminated. It ranges from lengthy text messages spreading propaganda about current affairs, like the conspiracy theory linking the spread of the coronavirus to 5G, to poorly edited videos showcasing Jesus ascending into heaven. These chain messages do their rounds on the app, with thousands of people forwarding them every minute. To those with higher levels of awareness, these WhatsApp messages are a laughable moment, allowing them to poke fun at the gullibility of others. However, to those who may not realize they are consuming and spreading misinformation, forwarding these messages represents a duty to warn others.
WhatsApp is not the only social media platform notorious for the spread of misinformation. X, Facebook, and Instagram are also rife with it. What sets WhatsApp apart, however, is the element of trust. Research suggests that sharing and forwarding messages via WhatsApp is an act of trust: people are less likely to verify the truth of a piece of content before resending it if it comes from someone in their contact book.
AI and Democracy
2024 is a landmark year for the number of countries going to the polls globally. At least 64 countries (plus the European Union) are scheduled to hold national elections, representing a whopping 49% of the global population. As is expected in election seasons, political propaganda is likely to run high, and AI has the potential to exacerbate the spread of the misinformation that has become commonplace during these periods.
First and foremost, deepfake technology holds the highest potential for disinformation. As AI's prevalence increases, it becomes easier and more affordable to create a deepfake video of any person, making them do and say whatever you desire. Cloning a voice used to cost approximately $10,000 and now costs only a few dollars. This provides an opportunity for both state and non-state actors to create images, audio, and videos painting political candidates in a certain light to skew their public perception. This holds especially true for countries with pluralist democracies, where rather than a single group holding power, interest groups compete to influence policy. These interest groups are self-selected by their members, meaning anyone has the autonomy to join an interest group based on its alignment with their values. AI-generated deepfakes could, in this way, become a tool for polarization, whereby opposing interest groups create synthetic media of leaders associated with rival groups in order to manipulate the masses.
Secondly, AI's reach in the space of politics goes far beyond synthetic media. Phishing attacks also pose a significant threat to election systems. Phishing as a practice has evolved with the advent of AI, as new tools allow for more widespread attacks on election systems with fewer resources. Large language models have streamlined the creation of phishing emails, drastically reducing the human involvement needed in the process. This efficiency enables hackers to deploy generative AI to send deceptive emails faster, on a larger scale, and without grammatical errors. These emails could contain content designed to alter public opinion, gather confidential data, or even manipulate voter records, all of which compromise electoral safety and, consequently, democracy.
Trust, Truth, and the Media
Election years are times when vulnerabilities run high. People are more receptive to information because they are either looking to confirm their pre-existing beliefs or seeking out new information on which to base their vote. The media thus plays a critical role in electoral outcomes within any democratic society: it serves as the voters' playing field of information. As times have changed, we have shifted from traditional media (newspapers, books, television, radio) to digital media (social media platforms like WhatsApp, X, and Instagram, as well as blogs and websites). With this shift has come an exponential increase in the speed and reach of forwarded content. Quality control, however, poses a challenge. Digital media empowers individuals to freely share opinions that gain rapid traction and influence decisions, diverging from the centralized control seen in traditional media.
Expanding on the connection between the WhatsApp auntie phenomenon and the evolving landscape of synthetic media, we can observe a shift in the challenges posed to our relationship with truth. As technology advances, distinguishing between genuine and manipulated content becomes increasingly difficult. While the laughable nature of WhatsApp aunties' content may currently amuse, the emergence of hyper-realistic synthetic media introduces a dangerous ambiguity. It increases the chances of our sharing content that we believe to be genuine, especially when it seems controversial. A great example is the video created by Sputnik in which Biden, Zelensky, and Putin "wrap up" 2023. Without the organization's explicit statement that the video is a parody, it is quite easy to be deceived into believing it is real. Even audio and audio-visual media, once believed nearly impossible to alter, are no longer so. What is the truth?
Image of Biden, Putin, and Zelensky from the Sputnik parody video
Truth is the foundation of trust in any form of relationship. In the context of political campaigns, if we cannot believe the content we see, how can we trust the system that produces it? When we lose trust in the system, we seek out information to bolster our distrust. This is how the WhatsApp auntie is created, and fewer of us are immune than we might think. Since the media plays such a central role in communication between the government and the people, eroding the public's trust in it could quickly lead to a slippery slope of political trust erosion on multiple fronts.
Regulation
An article I came across from Cyberscoop encapsulated my precise thoughts on the necessity of regulating socio-technologies in the political sphere. Specifically, the quote: "our systems of governance are not suited to our power level. They tend to be rights based, not permissions based. They're designed to be reactive, because traditionally there was only so much damage a single person could do". The double-edged sword of AI is that it empowers the individual. The role of regulation, therefore, is crucial. The analogy in my mind for the regulation needed here is that of a leaking tap and pipe. Rather than resembling a single leaking spout requiring attention in one spot, the problem more closely resembles a perforated pipe, demanding multiple interventions along its length for effective mitigation.
First lies the role of the private sector, i.e., the social media companies. Through Community Notes, X (formerly Twitter) has taken a great step towards tackling misinformation and disinformation by providing context to potentially deceptive posts. WhatsApp, too, has introduced the "forwarded" and "forwarded many times" tags which, albeit misunderstood by many users, are a step in the right direction, as they inform users that they are not consuming "original" content. To make regulation a continuous, iterative process, social media companies need to expand their trust and safety teams and increase vigilance. Second is the role the public sector plays in regulation. Regarding synthetic media, disclaimers can be broadcast on leaders' social media accounts. Beyond that, however, top-down regulation of the media is a delicate issue because of its potential to infringe on freedom of speech.
Media regulation and the preservation of truth must be treated with the necessary urgency given the number of global elections taking place this year. As daunting as our first election season in the age of AI may seem, it also brings with it the opportunity to provide the necessary push for synthetic media regulation that could live long beyond the election season itself.