Marko Mazepa: UC Essay Competition Winner 2024

May 15, 2024 Marko Mazepa

Read the essay of AUBG student Marko Mazepa who won first prize in this year’s University Council Competition. The topic for 2024 was “Disinformation Frontier: The Responsibility of Individuals”.

Introduction

Do you know what genuinely makes me anxious? The chance that I passionately believe something false right now, because chances are that I do and I don’t even know it. What if I have already defended a piece of disinformation? What if I have convinced others to believe it? Or worse, what if I have caused serious harm to others with it? Discovering false beliefs that I once held as unequivocally true shakes me to the very core. And so, I strive to take anything I watch, hear, or read with a grain of salt.

Are you anxious about disinformation influencing your beliefs, too? Whether we are aware of it or not, most of us have probably helped spread, or even craft, disinformation in our communities. This disinformation could slightly skew our beliefs about democracy, culture, or history. Or it could cascade into uninformed decisions based on a skewed representation of reality.

So, what if each of us took responsibility for curbing disinformation? Then we would fact-check our beliefs and address our anxieties, so that our critical thinking prevails over comforting lies. By doing so, we would ensure more informed decisions in politics, business, and personal life.

What are disinformation and misinformation?

To clear up the definition of disinformation, consider this hypothetical: ‘BREAKING NEWS. An information leakage exposed Governor John Doe’s contact with extraterrestrial creatures. The leaked document outlined a top-secret meeting of the Governor with a group of alien diplomats aboard a flying saucer. Soon after the meeting, he concealed his identity and fled from the authorities. He was last detected during his vain attempt to access the President’s nuclear codes remotely. His whereabouts remain unknown to this day. However, one statement is undeniable: John Doe is a national threat on the run.’

Would any of us believe this news to be true? Likely not, because we possess the critical thinking skills to put facts before fiction, whether the fiction involves aliens, top-secret files, or nuclear codes.

Yet, if this hypothetical spreads from a sketchy news website to other media, would we believe it? Perhaps, a TV channel picks up on it and calls in an expert to comment on the imminent alien invasion. An influencer recreates it as a short video and it goes viral. Or Governor John Doe reads it and announces that he would rather work with aliens than the current administration. Boom – and it’s all over the media. Now even our neighbors might advise us to withdraw money from the bank to invest in a doomsday bunker before the aliens arrive. For a brief moment, we might consider that some part of all this is now so mainstream that it should be true.

Sounds absurd, right? “One column, two sentences, a headline. Then it all vanishes in mid-air.” In Fahrenheit 451, Ray Bradbury had already predicted the light speed at which such absurdities could spread through the media.[1] The speed at which false information reaches millions of people in mere seconds. Effectively, the speed at which we become unable to consistently distinguish a truth from a lie.

Yes, I did use the word lie, because any lie implies intention: a choice to deceive somebody else. In the hypothetical, however, it is unclear whether the TV channel, the influencer, and the Governor intend to spread false information. If these media outlets spread it intentionally, they would be considered sources of disinformation: false information disseminated intentionally to harm people, institutions, and interests. If this spread was unintentional, the media would be simply disinformed by the initial source. In other words, they would engage in misinformation: false information, but not created with the intention of causing harm.[2] Hence, there is a conceptual difference between misinformed individuals and disinformation agents.

The extent of disinformation in the media

Now, hold on: would any reputable media outlet misinform its audience with sensational news about aliens and flying saucers? Surprisingly, yes, to the extent that its journalists prioritize quantity over quality in news reporting. And don’t get me wrong: journalists should be professionals who share a common interest in providing fact-checked news to their readers. But they also face competitive pressure for readers’ attention in a saturated market where sensations sell well irrespective of their source.[3] In the flurry of this attention economy, the faster journalists pump out sensations, the more likely they are to keep their jobs. Consequently, journalists are anxious to be the first to report rather than to tell the true story. This anxiety has made them susceptible to cascading my hypothetical into a real obsession with extraterrestrial creatures in The New York Times,[4] the BBC,[5] and The Guardian.[6] As a result of such cascades, the cycle of disinformation and misinformation continues to penetrate our daily media interactions, whether the sensations are hypothetical or not.

Of course, these interactions have progressed toward more responsible media practices over time. In the 1970s, the Soviet Union formally weaponized disinformation as a propaganda strategy in its journalistic code of conduct. In the 2020s, independent media outlets (such as WhoWhatWhy and Open Secrets) ensure quality civic discourse with fact-checking tools: internal content moderation, newsroom social media guidelines, and information audits by third-party organizations. The implementation of these tools has been a positive step toward curbing disinformation in independent journalism.

Possible solutions to disinformation

Nevertheless, journalists cannot bear the responsibility of safeguarding all communication channels alone. Simply put, demanding that journalists fact-check every piece of information excessively would exert too much pressure on their already anxiety-inducing schedules. Because communication channels are reciprocal, who could join journalists to curb disinformation more effectively?

Perhaps the solution lies in engaging media audiences through information literacy training, equipping tech companies with cybersecurity innovations, or encouraging states to pass data protection acts. Well, the short answer is: all of the above.

The long answer, however, circles back to each one of us interacting with communication channels daily. After all, be it states or tech companies, their members are individuals. No wonder, then, that disinformation penetrates media outlets in much the same way as it does states and companies: by exploiting individuals’ anxieties within their respective communication channels. It follows that if we learn how disinformation exploits our anxieties to manipulate us, we will fact-check and make more informed decisions.

Ineffective state policies in curbing disinformation

Now, hold on again. We have established that journalists should not be the only individuals responsible for curbing disinformation. So, why can’t state institutions step in and get rid of disinformation agents once and for all? After all, states should represent the collective will of the people, so state institutions should be inherently more powerful than individuals.

While this statement is partially true, don’t get misinformed. If all states pooled their power to curb disinformation, perhaps we would live in a utopia where truth is sacred and lies are condemned. Instead, we experience just the opposite: a world in which at least 81 states had already created disinformation agents to deploy against enemy states as of 2020.[7] With some regional exceptions, such as legislative acts within the EU, states have so far utilized disinformation as a weapon of global propaganda.

Both democratic and authoritarian states have used this weapon to advance their political agendas effectively. Be it Vladimir Putin, Joe Biden, or Xi Jinping, none of these heads of state is concerned with disinformation unless it directly threatens to remove them from power. And if one head of state spreads disinformation against another, anyone who disagrees can simply be accused of disinformation in return.

Sounds weird? Well, it is not. Consider Donald Trump. Over the years, he made derogatory comments objectifying women. Each time, the media would call him out for such comments. Then he would make a public apology to calm the storm of accusations against him. Once the apology had washed away with the public, he would accuse the media of faking both his comments and the subsequent apology. Meanwhile, he would play-pretend to be the hero saving his electorate from disinformation in the media. Rinse and repeat – and Trump was in the clear again. In this case, the state official’s voice is disproportionately more persuasive than the journalists’ voices because of his vast network of disinformation agents.[8] Perhaps my hypothetical no longer sounds as alien to you as this disinformation strategy does.

Grassroots initiatives improving states’ policies on disinformation

In politics, however, we can delegitimize the abuse of disinformation agents by developing critical thinking among individuals from the grassroots. In the 2020 US election season, for example, Donald Trump and the Republicans ultimately lost their authority with the public over constant disinformation: they lost both the Presidency and the Senate majority.[9] This logic applies to other liberal democracies. If any democratically elected government abuses individuals’ anxieties for electoral goals, it always risks becoming anxious about snap elections. Furthermore, the same logic can be adapted to more authoritarian regimes. Republics within the Soviet Union mobilized to declare independence once Soviet disinformation agents could no longer cover up the fictitious overperformance of five-year plans. Although authoritarian states might require more time for public mobilization, this is the proof of concept. Regardless of the regime, individuals can curb disinformation with critical thinking and jumpstart states’ policies through grassroots initiatives.

For example, here are some steps we can take toward integrating such grassroots initiatives into political agendas from outside state institutions. Perhaps we could start with a small step: join an information literacy workshop to learn about the latest disinformation tools, such as deepfakes. We could attend once and then update our knowledge with workshops on new developments every couple of months. Or we could take a bigger step: join a fact-checking organization and monitor local news for disinformation in our communities. Whatever step we take, it will help us hold politicians responsible for their words.

In the European Union, for example, such individual steps have already built up enough internal pressure to introduce the Digital Services Act (DSA) and the Digital Markets Act (DMA). The voices of fact-checking organizations, social activists, and information literacy experts were eventually heard over the yammering of dishonest politicians. Adopted in 2022, both legislative acts introduced a framework for taking down disinformation online. We have yet to observe how EU member states will implement these acts.[10] Nevertheless, one trend has become clear: the more individuals actively engage in curbing disinformation, the more likely we are to elevate grassroots initiatives to the top of political agendas globally.

Indeed, this vision is not without shortcomings. There is always a chance that states will become increasingly entrenched in weaponizing disinformation to wage violent war on other states. Disinformation can be a weapon to incite violence and jumpstart full-scale wars. Consider the ongoing Israeli-Palestinian conflict in Gaza. Back in 2014, newspapers, TV broadcasts, and state press releases all warned against a so-called day of rage. All three communication channels disinformed the mass public about possible Islamic terrorist attacks on the day West Bank Palestinians initiated a peaceful protest. Feeding into Islamophobic anxieties after 9/11, this adversarial narrative dominated information flows around the world for weeks.[11] Ten years on, we can only imagine how many more such anxieties have been exploited to incite violence between Israelis and Palestinians. Given the power of disinformation to promote violence between states, grassroots engagement is simply a more secure alternative to ignorance and self-censorship.

Alternatively, imagine that states force-feed us sensational content through our communication channels. We would be a version of Neo from The Matrix who chose to stay in the simulation.[12] In this scenario, anything that challenged our critical thinking would be downright painful, and people would censor themselves from any challenging ideas.

Recalling Ray Bradbury’s warning about the ever-growing speed of disinformation, this simulation is not that far from reality. Sound bites, factoids, and political drama have already overwhelmed us with anxieties that blur our critical thinking. There is a distinction between this simulation and reality, though. Being force-fed sensational content does not necessarily mean that states condition us to be more susceptible to disinformation. More likely, states abuse our anxieties through communication channels, attempting to fix our attention on sensations full of comforting lies. It follows that if we are aware of these manipulations, we will curb state-initiated disinformation more effectively.

The potential of social entrepreneurs in curbing disinformation

Having established that states are central disinformation agents feeding on our anxieties, who else can help us withstand their manipulations but ourselves? Tech companies, to the extent that their executives and employees nourish a spirit of social entrepreneurship. If tech companies prioritize only profit maximization and minimal legal compliance, they will have no interest in curbing disinformation. However, tech-savvy individuals within these companies think less corporately. More and more tech geniuses leave giants of the information industry such as Meta or OpenAI. Instead, they use their expertise to set up shop as independent cybersecurity firms. And so they develop technologies to automate the moderation of disinformation agents on social media, in Google ads, and in digital newspapers. With open access to AI, they also develop digital markers that ping and track AI-generated disinformation on the Internet, including deepfakes, robocalls, and chatbots. While these technologies still guarantee tech companies’ profits, there is an inherent social benefit in their advancement.

Think about tech companies’ potential in the disinformation arms race whenever our anxieties and critical thinking fail us. Today, every tech giant claims that AI is our future. I disagree. After all, what is AI if it is for commercial use only? Another potential source of disinformation. Cybersecurity firms are far more important for advancing the tools we need to navigate communication channels through the woods of falsehood. Recognizing the light speed at which disinformation is evolving, we should encourage more tech-savvy individuals to become social entrepreneurs in the realm of cybersecurity. Then, perhaps, we will have an accessible toolkit for curbing disinformation before long.

Conclusions

Ultimately, there is no silver bullet for disinformation. At least, not yet. In a world where any communication channel can craft and spread falsehoods, we are responsible for curbing disinformation ourselves. For this purpose, we might consider creating stricter journalistic norms, implementing more laws governing the digital space, or developing revolutionary technologies.

These steps toward a better information landscape are noble. Certainly, they will help us make more informed decisions. But they can take decades to influence the world on a global scale. So, how can you take responsibility for curbing disinformation here and now?

Ask questions. A lot of questions. Just like I posed many questions while writing this essay, ask yourself: have you encountered disinformation in the last seven days? Perhaps, you watched a deepfake on social media. You picked up a phone and a human-like voice told you something sensational. Or you read the latest news about aliens and flying saucers.

Likely, you did not trust any of these disinformation agents for a second. But what if you went ahead and fact-checked them for real? What if you rallied non-profit organizations to lobby for laws counteracting these disinformation agents? Or what if you invented a new tech to identify disinformation?

So many questions, and so few answers.

Our future depends on whether we dare to answer them or not.

Citations

[1] Ray Bradbury, Fahrenheit 451 (New York: Simon & Schuster, 2013).

[2] Adapted definition from United Nations, “Our Common Agenda Policy Brief: Information Integrity on Digital Platforms,” 2023, https://unsos.unmissions.org/sites/default/files/our-common-agenda-policy-brief-information-integrity-en.pdf; The Internet Governance Project, “On Disinformation: Adopting a Narrow Definition,” 2023, https://www.ohchr.org/sites/default/files/Documents/Issues/Expression/disinformation/3-Academics/Georgia-Institute-of-Technology-School-of-Public-Policy.pdf

[3] Stanford University, “How to Responsibly Report on Hacks and Disinformation,” cyber.fsi.stanford.edu, 2023, https://cyber.fsi.stanford.edu/content/how-responsibly-report-hacks-and-disinformation.

[4] Mark Bulik, “1947: Flying Saucers Land in the Times (Published 2015),” The New York Times, August 10, 2015, sec. Times Insider, https://www.nytimes.com/2015/08/10/insider/1947-flying-saucers-land-in-the-times.html.

[5] Zaria Gorvett, “The UFO Reports Piquing Nasa’s Interest,” www.bbc.com, 2023, https://www.bbc.com/future/article/20230726-the-weird-incidents-piquing-nasas-interest.

[6] Luc Torres, “Looking Back: UFOs,” The Guardian, April 30, 2018, sec. News, https://www.theguardian.com/news/2018/apr/30/looking-back-ufos.

[7] Samantha Bradshaw, “Industrialized Disinformation 2020 Global Inventory of Organized Social Media Manipulation,” 2020, https://demtech.oii.ox.ac.uk/wp-content/uploads/sites/12/2021/02/CyberTroop-Report20-Draft9.pdf.

[8] Michael Barbaro and Megan Twohey, “Crossing the Line: How Donald Trump Behaved with Women in Private (Published 2016),” The New York Times, May 14, 2016, sec. U.S., https://www.nytimes.com/2016/05/15/us/politics/donald-trump-women.html?ref=politics.

[9] Niko Kommenda et al., “US Election Results 2020: Joe Biden’s Defeat of Donald Trump,” The Guardian, 2020, https://www.theguardian.com/us-news/ng-interactive/2020/dec/08/us-election-results-2020-joe-biden-defeats-donald-trump-to-win-presidency.

[10] European Commission, “The Digital Services Act Package | Shaping Europe’s Digital Future,” digital-strategy.ec.europa.eu (European Commission, 2022), https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package.

[11] Max Fisher, “The Strange History of ‘Days of Rage,’” Vox, July 25, 2014, https://www.vox.com/2014/7/25/5936655/days-of-rage-the-strange-history-of-the-term-palestinians-use-for.

[12] Andy Wachowski and Larry Wachowski, dirs., The Matrix (USA/Australia, 1999).