Tech Tyranny: Unveiling the dark side of digital democracy ahead of elections

By Dickson Ng'hily, The Guardian
Published at 10:46 AM Nov 21 2024
President Samia lines up to register for the 2024 local government election slated for November 27 this year. Photo: File

AS Tanzania gears up for the 2024 local government elections on November 27 and general elections in 2025, a silent but powerful threat is rising—the manipulative use of digital technology. Tools once celebrated for fostering enlightenment are now weaponized, shaping beliefs, swaying votes and deepening societal divisions across Tanzania’s democratic landscape.

From the winding alleyways of Stone Town in Zanzibar to the shores of Lake Tanganyika, a troubling trend has emerged: technology, and especially artificial intelligence (AI), is challenging the foundation of truth in the country.

In an era when misinformation can spread with unprecedented speed, Tanzanians are facing a new kind of conflict, where the truth itself is under siege. Fake news and political propaganda masquerade as legitimate news, preying not only on the uninformed but on even the most discerning citizens. Amidst this growing digital discord, the central question remains: can Tanzania's democracy withstand this wave of technological deceit?

Tanzania is at a critical point in its history, standing at the crossroads of technology and truth. The Guardian spoke with citizens, electoral experts and advocates of fair democratic practices, looking at ways Tanzanians can safeguard their democratic values and reclaim their voices, votes and future.

AI threat: A global concern with local impact

Artificial intelligence is transforming electoral dynamics, posing risks that extend far beyond disinformation. Mekela Panditharatne and Noah Giansiracusa from the Brennan Center for Justice argue that AI’s implications for elections remain underexplored, representing a silent but substantial danger.

The capabilities of AI have evolved significantly. Technologies such as deepfakes enable the creation of realistic but fake images and voices, while large language models produce human-like content at scale. These tools, while beneficial in many areas, create vulnerabilities during elections, allowing for the rapid spread of disinformation.

AI-powered chatbots built using large language models, such as ChatGPT, have altered the information ecosystem by producing content that is remarkably human-like.

AI's impact on elections is most potent in its ability to generate disinformation and create viral ‘fake news’ moments. AI-generated images, videos and audio can rapidly circulate, warping public discourse with false scandals and manipulated narratives. 

During elections, this technology can be weaponized to sway public opinion subtly and systematically. Misleading posts and language models could create an illusion of political consensus or sow doubt about the electoral process, exacerbating distrust and division.
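
One practical way researchers and platform moderators spot this kind of manufactured consensus is to look for near-identical posts pushed from many different accounts. The following is a minimal sketch of that heuristic in Python, using only the standard library and entirely hypothetical post data; it illustrates the idea rather than any platform's actual detection system.

```python
from difflib import SequenceMatcher
from itertools import combinations

# Hypothetical posts collected from different accounts (illustrative data only).
posts = [
    ("@acct_001", "Candidate X has already conceded the local election, sources say."),
    ("@acct_002", "Candidate X has already conceded the local election, sources say!"),
    ("@acct_003", "Breaking: Candidate X concedes the local election, sources confirm."),
    ("@acct_004", "Register early and check your polling station before November 27."),
]

SIMILARITY_THRESHOLD = 0.85  # ratio above which two posts are treated as copies

def similarity(a: str, b: str) -> float:
    """Return a 0-1 similarity ratio between two strings."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Flag pairs of accounts posting near-identical text, a common sign of
# coordinated amplification rather than organic discussion.
for (acct_a, text_a), (acct_b, text_b) in combinations(posts, 2):
    score = similarity(text_a, text_b)
    if score >= SIMILARITY_THRESHOLD:
        print(f"Possible coordination: {acct_a} and {acct_b} ({score:.2f} similar)")
```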

Tanzanians’ experience with misinformation

Citizens across Tanzania are grappling with the reality of digital manipulation in different ways. For many, misinformation has become a daily challenge, eroding their ability to distinguish fact from fiction. 

In Zanzibar’s Stone Town, fisherman Ahmed Bakari describes his struggles with identifying credible information, saying, “Distinguishing truth from lies isn’t easy anymore. I used to trust my instincts, but with political news, it’s getting tougher.”

Bakari’s experiences reflect the growing sense of disorientation among Tanzanians who once relied on trusted newspapers but now face an internet flooded with dubious stories. 

“People around me believe absurd headlines about candidates. We need someone to cut through the noise and provide clarity,” he says, pointing to a broader societal need to restore trust in the information that shapes civic engagement.

A trader, Fatma Juma, who operates in Darajani, Zanzibar, echoes Bakari’s concerns. “I see all kinds of stories on social media and it’s hard to tell what’s true.”

“It makes me uneasy. I think if there were lessons or tools to help people differentiate between fake and genuine news, we’d feel less manipulated.” 

For voters like Fatma, who want to participate in Tanzania’s democratic process with confidence, disinformation threatens not only individual choice but also collective faith in democracy.

In Bariadi town, Simiyu Region, Mariam Shelembi, a mother of four, is overwhelmed by the spread of falsehoods. “I recently saw a post about a political figure that was taken out of context. I wish there was a trusted organization to fact-check and explain these things,” she says.

Her comments highlight the urgent need for reliable verification resources to protect voters from manipulation.

For first-time voter Hussein Jumbe, an 18-year-old student in Pemba, the challenge is even more daunting. “When I see something that seems off, I get confused, but who do I ask?” he wonders. Jumbe’s experiences reflect the concerns of young voters who, despite being digitally savvy, feel unprepared to navigate the complexities of disinformation.

Experts’ opinions

Some experts argue that the impact of AI on elections may be overstated. AI researchers Felix Simon, Keegan McBride, and Sacha Altay contend that while AI is used to influence elections, its effectiveness remains limited. 

A study conducted by the Alan Turing Institute, which analyzed over 100 elections, found evidence of AI interference in only a few cases, with no clear indications that these attempts influenced election results. 

However, the threat posed by AI manipulation is still significant. Meta’s recent Adversarial Threat Report acknowledges that AI has been deployed in election manipulation, noting that “GenAI-powered tactics provide only incremental productivity and content-generation gains.” Despite being subtle, these changes can shape political discourse and erode public trust, which, in emerging democracies like Tanzania, can have profound consequences.

A global survey on AI reveals a deep skepticism in Tanzania, where 51 percent of respondents believe AI will harm people over the next 20 years, and 62 percent oppose its usage—figures that surpass those of neighboring Kenya and Uganda (both at 57 percent). 

This mistrust extends beyond AI to the broader digital landscape. Disinformation campaigns have increasingly targeted the public, as demonstrated by a case involving Suzy Lyimo, CHADEMA's Speaker of the People’s Parliament.

In manipulated recordings, she was falsely portrayed as announcing her party’s withdrawal from the November 27 local government elections. Such digital manipulation not only distorts public perception but also threatens democratic participation, further undermining trust in electoral processes.

The hidden manipulators

In 2022, a deepfake of Ukrainian President Volodymyr Zelenskyy falsely declaring surrender went viral. Such manipulations, amplified through AI-driven bots, present a real and immediate threat to the integrity of elections. In Tanzania, where trust in political information is already fragile, the potential for AI to exploit and deepen these doubts is especially concerning.

An IT expert who requested anonymity offered a rare glimpse into the secretive world of election manipulation in the country. Specializing in AI-driven images and videos, he explained how he creates “deepfakes” and doctored content designed to alter voters' perceptions of political candidates.

The expert described how advanced AI tools and deep learning algorithms are leveraged to produce hyper-realistic visuals and audio, making it nearly impossible for the average person to detect manipulations. “It’s getting harder for the average person to tell if something’s been manipulated or not,” he admitted.

These digital creations play on viewers' fears, loyalties, or biases, strategically crafted to catch and hold attention in fleeting seconds. He disclosed that while the financial rewards are significant, the ethical discomfort of using democratizing tools to subvert democracy often weighs on him. Yet, with the steady demand for these services during election seasons, he remains drawn to the work.

“We’re tasked with creating a reality that fits the agenda, regardless of what’s true,” he shared. The work, often funded by political clients, involves generating misleading news clips, manipulated images, and videos that blur the line between fact and fiction. The objective, he explained, is to flood social media with disinformation that subtly shifts voter sentiment.
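
Fact-checkers do have partial countermeasures against doctored or recycled images of this kind: comparing a circulating picture with a known original using perceptual hashing will often reveal that it is a lightly edited copy. The sketch below is a minimal illustration, assuming the third-party Pillow and ImageHash Python packages and hypothetical file names; it does not detect fully synthetic deepfakes, which require more specialized forensics.

```python
from PIL import Image      # pip install Pillow
import imagehash           # pip install ImageHash

# Hypothetical files: a verified original photo and a version circulating online.
original = Image.open("original_rally_photo.jpg")
suspect = Image.open("circulating_version.jpg")

# Perceptual hashes change little under cropping, resizing or light edits,
# so a small distance suggests the "new" image is a reworked copy.
distance = imagehash.phash(original) - imagehash.phash(suspect)

if distance <= 8:          # small Hamming distance: likely the same underlying image
    print(f"Likely a modified copy of the original (distance={distance}).")
else:
    print(f"Images appear unrelated (distance={distance}).")
```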

Strategies for targeted misinformation

According to the expert, creating deceptive content is only part of the operation; there’s also a carefully orchestrated strategy to ensure it reaches target audiences. “We don’t just make videos and hope they go viral. There’s a whole team dedicated to seeding these clips in places where they’ll gain traction,” he explained.

Using bots, paid influencers, and accounts with substantial followings, these teams amplify the spread of disinformation, making it appear as if the content is organically popular. “Once we get a critical mass of shares or likes, it becomes self-sustaining; people spread it because they believe it’s true,” he said, noting that the organic-looking spread lends a legitimacy that’s challenging for voters to question.
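
Researchers who study this kind of seeding often look for its timing signature: a link that suddenly appears from many accounts within minutes is rarely organic. Below is a minimal Python sketch of that heuristic, using only the standard library and hypothetical share records; real monitoring tools combine many more signals.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical share records: (account, link, time shared) - illustrative only.
shares = [
    ("@acct_101", "http://example.com/fake-clip", datetime(2024, 11, 20, 9, 0)),
    ("@acct_102", "http://example.com/fake-clip", datetime(2024, 11, 20, 9, 2)),
    ("@acct_103", "http://example.com/fake-clip", datetime(2024, 11, 20, 9, 3)),
    ("@acct_104", "http://example.com/real-story", datetime(2024, 11, 20, 14, 30)),
]

WINDOW = timedelta(minutes=10)   # burst window
MIN_ACCOUNTS = 3                 # distinct accounts needed to raise a flag

by_link = defaultdict(list)
for account, link, shared_at in shares:
    by_link[link].append((shared_at, account))

# A link pushed by several distinct accounts inside a short window looks seeded,
# not organically popular.
for link, events in by_link.items():
    events.sort()
    for i, (start, _) in enumerate(events):
        accounts_in_window = {acct for ts, acct in events[i:] if ts - start <= WINDOW}
        if len(accounts_in_window) >= MIN_ACCOUNTS:
            print(f"Suspicious burst: {link} shared by {len(accounts_in_window)} accounts within {WINDOW}.")
            break
```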

A disturbing imbalance

Reflecting on the impact of his work, the expert expressed deep concerns about how quickly disinformation can destabilize political environments, particularly in a country like Tanzania, where internet access and digital literacy are rising rapidly. “People are getting more connected, but they don’t have the tools to identify what’s real and what’s fake,” he remarked.

This vulnerability makes voters especially susceptible to manipulation, which he finds troubling. Despite his continued involvement in this field, he wrestles with the ethical implications, recognizing that his creations directly influence democratic processes and the nation’s future. “It’s like we’re pulling the strings from the shadows, but at what cost?” he concluded, a question that underscores the dark implications of digital manipulation.

Finding solutions

Rukia Nassor, a 24-year-old artisan in Pemba, describes her frustration with the lack of reliable information. “Sometimes I can tell if something is fake, but not always. How do I confirm if it’s true?” Her experience underscores the widespread feeling of helplessness among citizens facing a flood of misleading information.

In Kigoma Ujiji, fishmonger Moses Luhya is equally concerned: “Misinformation and disinformation keep growing with every election. People share false information online. Social media has become a minefield for political news, and we have no access to trusted sources.”

Ester Chande, a resident of Musoma in Mara Region, observes the divisive impact of misinformation and disinformation on her community. False claims about political candidates create anger and mistrust, deepening divisions. She yearns for a platform to verify political news, especially for those not raised with digital tools and less able to separate fact from fiction.

Technology as a tool for transparency

Dr Egbert Mkoko, a lecturer at the University of Dar es Salaam, argues that while technology presents challenges, it can also be part of the solution. “We can’t fight against technology. Instead, we must use it responsibly,” he told The Guardian while advocating for the use of technology to expedite the election process and ensure transparency. He pointed to the United States’ efficient use of tech in elections as an example Tanzania could follow.

Dr Mkoko stresses the importance of using technology ethically and responsibly. “Technology should be employed without manipulation to tamper with elections, as this could ultimately damage democracy,” he warns. 

He believes that guidelines agreed upon by all stakeholders could ensure a balanced use of technology, allowing it to support rather than subvert the electoral process.

Dr Mkoko suggests that the independent electoral commission, political parties and civil society should work together to establish a framework that promotes transparency and accountability in Tanzania’s elections. Such a framework could set out rules for using AI and digital tools in ways that enhance electoral transparency, rather than compromising it.

The framework could also include public education initiatives to raise awareness about digital literacy and critical thinking, equipping citizens with the tools they need to identify misinformation and disinformation. Empowering Tanzanians to verify the information they consume could help rebuild trust in the digital landscape.
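
At the simplest end, such digital-literacy tooling can be little more than a script that checks where a shared link actually points before it is trusted or forwarded. The sketch below is a minimal illustration using Python's standard library; the allowlist of trusted domains is entirely hypothetical and, in practice, would be curated by whichever fact-checking or electoral body stakeholders agree on.

```python
from urllib.parse import urlparse

# Hypothetical allowlist of verified outlets; a real list would be curated
# and published by an agreed fact-checking or electoral body.
TRUSTED_DOMAINS = {"ippmedia.com", "bbc.com", "dw.com"}

def source_check(url: str) -> str:
    """Return a simple verdict on whether a link comes from a known outlet."""
    domain = urlparse(url).netloc.lower().removeprefix("www.")
    if domain in TRUSTED_DOMAINS:
        return f"{domain}: recognised outlet - still read critically."
    return f"{domain}: unknown source - verify before sharing."

print(source_check("https://www.ippmedia.com/the-guardian/some-election-story"))
print(source_check("http://breaking-news-tz.example/candidate-withdraws"))
```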

Reclaiming democracy

As Tanzania approaches its elections, it stands at a pivotal moment. The influence of AI and digital disinformation on democratic processes is an undeniable challenge. However, through collaborative efforts and responsible technology use, Tanzania can address these issues and strengthen its democratic foundations.

Empowering citizens with digital literacy, establishing a transparent framework, and fostering cooperation among stakeholders are critical steps. In doing so, Tanzania can ensure that democracy thrives, even in the face of technological manipulation. By reclaiming control over digital tools, Tanzanians can safeguard their elections, their choices, and, ultimately, their future.

John Mwambiki from Sumbawanga District put it simply: “We need someone to cut through the noise.” For Tanzania, the challenge is to forge a digital landscape where truth prevails and citizens can engage in the democratic process with confidence and clarity. In the battle for truth, Tanzanians have a choice: to let technology divide them or to harness it in the pursuit of unity and transparency.

In our next edition, we will dive deeper into the forces shaping Tanzania's digital landscape ahead of the 2024 civic polls and 2025 general elections. We will examine the roles of key regulatory bodies, NGOs dedicated to democratic oversight, the Independent National Electoral Commission (INEC), political parties and tech firms working to counteract disinformation.