
Fake News Meets Digital Illiteracy: How Misinformation Fuels Societal Tensions


Philip Victor | 14 April 2025



There is no shortage of fake news sweeping across social media platforms worldwide, causing real-world harm in multiple countries. The fallout ranges from disinformation triggering widespread panic and social unrest to hyper-realistic deepfake scam videos that impersonate politicians and CEOs, tricking thousands into financial ruin.


These incidents highlight a growing crisis: as developing countries embrace the digital revolution, the very tools designed to connect and inform are increasingly becoming conduits for misinformation, posing a real-world threat capable of destabilising societies. Amplified by low rates of digital literacy, this gap creates a perfect storm in which freedom of information, without the accompanying ability to discern its veracity, becomes a recipe for societal discord and the erosion of public trust.


This thought piece explores how the intersection of fake news and digital illiteracy fuels societal discord, and why closing the digital literacy gap is no longer optional – it is imperative.



Digital Access is Not Digital Literacy


Efforts to connect the unconnected have led to higher rates of internet penetration in many developing countries. With the exception of Laos, Cambodia, and Myanmar, most countries in Southeast Asia now boast impressive internet penetration rates of up to 80%. This digital transformation extends beyond mere connectivity; it has catalysed a shift towards more sophisticated, digitalised services that enhance convenience and efficiency. As these nations progress towards cashless economies and paperless operations, citizens are experiencing a dual benefit: not only do they have easy access to information, but also to a wide array of products and services.


Despite growing internet penetration rates in many developing countries, digital literacy lags behind. While internet penetration simply refers to the availability of internet access, digital literacy encompasses a far more comprehensive set of skills essential for effectively navigating the digital landscape. UNESCO defines digital literacy as “the ability to access, manage, understand, integrate, communicate, evaluate and create information safely and appropriately through digital technologies for employment, decent jobs and entrepreneurship.” This definition highlights several key pillars:


  • Information Literacy: The ability to find, evaluate, and use digital information effectively.


  • Communication Literacy: The skills needed to communicate and collaborate using digital tools.


  • Technical Literacy: The knowledge required to use digital devices and platforms effectively and safely.


Challenges in Developing Digital Literacy


The absence of fundamental digital literacy skills transforms the increased convenience offered by digital access and services into a significant vulnerability, potentially exposing digitally illiterate populations to risks they may not be equipped to navigate or leaving them susceptible to misinformation and cybercrime. Some of the factors contributing to this digital literacy gap in Southeast Asia include:


  • Lack of Structured Digital Literacy Programmes: Many countries lack comprehensive, nationwide digital literacy initiatives, leaving large segments of the population without formal training in essential digital skills.


  • Vulnerable Age Groups: Older adults and young children are often the most susceptible to digital risks, with limited resources tailored to their specific needs. It is also crucial to recognise that human error remains the weakest link in cybersecurity, particularly among seniors in both developed and developing countries.


  • Rural Access Challenges: The urban-rural divide in internet access and quality creates disparities in digital literacy development, with rural areas often lagging behind.


  • Curricula Deficiency: Educational systems in many countries have been slow to integrate digital literacy into their core curricula, leaving students ill-prepared for the digital world. In a 2021 survey, 61% of individuals aged 10–24 were not being taught digital skills in school, with only a small fraction exposed to digital skills education in Lao PDR and Myanmar.


  • Limited Access to Technology: Despite increasing internet penetration, many individuals still lack access to personal devices or reliable internet connections, hindering their ability to develop digital skills.


  • Low Public Awareness: There's often a lack of understanding about the importance of digital literacy, leading to low prioritisation at both individual and policy levels.


  • Poverty Implications: Economic constraints prevent many from accessing digital devices and internet services, creating a cycle of digital exclusion.


Developing digital literacy is therefore more than just capacity building; it requires interventions on several fronts, from addressing broader development concerns to reviewing contemporary modes of learning and existing policies.



Types of Misinformation on Social Media


Digital literacy is critically important given that social media platforms, such as Facebook, YouTube and TikTok, often serve as the primary gateway to the internet for new users. While these platforms offer unprecedented access to information and connectivity, they also present unique challenges:


  • Information Overload: Users are bombarded with vast amounts of information, making it difficult to discern credible sources from unreliable ones.


  • Opinion Bubbles: Social media platform algorithms can create isolated information environments by recommending content that aligns with a user’s existing beliefs and preferences. This consequently limits exposure to diverse perspectives and reinforces existing biases.


  • Rapid Spread of Misinformation: The viral nature of social media allows false information to spread quickly, often outpacing fact-checking efforts.


Digital illiteracy further amplifies the impacts of these challenges:


  • Limited Fact-Checking: Users may be unfamiliar with fact-checking tools, or lack the skills or inclination to verify information before sharing.


  • Viral Content Sharing: The tendency to forward sensational content without verification.


  • Language Barriers: In multilingual societies, misinformation can spread more easily when users are limited to content in their native language, particularly where credible sources and fact-checks in that language are scarce.


  • Weak Media Literacy: A lack of critical thinking skills when consuming and evaluating online content.


Digital literacy at the user level therefore demands a comprehensive understanding of diverse misinformation formats and the ability to critically evaluate information sources.


Fake News: Motivations and Variations


Fake news encompasses a range of misleading content, including misinformation (false or inaccurate information spread without malicious intent) and disinformation (false information deliberately created and shared to deceive or manipulate). Common forms include:


  • Misleading Headlines: Sensationalised or out-of-context titles that don’t accurately reflect the content, potentially omitting vital nuances and perspectives on the matter. A research study even concluded that such headlines in mainstream media contributed to slower rates of vaccination in the US.



  • Manipulated Content: Genuine content that has been altered to change its meaning or context. A prime example occurred during the 2024-2025 California wildfires, where a video of residents salvaging belongings from their burning home was falsely circulated as footage of looters, exploiting a real crisis to spread misinformation and stoke social tensions.


  • Clickbait: Content designed to attract attention and encourage visitors to click on a link. The use of clickbait can range from harmless efforts to boost one’s YouTube channel views, to more sinister links to malware.


  • Deepfakes: Synthetic media where a person’s likeness is replaced with someone else’s. Recent examples of this include the impersonation of government officials and police officers in Southeast Asia as a means of monetisation or extortion.


  • Conspiracy Theories: Explanations for events that invoke conspiracies without evidence, such as the ‘Plandemic’ video series that spread false claims about COVID-19, leading to widespread confusion about public health measures.


The motivations behind the dissemination of fake news are diverse and complex, often blurring the lines between legitimate expression and harmful misinformation. These motivations can be broadly categorised into four main areas:


  • Political Gain: Actors may spread false information to influence public opinion, discredit opponents, or sway elections. This manipulation of the information landscape can have far-reaching consequences for democratic processes.


  • Financial Profit: The monetisation of online content has created a powerful incentive for generating sensational or false information. Clickbait headlines and controversial content often drive higher engagement rates, translating into increased ad revenue.


  • Social Influence: In the digital age, social capital is measured in followers, likes, and shares. Some individuals or groups spread misinformation to boost their online presence and perceived authority.


  • Ideological Promotion: Fake news can be a tool for spreading particular beliefs or worldviews, often targeting audiences susceptible to confirmation bias.


The democratisation of content creation has thus led to a tension between free speech and the potential for misinformation. Platforms face the challenging task of balancing content moderation with free speech principles, often struggling to draw the line between legitimate expression and harmful misinformation.


The ease of monetisation in the digital space has further complicated this issue. The potential for financial gain can incentivise the creation of sensational or false content, prioritising engagement over accuracy. This dynamic is exemplified by the fake news industry that emerged in Veles, Macedonia, during the 2016 U.S. election, where young people created numerous websites with sensationalised and false political content targeting U.S. audiences. These sites generated significant ad revenue due to high engagement rates, demonstrating how monetary incentives can fuel the creation and spread of misinformation. The adverse impacts of fake news are further explored in the following section.



Consequences of Misinformation and Disinformation


The proliferation of misinformation and disinformation has far-reaching consequences that extend beyond individual misconceptions. From influencing electoral outcomes to exacerbating public health crises, the impact of inaccurate information is profound and multifaceted.


Erosion of Electoral Integrity


The integrity of democratic processes, particularly free and fair elections, has become increasingly vulnerable to the corrosive effects of misinformation in the digital age. This was illustrated in the aftermath of the 2020 U.S. Presidential Election, where a tidal wave of false claims about voter fraud swept across social media platforms. The deluge of misinformation reached its peak with the January 6th Capitol riot, an event that shook the very foundations of American democracy and left deep, lasting scars on the nation's political landscape.

The long-term impact of such widespread election-related misinformation extends far beyond a single event or election cycle. It has sown persistent doubts about election integrity among significant portions of the electorate, potentially leading to decreased voter turnout in future elections. Moreover, the proliferation of false narratives has exacerbated political polarisation, further eroding trust in democratic institutions and processes.


Public Health in Peril


The spread of health-related misinformation in the digital age has emerged as a critical public health concern, with potentially life-threatening consequences. This issue came to the forefront during the global COVID-19 pandemic, where the rapid proliferation of anti-vaccine misinformation on social media platforms led to widespread vaccine hesitancy. The impact of such false information extended beyond COVID-19, affecting other public health initiatives. For instance, in Samoa, a devastating measles outbreak in late 2019 was exacerbated by vaccine hesitancy, resulting in 83 deaths, mostly children, and over 5,700 cases.


In the long term, the erosion of trust in public health institutions, cultivated by persistent exposure to false narratives, threatens to undermine future health initiatives and crisis responses. This loss of confidence can lead to a resurgence of preventable diseases as vaccination rates decline. The Samoan measles outbreak demonstrated how misinformation can result in preventable deaths, prolonged states of emergency, and overwhelmed healthcare systems. As digital platforms continue to serve as primary sources of health information, addressing the spread of misinformation has become crucial for safeguarding public health and preparing for future global health challenges.


Education at Risk


In Pacific Island nations, the spread of climate change misinformation poses a critical threat to education and community preparedness. These countries, already experiencing the harsh realities of rising sea levels and extreme weather events, face additional challenges when false narratives infiltrate their educational systems. Misinformation campaigns, often originating from climate change denial groups in countries like the United States and Australia, have far-reaching effects on these vulnerable island communities, undermining efforts to educate and equip residents with accurate knowledge about the environmental challenges they face.


The long-term impact of such educational misinformation is particularly severe for Pacific Island nations. It hinders the development of scientific literacy, potentially leading to delayed action on climate change adaptation and mitigation strategies. This could leave these communities ill-prepared to face the existential threat posed by climate change, impeding crucial resilience-building efforts. Moreover, the spread of misinformation risks creating a generation less equipped to innovate and implement sustainable solutions, potentially exacerbating the already dire consequences of climate change for these island nations.



Combating Misinformation and Fostering Digital Literacy


A multifaceted approach is needed to combat misinformation and enhance digital literacy.


Regulatory Frameworks and AI: A Delicate Balance


Governments worldwide are grappling with the challenge of regulating the digital information space without infringing on free speech. While countries like China and South Korea have implemented stringent measures against deepfakes and harmful content, the global community is still searching for a balanced approach. The future of regulation lies in adaptive policies that can keep pace with technological advancements.


AI is emerging as a potentially powerful ally in this fight. From Natural Language Processing algorithms that detect patterns in false information to predictive modelling that anticipates misinformation trends, AI tools are becoming increasingly sophisticated. Open-source solutions are also democratising access to these technologies, empowering individuals and organisations to combat misinformation effectively.
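To make the NLP idea concrete, below is a minimal, purely illustrative sketch of the kind of text-classification baseline such detection tools build on. It does not represent any specific product or the approaches named above; it assumes Python with scikit-learn installed, and the tiny labelled examples are hypothetical placeholders.

```python
# Illustrative sketch only: a simple bag-of-words classifier of the type
# NLP-based misinformation detectors often start from.
# Assumes scikit-learn is available; the training examples are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled examples: 1 = likely misleading, 0 = likely reliable.
texts = [
    "SHOCKING cure doctors don't want you to know!!!",
    "Click now to claim your guaranteed government payout",
    "Ministry of Health publishes updated vaccination schedule",
    "Central bank holds interest rates steady, citing stable inflation",
]
labels = [1, 1, 0, 0]

# TF-IDF features feeding a logistic regression: a common, simple baseline.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# Score a new headline; a higher probability suggests it warrants fact-checking.
headline = "Miracle drink melts fat overnight, scientists stunned"
print(model.predict_proba([headline])[0][1])
```

In practice, production systems rely on far larger datasets, multilingual models, and human review; the point of the sketch is simply that such tools assign a probability to content rather than a definitive verdict, which is why they complement rather than replace digital literacy.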

 

Evolving Role of Social Media Platforms


Social media companies find themselves at the epicentre of the misinformation storm. Their response – implementing measures like limiting message forwarding and enhancing content moderation – reflects a growing acknowledgment of their responsibility. However, recent decisions, such as Facebook's approach to fact-checking, highlight the ongoing tension between platform autonomy and public interest.


The future may see social media platforms evolve into more active custodians of digital truth, potentially leveraging blockchain technology for content verification or implementing AI-driven real-time fact-checking. The challenge lies in balancing these measures with the preservation of free expression and diverse viewpoints.

 

Empowering Individuals: The Key to Digital Resilience


While technological and regulatory solutions are crucial, the most potent defence against misinformation lies in empowering individuals. Initiatives like the ASEAN Digital Literacy Programme are paving the way for a more digitally savvy populace. The next frontier in digital literacy education involves personalised, AI-driven learning experiences that adapt to individual needs and cultural contexts. Strategies for critical information assessment are evolving beyond simple fact-checking. Future-focused approaches may include:


  • Cognitive Debiasing Techniques: Training individuals to recognise and counteract their own cognitive biases.


  • Digital Footprint Analysis: Teaching users to trace the origin and spread of information across platforms.


  • Collaborative Verification Networks: Fostering community-driven fact-checking ecosystems.


  • Emotional Intelligence in Digital Spaces: Developing skills to navigate the emotional landscape of online information.

 


Conclusion


The battle against misinformation is not just a technological challenge – it is society’s responsibility. Without urgent action, the digital divide will not only be about access to the internet but also about the ability to discern truth from falsehood. Governments must integrate digital literacy into national education systems. Tech companies must move beyond reactive content moderation and proactively invest in AI-driven fact-checking tools. Educators and community leaders must empower individuals with the critical thinking skills necessary to navigate today’s complex information landscape.

 

Most importantly, every internet user has a role to play. Before sharing a piece of content, pause. Question its source. Cross-check with reliable outlets. In an era where falsehoods spread faster than facts, the ability to think critically is our most powerful defence. The future of democracy, public trust, and social stability depends on it.

 

Philip Victor is Partner and Managing Director (APAC) at Welchman Keen and leads the Digital Transformation Practice. With over 30 years of experience, Philip helps equip developing countries with the essential capacities to drive and sustain their digital transformation journeys.


Disclaimer: This article is only intended for general reading/informational purposes. Under no circumstances is it to be relied upon in substitution for specific advice on any issue(s) that may arise relating to its subject matter. 



 
 