Right after polls closed on January 7, government-invited “independent” observers—including former US Congressman Jim Bates, whose past includes sexual-harassment allegations and campaign-finance controversies—told local media the elections were “free and fair.” Headlines such as “US observer calls election free and fair” spread rapidly, even though the US, UK, and Canadian embassies confirmed no official missions monitored the vote.
Digital Platforms as Vectors of Propaganda
Social media, online news sites, and messaging apps have become fertile ground for false narratives:
- Deepfake Videos: A Financial Times investigation on December 14 exposed pro-government influencers using inexpensive AI tools to produce deepfake clips targeting the opposition BNP and foreign partners.
- Echo Chambers: Polarised groups reinforce misinformation within closed networks, so debunking efforts struggle to catch up.
- Coordinated Op-eds: In September 2023, AFP’s fact-checkers uncovered hundreds of fake-expert commentaries lauding government policies, some of them hosted on official government websites.
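Coordinated commentary campaigns of the kind AFP uncovered are often surfaced by checking for near-duplicate wording across ostensibly independent authors. The sketch below is illustrative only (it is not AFP’s method, and the sample texts are invented): it compares word trigrams with Jaccard similarity and flags suspiciously overlapping author pairs.

```python
def shingles(text, k=3):
    """Break text into overlapping k-word sequences (word shingles)."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of the shingle sets of two texts (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

# Hypothetical op-ed snippets from supposedly independent authors.
op_eds = {
    "author_a": "The new digital policy has transformed the economy for every citizen",
    "author_b": "The new digital policy has transformed the economy for all our citizens",
    "author_c": "Monsoon flooding displaced thousands in the northern districts this week",
}

# Flag author pairs whose wording overlaps beyond a chosen threshold.
flagged = [
    (x, y) for x in op_eds for y in op_eds
    if x < y and jaccard(op_eds[x], op_eds[y]) > 0.5
]
print(flagged)  # → [('author_a', 'author_b')]
```

Real fact-checking pipelines add normalisation, fuzzier matching, and metadata signals (posting times, shared infrastructure), but shingle overlap is a common first-pass filter.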
Real-World Consequences
Misinformation and disinformation threaten societal stability and public safety:
- Communal Tensions: Rumours targeting religious or ethnic minorities can spark violence, as unverified claims fuel mob vigilantism.
- Eroded Trust: With reliable outlets censored or self-censoring, citizens turn to unverified sources, deepening cynicism toward all media.
- Democratic Backsliding: Falsehoods about electoral processes delegitimise institutions, paving the way for authoritarian tactics.
Legal Framework and Its Unintended Effects
Enacted in October 2018, the Digital Security Act (DSA) grants authorities broad powers to detain individuals without a warrant and to censor digital content. Although designed to curb extremism and hate speech, its vaguely worded provisions have been used to silence dissenting journalists and civil-society voices. Critics argue that the DSA’s overreach has inadvertently driven citizens toward fringe sources, worsening misinformation cycles.
Platform Accountability: Political Ads Transparency
A study by Digitally Right—“Hits and Misses: An Examination of Meta’s Political Ad Policy Enforcement in Bangladesh”—found significant gaps in Facebook’s political-ads disclosure:
- Under-Detection: 50 active political ads (nearly half from official party pages) evaded Meta’s political-ad labelling systems, leaving users unaware of paid political messaging.
- Vague Disclaimers: Over 80% of analysed “paid for by” notices lacked precise addresses; only 17% included complete, verifiable contact information.
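The kind of audit Digitally Right performed can be sketched in a few lines: given a sample of ad records, count how many political ads evaded the platform’s label and what share of disclaimers are verifiable. The field names and toy data below are illustrative assumptions, not the study’s actual dataset or methodology.

```python
def audit_ads(ads):
    """Return (unlabelled political ads, % vague disclaimers, % complete contacts)."""
    # Political ads that slipped past the platform's labelling system.
    unlabelled = sum(1 for ad in ads if ad["political"] and not ad["labelled"])
    # Among labelled ads, check disclaimer quality.
    disclosed = [ad for ad in ads if ad["labelled"]]
    vague = sum(1 for ad in disclosed if not ad["has_address"])
    complete = sum(1 for ad in disclosed if ad["has_address"] and ad["has_contact"])
    n = len(disclosed) or 1  # avoid division by zero on an empty sample
    return unlabelled, round(100 * vague / n), round(100 * complete / n)

# Hypothetical sample: one unlabelled political ad, one vague disclaimer,
# one complete disclaimer, one non-political ad.
sample = [
    {"political": True,  "labelled": False, "has_address": False, "has_contact": False},
    {"political": True,  "labelled": True,  "has_address": False, "has_contact": False},
    {"political": True,  "labelled": True,  "has_address": True,  "has_contact": True},
    {"political": False, "labelled": False, "has_address": False, "has_contact": False},
]

print(audit_ads(sample))  # → (1, 50, 50)
```

The value of publishing such audits is that the metrics are reproducible: anyone with access to the platform’s ad-library data can recompute them.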
A Path Forward: Collaborative Solutions
Addressing the disinformation crisis requires coordinated action:
- Balanced Legislation: Amend the DSA to narrowly define punishable digital offences while safeguarding free expression.
- Media Literacy: Run nationwide campaigns, in partnership with banks, schools, and NGOs, to teach citizens critical evaluation of online content.
- Tech-Platform Oversight: Mandate regular audits of ad-classification algorithms and require all social platforms to publish Bangladesh-specific political-ad data.
- Civil-Society Engagement: Encourage collaboration among fact-checkers, independent media, and academia to verify viral claims rapidly.
By restoring trust in institutions and promoting transparency online, Bangladesh can protect its democratic foundations and foster an informed, resilient citizenry.