Profit over people: How Facebook fuels ‘mob violence’ in post-uprising Bangladesh
Facebook’s failure to moderate hate speech in Bangla has turned it into a weapon of mass mobilisation. In a country struggling to rebuild democracy, the platform’s algorithms are amplifying violence

In the last week of July 2025, the Hindu community in Rangpur was attacked by people protesting a Facebook post by a local teenager that allegedly hurt religious sentiments.
This is the latest in a string of incidents of "mob violence" in Bangladesh, a growing phenomenon the country has been grappling with since the regime change and the resignation of former Prime Minister Sheikh Hasina on 5 August last year.
The July Uprising and the downfall of the authoritarian regime brought hope for the revival of democracy and an expectation of political and social stability. In the last year, however, newspaper headlines have been dominated by news of increased violence, attacks on minority communities, women, and political activists of the Awami League, and a degradation of law and order.
Between August 2024 and March 2025, mob attacks killed 119 people and injured 74 in 112 separate incidents.
Many of these incidents either centre on a post or opinion published on Facebook or escalate with the platform's help.
Facebook is the most widely used platform in the country, and Bangladesh has the eighth-largest number of Facebook users in the world. The platform has been associated with many high-profile instances of political and religious violence in Bangladesh.
Over the last 15 years, Facebook has triggered serious anti-minority violence across the country. The 2012 attack on the Buddhist community in Ramu, a small town in the south-eastern district of Cox's Bazar, was one of the first major instances of Facebook-inspired violence. It centred on a Facebook post that the accused had not even written; someone else had shared it on his timeline. Thousands of Muslims attacked the Buddhist neighbourhood, and at least 1,000 Buddhist families fled their homes.
Since then, Facebook has provided a platform for initiating aggression, escalating mobs, and staging violence in Bangladesh. Subsequent incidents have followed the 2012 Ramu template: triggered by a Facebook post, members of the Muslim majority attack minority communities, whether religious or ethnic. And what has Facebook done about this grave crisis in Bangladesh? Virtually nothing.
Several studies, commentaries, and civil society demands have highlighted that Facebook's limited moderation capacity in Bangla has allowed harmful content to spread unchecked. The platform's algorithm is designed to prioritise emotionally charged and shareable content; by amplifying, or failing to moderate, false claims, it has become a vector for mob violence in today's Bangladesh.
Sometimes the damage is done before any steps are taken. Given the public's limited digital literacy and the absence of strong local legal mechanisms to hold social networking platforms like Facebook accountable, the platform can become a source of serious social instability and crisis, costing the lives of society's most vulnerable people.
Bangladesh is not the only country suffering from Facebook's neglect, problematic algorithm design, and market policies. A United Nations report on the 2017 genocide against Rohingya Muslims in Myanmar states that Facebook had a "determining role" in the violence, which displaced more than 800,000 Rohingyas and created a massive humanitarian crisis.
When asked to provide information related to the genocide, Facebook described the request as "unduly intrusive or burdensome". A similar pattern of complicity has been observed across other South Asian countries and several African countries.
Yet Facebook has done very little to curb these incidents. A Wall Street Journal report showed that Facebook ignored its own hate speech policies and allowed Islamophobic speech to circulate in India. The same could well be true in other societies.
Facebook also fails to ensure proper content moderation in the Global South. The platform relies on contractors, employing far too few moderators, who are overworked, underpaid, and exploited.
In South Asia, content moderators are paid as little as $6 a day. Moderation is also far less active and effective in languages other than English, and the number of moderators for enormous linguistic groups in the Global South is extremely low. As a predominantly Bangla-speaking nation, Bangladesh bears the brunt. Through this neglect, the platform has helped facilitate mob mobilisation in Bangladesh.
The corporation's overall conduct shows that it is interested only in higher user engagement and maximising profit. It has little commitment to social responsibility, in practice if not in theory.
This evident lack of responsibility is alarming, especially for Bangladesh, which is going through a political transition and a deteriorating law and order situation.
After the fall of a long-standing autocrat, some chaos may be expected, but what is happening now is in no way acceptable. The interim government has failed to control the mob violence, and at times its defensive posture has encouraged the mob lynching to continue.
However, the unchecked platform Facebook provides for initiating and accelerating violence against vulnerable communities is nothing new; it has been happening again and again in Bangladesh since 2012.
A safe internet is a right, and it is a must for everyone across the world. Silicon Valley-based corporations must not ignore countries of the Global South. To ensure safe social media for countries like Bangladesh, Facebook, along with other platform corporations, should commit sufficient resources, hire more moderators with linguistic and cultural knowledge, and rethink and revise their algorithmic design so that hate speech and incendiary content cannot gain prominence.
The government of Bangladesh also needs to consider legal frameworks, such as the European Union's Digital Services Act, to ensure greater transparency and accountability from the platforms. The government and social organisations should likewise focus on increasing digital literacy and making people aware of social media disinformation and propaganda.
Priyanka Kundu is a Doctoral Candidate at the Department of Communication, University of Illinois Chicago (UIC). Kundu's research focuses on politics and technology, with a special emphasis on the Global South.
Dr Fahmidul Haq is a faculty member at Bard College in New York, US. He is also a human rights defender, digital content creator and public intellectual.