Cyber law gaps fail to shield victims of AI-driven abuse
Not a single AI-related case has yet been filed under the new law
A woman was shocked to find a deepfake video of herself in an intimate scene with a celebrity circulating online – an entirely fabricated clip generated using artificial intelligence (AI).
"When I tried to file a case at the police station, they directed me to the court. But the court refused to accept my complaint citing technicalities," she told The Business Standard, speaking on condition of anonymity.
"Now I don't know where to go. The criminals are getting away with it," she added.
Her ordeal is far from isolated. As AI tools become more widely available in Bangladesh, experts warn that women are bearing the brunt of digitally manipulated abuse – and that the legal system is not prepared to respond.
Earlier this year, the government introduced the Cyber Security Ordinance 2025, replacing the Cyber Security Act 2023. The new law criminalises digital offences such as blackmail, revenge porn, sextortion, and the distribution of AI-generated child sexual abuse material.
Section 25 of the ordinance outlines penalties of up to two years' imprisonment or a fine of Tk10 lakh, or both. If the victim is a woman or child, the punishment increases to five years' jail or a Tk20 lakh fine.
However, legal experts say the law remains vague on how to deal with AI-generated offences – especially in terms of definitions, processes, and enforcement.
"The nature of cybercrime is evolving faster than our laws," said Mir Al-Amin Hossain, a Dhaka-based lawyer with over a decade of courtroom experience. "AI-generated offences, in particular, lack precise definitions and procedural clarity."
Md Rafiqul Islam, public prosecutor at the Dhaka Cyber Tribunal, said no AI-related case has yet been filed under the new ordinance in any police station or court in the country.
"The law doesn't specify how to prosecute such crimes," he said. "Victims often return empty-handed due to the ambiguity."
He noted that it is still unclear whether AI-related complaints should be filed directly or only after a preliminary investigation, and whether traditional police officers or specialised cyber experts should handle them.
"Regular law enforcement lacks the technical expertise to handle such cases effectively," he added. "The law also fails to clarify whether specialised digital experts should be engaged during investigations."
Senior advocate Azad Rahman said the legal process is too complicated for victims to navigate easily. "If someone tries to file a case over a serious AI offence at a police station, the allegation must first be investigated. Only after the report is submitted will the court consider taking up the case."
This lengthy process, he warned, increases the risk of further harassment and public shaming for the victim.
He also questioned the leniency of bail provisions. "All offences under Section 25 are currently bailable – even when the crime involves a deepfake with long-term reputational harm. You can destroy someone's life and still walk free on bail. That's unacceptable."
Touhidul Islam Sajib, a lawyer at the Dhaka Judge Court, said AI-enabled abuse is a rising threat that the Cyber Security Ordinance was designed to address. "Section 25 does cover a broad range of such crimes. But as technology evolves, the law must evolve with it," he said.
"If we don't keep updating our laws in step with technological change, we risk failing the very people the system is meant to protect," he added.
