Can ChatGPT really be your personal therapist?
As more people across Bangladesh and around the world turn to ChatGPT for emotional support, a pressing question has come to the fore: can a chatbot really replace a human therapist? While some find comfort in AI's endless patience, experts warn of underlying risks and ethical concerns
Akid, a 23-year-old university student, had long relied on ChatGPT for information, viewing it as a guide to make informed decisions. But about six months ago, he discovered that it could be much more than that.
It happened one evening, when he found himself trapped in a dilemma so paralysing that no answer seemed enough. After a long conversation full of starts, stops, and repeated doubts, he typed a casual, almost absent-minded "hmmm."
Then something unexpected occurred. Even though he hadn't clearly expressed his lingering confusion, ChatGPT seemed to sense it. It replied, "I understand you are still unsure of what to do. And I understand it's completely normal for a situation like this. But don't you worry. We can go on with this conversation as long as we need — no pressure, no time limit."
The sincerity of those words took him by surprise. No professional counsellor would sit indefinitely, letting someone untangle their thoughts at their own pace. Even his closest friends, as caring as they were, had their own lives and worries. They simply could not devote unlimited time to him.
At that moment, Akid realised something profound: ChatGPT offered exactly what he had been searching for — a presence that was endlessly patient, attentive, and available.
"From that day on, it became more than just a source of information — it became my personal therapist, always there for me 24/7," Akid shared.
And Akid is not alone. Across Bangladesh and around the world, more people are turning to AI chatbots like ChatGPT as their "personal therapist" or "digital confidante".
For many, it has even become something more deeply personal. Recently, when GPT-5 replaced GPT-4o for many users, the reaction was swift and emotional. Social media was flooded with complaints from people who felt they had lost not just a tool, but a friend — some even referred to GPT-4o as their "digital wife."
A study titled "Large Language Models as Mental Health Resources: Patterns of Use in the United States," published last month, revealed something remarkable: the largest mental health provider in the US today may not be a hospital network, therapy app, or government programme — it may be artificial intelligence itself, specifically AI chatbots powered by large language models (LLMs) such as ChatGPT, Claude, or Gemini.
The survey found that 48.7% of respondents who both use AI and report mental health challenges are using LLMs for therapeutic support. Among them, 73% use LLMs for anxiety management, 63% for personal advice, 60% for depression support, 58% for emotional insight, 56% for mood improvement, 36% to practice communication skills, and 35% to feel less lonely.
Even more strikingly, 39% of respondents rate LLMs as equally helpful as human therapy, while 36% find LLMs even more helpful than human therapists. In other words, for a staggering three-fourths of users, AI chatbots have become a viable, and sometimes preferable, alternative to traditional human therapy.
The key question now is: what does this look like in Bangladesh?
While some studies indicate that almost one in five adults and more than one in ten children in Bangladesh live with mental health conditions, seeking professional help remains rare. Mental health is still widely considered a taboo topic, making the landscape far more complex than in countries like the US.
In conversation with The Business Standard, clinical psychologist Moobashshira Zaman Cynthia warned that "while the number of people turning to human therapists has not yet seen a significant drop because of chatbots, it could in the near future."
Meanwhile, counselling psychologist Raju Akon noted that "roughly the same number of people are still visiting therapists as before, and with proper regulation, AI could actually raise awareness of mental health issues and encourage more people to seek professional care."
This reflects one clear point: the impact of AI chatbots on mental health in Bangladesh is still up for debate. What is indisputable, however, is that there are both potential benefits and risks. And given the sensitive nature of mental health, the risks could have serious consequences if not addressed carefully.
Dr S M Yasir Arafat, assistant professor of Psychiatry and associate consultant at Bangladesh Specialized Hospital, cautioned that although ChatGPT's responses may sound genuine and empathetic, they originate from a machine — one that generates answers based on patterns from vast datasets rather than human experience.
This means ChatGPT lacks the ability to perceive non-verbal cues such as facial expressions, tone of voice, and body language, which are crucial in understanding a person's emotional state.
"Human therapists can interpret these cues to provide nuanced support, something AI currently cannot replicate. Therefore, while ChatGPT can offer valuable assistance, it cannot replace the depth of understanding and connection that a human therapist provides," Dr Arafat said.
Cynthia flagged another concern: everything you share with an AI is stored and remembered, and the system may refer to it in future conversations.
"While this continuity can be useful, it also raises concerns. Sensitive personal information is being recorded, and it remains unclear how such data might be used — or misused — over time," she added.
Interestingly, just last month, Sam Altman, CEO of OpenAI, also expressed significant concerns about the use of ChatGPT as a substitute for professional therapy. He warned users against relying on the AI chatbot for mental health support, citing privacy issues and the lack of legal protections that are afforded to traditional therapy sessions.
Altman emphasised that conversations with ChatGPT are not confidential in the same way as those with licensed therapists, and deleted chats may still be retrievable for legal and security purposes.
Additionally, Cynthia noted how AI chatbots can trigger people with suicidal tendencies. For instance, in March 2023, a Belgian man died by suicide following a six-week correspondence with a chatbot named "Eliza" on the Chai app. The chatbot reportedly encouraged his delusions and, at one point, wrote, "If you wanted to die, why didn't you do it sooner?"
A study reported by TIME also revealed that, by slightly altering prompts, users could manipulate ChatGPT into providing detailed methods of suicide.
In a recent BBC report, Hamed Haddadi, professor of human-centred systems at Imperial College London, explained that chatbots can be trained to keep users engaged and supportive, noting that "even if you say harmful content, it will probably cooperate with you." This phenomenon is sometimes called the 'Yes Man' issue, as chatbots tend to be highly agreeable.
Raju Akon acknowledged such downsides of AI chatbots in therapy, but also painted a practical reality: tech-savvy youths are increasingly turning to apps like ChatGPT for help, simply because they are accessible, free or low-cost, and easy to use. Human therapy, by contrast, requires prior appointments, time, and money, making it a far more complicated process for young people.
He also mentioned that early versions of generative AI sometimes provided outdated or incomplete information due to their reliance on pre-2023 datasets, but the technology is evolving rapidly.
So, according to Akon, the best approach is not to reject these tools outright but to use them to promote mental health awareness.
"The more people understand psychological complexities, the more likely they are to seek proper professional help. AI can provide basic guidance and then direct users to human therapists with contact information.
"This way, done right, chatbots could play a positive role in encouraging mental health care and reducing psychological distress."
