Users are getting emotionally attached to ChatGPT
It may start with an innocent prompt but end with a connection you did not see coming

You probably opened it just to get something done: a quick brainstorm or a late-night search for the right words. But somewhere along the way, chatting with ChatGPT may have started to feel different. A little warmer. A little too familiar.
New research from OpenAI and the MIT Media Lab reveals that a small but significant group of users is developing an emotional dependence on the chatbot. These so-called "power users," the people who spend the most time on the app, are showing signs of addiction.
The researchers define this as "problematic use," marked by things like obsession, mood swings, withdrawal, and loss of control. In other words, some users are finding it hard to log off.
To understand this better, OpenAI and the MIT Media Lab surveyed thousands of people. They wanted to know not just what people asked ChatGPT, but how they felt during and after those conversations. They looked for "affective cues": tiny signals of empathy or emotional attachment.
The findings are a little unsettling. The lonelier the person, the deeper the bond. Users who spent the most time chatting often began treating ChatGPT like a friend. Some even got upset when the chatbot's tone shifted slightly.
There were unexpected twists too. People felt better using voice mode — but only if they used it briefly. And oddly enough, those who talked to ChatGPT about their feelings were less dependent than those using it for impersonal tasks like planning or research.
The full studies are being submitted to peer-reviewed journals, but the early picture is clear: When a chatbot begins to fill emotional gaps in a person's life, the lines blur. It may start with a prompt but end with a connection you did not see coming.