Some ChatGPT users have noticed a strange phenomenon recently: occasionally, the chatbot refers to them by name as it reasons through problems.
That wasn’t the default behavior previously, and several users claim ChatGPT is mentioning their names despite never having been told what to call them.
Reactions skew negative. One user, software developer and AI enthusiast Simon Willison, called the feature “creepy and unnecessary.”
Another developer, Nick Dobos, said he “hated it.” A cursory search of X turns up scores of users confused by and wary of ChatGPT’s first-name-basis behavior.
“It’s like a teacher keeps calling my name, LOL,” wrote one user. “Yeah, I don’t like it.”
Does anyone LIKE the thing where o3 uses your name in its chain of thought, as opposed to finding it creepy and unnecessary? pic.twitter.com/lYRby6BK6J
It feels weird to see your own name in the model thoughts. Is there any reason to add that? Will it make it better or just make more errors as I did in my github repos? @OpenAI o4-mini-high, is it really using that in the custom prompt? pic.twitter.com/j1Vv7arBx4
An article published by the Valens Clinic, a psychiatry office in Dubai, may shed some light on the visceral reactions to ChatGPT’s name use. Names convey intimacy. But when a person — or chatbot, as the case may be — uses a name a lot, it comes across as inauthentic.
“Using an individual’s name when addressing them directly is a powerful relationship-developing strategy,” writes Valens. “It denotes acceptance and admiration. However, undesirable or extravagant use can be looked at as fake and invasive.”
In a similar vein, perhaps another reason many people don’t want ChatGPT using their name is that it feels ham-fisted — a clumsy attempt at anthropomorphizing an emotionless bot. In the same way that most folks wouldn’t want their toaster calling them by their name, they don’t want ChatGPT to “pretend” it understands a name’s significance.
This reporter certainly found it disquieting when, earlier this week, o3 in ChatGPT said it was doing research for “Kyle.” (As of Friday, the change appeared to have been reverted; o3 referred to me as “the user.”) The touch had the opposite of its intended effect, poking holes in the illusion that the underlying models are anything more than programmable, synthetic things.