Psychologists at the University of Kent are urging people to think wisely about their use of ChatGPT this Valentine’s Day, after new research revealed that we judge people most harshly when they use AI to write love letters, apologies and wedding vows.
Their findings suggest that people tend to judge those who outsource tasks to artificial intelligence more negatively than those who do the work themselves, especially where tasks are personal or emotionally meaningful.
Across six studies involving nearly 4,000 UK participants, the researchers examined how people perceive those who use AI tools such as ChatGPT to complete a wide range of tasks, from writing computer code and daily schedules to planning a city tour for a friend or writing a marriage proposal. While outsourcing to AI often led to more negative impressions overall, the effect was strongest for socio-relational tasks.
Using AI for personal messages led people to see the user as less caring, less authentic, less trustworthy and lazier, even when the writing itself was high quality and users were honest about their AI use.
The findings help explain why AI use in personal contexts can provoke strong reactions. In one widely shared Reddit post, a newlywed described feeling deeply hurt after discovering her partner had used ChatGPT to write his wedding vows, a reaction that, the researchers suggest, reflects broader social norms around effort and sincerity.
Dr Scott Claessens, a researcher on the five-year Trust in Moral Machines project based within the School of Psychology, said: ‘People don’t just judge what you produce, they judge how you produce it.’
Dr Jim Everett, a Reader in Psychology, explained further: ‘If you use AI for these kinds of social tasks that bind us together, you risk being judged not only because you didn’t put effort in, but because it makes people think you care less about the task and what it represents.’
By contrast, using AI for more practical or technical tasks – such as writing a dinner recipe or organising schedules for Valentine’s Day – attracted far less criticism.
Concerns about machines writing love letters are not new. In George Bernard Shaw’s 1898 play Candida, a character jokes about a device that could effortlessly produce romantic letters, only to question whether such letters would mean anything at all. More than a century later, the research suggests people remain uneasy with the same idea.
As AI becomes embedded in everyday life, the study highlights a trade-off between efficiency and social meaning: while AI can save time, using it for personal communication may come at a reputational cost.
Dr Everett reminds us: ‘In a world of algorithm-mediated interactions, AI is no substitute for investing effort into our interpersonal relationships.’
Dr Jim Everett leads the five-year Trust in Moral Machines project in the University of Kent’s School of Psychology, with support from Dr Edmond Awad from the University of Exeter and funding from the Engineering and Physical Sciences Research Council. The full paper, ‘Negative perceptions of outsourcing to artificial intelligence’, is published in Computers in Human Behavior, https://doi.org/10.1016/j.chb.2025.108894.