It’s cheap, quick and available 24/7, but is a chatbot therapist really the right tool to tackle complex emotional needs?

Last autumn, Christa, a 32-year-old from Florida with a warm voice and a slight southern twang, was floundering. She had lost her job at a furniture company and moved back home with her mother. Her nine-year relationship had always been turbulent; lately, the fights had been escalating and she was thinking of leaving. She didn’t feel she could be fully honest with the therapist she saw once a week, but she didn’t like lying, either. Nor did she want to burden her friends: she struggles with social anxiety and is cautious about oversharing.

So one night in October she logged on to character.ai – a neural language model that can impersonate anyone from Socrates to Beyoncé to Harry Potter – and, with a few clicks, built herself a personal “psychologist” character. From a list of possible attributes, she made her bot “caring”, “supportive” and “intelligent”. “Just what you would want the ideal person to be,” Christa tells me. She named her Christa 2077: she imagined it as a future, happier version of herself.

