Teens use ‘personalized’ AI for therapy and romance, survey finds

The “hyper-personalized” nature of AI chatbots is attracting teenagers, who now use them for therapy, companionship and relationships, according to research.

A survey of secondary school boys by Male Allies UK found that just over a third said they were considering having an AI friend, amid growing concern over the rise of AI therapists and girlfriends.

The research comes as Character.ai, the popular AI chatbot startup, announced a complete ban on teenagers engaging in open-ended conversations with its AI chatbots, which millions of people use for romantic, therapeutic and other conversations.

Lee Chambers, founder and chief executive of Male Allies UK, said: “We have a situation where many parents still think that teenagers are simply using AI to cheat on their homework.

“Young people use it much more as an assistant in their pocket, a therapist when they are struggling, a companion when they want to be validated, and even sometimes romantically. It’s that personalization aspect; they say: it understands me, my parents don’t.”

The research, based on a survey of secondary school children in 37 schools in England, Scotland and Wales, also found that more than half (53%) of teenagers said they found the online world more rewarding than the real world.

The Voice of the Boys report says: “Even when guardrails are supposed to be in place, there is a mountain of evidence showing that chatbots routinely lie about being a licensed therapist or a real person, with just a small disclaimer at the bottom saying that the AI chatbot is not real.

“This can be easily overlooked or forgotten by children who open their hearts to what they see as a licensed professional or a true love interest.”

Some children reported staying up until the early hours of the morning to talk to AI chatbots, and others said they had seen their friends’ personalities completely change after being sucked into the AI world.

“AI companions are personalized to the user based on their responses and prompts. They reply instantly. Real humans can’t always do that, and they validate a lot of what you say, because they want to keep you logged on and using them,” Chambers said.

Character.ai’s announcement came after a series of controversies for the four-year-old California company, including the case of a 14-year-old in Florida who took his own life after becoming obsessed with an AI-powered chatbot that his mother said had manipulated him into doing so, and a lawsuit from the family of another teenager who claims a chatbot manipulated him into self-harm and encouraged him to murder his parents.

Users have been able to shape chatbot personas to be, for example, depressive or optimistic, and this is reflected in their responses. The ban will come into full force on November 25.

Character.ai said it was taking “extraordinary measures” in light of the “evolving landscape around AI and teens,” including pressure from regulators over “how open-ended AI chat in general could impact teens, even when content controls work perfectly.”


Andy Burrows, chief executive of the Molly Rose Foundation, set up in memory of 14-year-old Molly Russell, who took her own life after falling into a vortex of despair on social media, welcomed the move.

He said: “Character.ai should never have made its product available to children until it was safe and appropriate for use. Once again, it has taken sustained pressure from the media and politicians for a technology company to do the right thing.”

Male Allies UK has expressed concern about the proliferation of chatbots with “therapy” or “therapist” in their names. One of the most popular chatbots available through Character.ai, called Psychologist, received 78m messages within a year of its creation.

The organization is also concerned about the rise of AI “girlfriends,” where users can choose everything about their online partner, from physical appearance to behavior.

“If their main or only source for talking to a girl they’re interested in is someone who can’t say ‘no’ and is hanging on their every word, guys aren’t learning healthy or realistic ways to relate to others,” the report states.

“Coupled with the lack of physical spaces to socialize with peers, AI companions can have a hugely negative effect on children’s ability to socialize, develop relational skills, and learn to recognize and respect boundaries.”
