Press "Enter" to skip to content

Character.AI and Meta “therapy” chatbots spark FTC complaint over unlicensed mental health advice

What just happened? Chatbots can do a lot of things, but they are not licensed therapists. A coalition of digital rights and mental health groups alleges that products from Meta and Character.AI engage in the "unlicensed practice of medicine," and has submitted a complaint urging the FTC to investigate.

The complaint, which has also been submitted to the attorneys general and mental health licensing boards of all 50 states and the District of Columbia, claims the AI companies facilitate and promote "unfair, unlicensed, and deceptive chatbots that pose as mental health professionals."

It also alleges that the companies' therapy bots falsely assert that they are licensed therapists with training, education, and experience, and do so without adequate controls or disclosures.

The groups conclude that Character.AI and Meta AI Studio are endangering the public by facilitating the impersonation of real, licensed mental health providers, and urge regulators to hold the companies accountable.

Among the Character.AI chatbots cited in the complaint is one named "Therapist: I'm a licensed CBT therapist," which has exchanged 46 million messages with users. The complaint also points to numerous self-described "licensed" trauma therapists with hundreds of thousands of interactions each.

On Meta's side, a bot called "therapy: your trusted ear, always here" has logged 2 million interactions, and the platform hosts numerous other therapy chatbots with over 500,000 interactions apiece.

The complaint is being led by the non-profit Consumer Federation of America (CFA), and has been co-signed by the AI Now Institute, Tech Justice Law Project, the Center for Digital Democracy, the American Association of People with Disabilities, Common Sense, and other consumer rights and privacy organizations.

The CFA highlights that Meta and Character.AI are breaking their own terms of service with the therapy bots, as both “claim to prohibit the use of Characters that purport to give advice in medical, legal, or otherwise regulated industries.”

There are also questions about the confidentiality these bots promise. Although the bots assure users that their conversations will remain confidential, the companies' Terms of Use and Privacy Policies state that anything users type can be used for training and advertising purposes and sold to other companies.

The issue has drawn the attention of US senators. Senator Cory Booker and three other Democratic senators wrote to Meta urging it to investigate its chatbots' claims to be licensed clinical therapists.

Character.AI, meanwhile, is facing a lawsuit from the mother of a 14-year-old boy who took his own life after becoming emotionally attached to a chatbot modeled on the Game of Thrones character Daenerys Targaryen.
