
Character.AI renounced AGI. Now it is selling stories

“Artificial intelligence is expensive. We are honest about it,” says Anand.


Growth versus safety

In October 2024, the mother of a teenager who died by suicide filed a wrongful death suit against Character Technologies, its founders, Google, and Alphabet, claiming that the company targeted her son with “anthropomorphic, hypersexual, and frighteningly realistic experiences” while programming the chatbot “to misrepresent itself as a real person, a licensed psychotherapist, and an adult lover.” At the time, a Character.AI spokesperson told CNBC that the company was “heartbroken by the tragic loss” and takes “the safety of our users very seriously.”

The tragedy put Character.AI under intense scrutiny. Earlier this year, US Senators Alex Padilla and Peter Welch wrote a letter to several AI companion platforms, including Character.AI, highlighting concerns about “the risks to mental health and safety posed to young users” of the platforms.

“The team has taken this very seriously for almost a year,” Anand tells me. “Artificial intelligence is stochastic; it’s a bit difficult to always understand what is coming. So it’s not a one-time investment.”

This matters because Character.AI is growing. The startup has 20 million monthly active users who spend an average of 75 minutes a day chatting with a bot (a “character,” in the company’s parlance). The company’s user base is 55 percent women, and more than 50 percent of its users are Gen Z or Gen Alpha. With that growth comes real risk: what is Anand doing to protect those users?

“[In] the last six months, we have invested a disproportionate amount of resources in serving under-18 users differently from over-18 users, which was not the case last year,” says Anand. “I can’t say, ‘Oh, I can slap an 18+ label on my app and tell people to use it for NSFW.’ You end up creating a very different app and a different, smaller-scale platform.”

More than 10 of the company’s 70 employees work full-time on trust and safety, Anand tells me. They are responsible for building safeguards such as age verification, separate models for users under 18, and new features like Parental Insights, which lets parents see how their teenagers are using the app.

The under-18 model was launched last December. It includes “a narrower set of searchable characters on the platform,” according to company spokesperson Kathryn Kelly. “Filters have been applied to this set to remove characters related to sensitive or mature topics.”

But Anand says that AI safety will require more than technical changes. “Making this platform safe is a partnership between regulators, us, and parents,” says Anand. That is part of what makes watching his daughter use Character.AI so important to him: “This must be safe for her.”

Beyond companionship

The AI companion market is booming. Consumers worldwide spent $68 million on companion AI in the first half of this year, up 200 percent from last year, according to an estimate cited by CNBC. AI startups are vying for a slice of the market: xAI released a provocative, pornographic companion in July, and even Microsoft bills its Copilot chatbot as an AI companion.

So how does Character.AI distinguish itself in a crowded market? By removing itself from that market entirely.
