
Character.AI. Now he sells stories

“AI is expensive. Let’s be honest about it,” says Anand.

Growth versus safety

In October 2024, the mother of a teenager who died by suicide filed a wrongful death suit against Character Technologies, its founders, Google, and Alphabet, alleging the company targeted her son with “anthropomorphic, hypersexualized, and frighteningly realistic experiences, while programming [the chatbot] to misrepresent itself as a real person, a licensed psychotherapist, and an adult lover.” At the time, a Character.AI spokesperson told CNBC that the company was “heartbroken by the tragic loss” and took “the safety of our users very seriously.”

The tragic incident put Character.AI under intense scrutiny. Earlier this year, U.S. senators Alex Padilla and Peter Welch wrote a letter to several AI companion platforms, including Character.AI, highlighting concerns about “the mental health and safety risks posed to young users” of the platforms.

“The team has been taking this very seriously for almost a year now,” Anand said. “AI is stochastic, so it’s hard to always understand what is happening. That’s why it isn’t a one-time investment.”

That matters enormously because Character.AI is growing. The startup has 20 million monthly active users who spend, on average, 75 minutes a day chatting with a bot (a Character.AI “character”). The company’s user base is 55% women, and more than 50% of its users are Gen Z or Gen Alpha. With that growth comes real risk: what is Anand doing to keep users safe?

“[In] the past six months, we have invested a disproportionate amount of resources to be able to serve under-18 users differently from over-18 users, which was not the case a year ago,” said Anand. “I can’t just say, ‘Oh, I can slap an 18+ label on my app and tell people to use it for NSFW.’ You end up building a very different app and a very different platform.”

More than 10 of the company’s 70 employees work full time on trust and safety, Anand said. They are responsible for building safeguards such as age verification, separate models for users under 18, and new parental insights features that let parents see how their teens use the app.

The under-18 model launched last December. It involves “a narrower set of characters available on the platform,” said Character.AI spokesperson Kathryn Kelly. “Filters have been applied to this set to remove characters related to sensitive or mature topics.”

But Anand says AI safety will take more than technical fixes. “Keeping this platform safe is a partnership between regulators, us, and parents,” he said. That is part of what makes his own daughter’s chats on Character.AI so important to him. “It has to remain safe for her.”

Beyond companionship

The AI companion market is booming. Consumers around the world spent $68 million on AI companion apps in the first half of this year, up 200% from last year, according to an estimate cited by CNBC. AI startups are piling into the market: xAI released a creepy, pornographic companion in July, and even Microsoft bills its Copilot chatbot as an AI companion.

So how does Character.AI stand out in a crowded market? It is pulling out of it entirely.
