
Character.AI Issues

Posted: Tue Jan 07, 2025 2:53 pm
by admin
In February, a boy died by suicide after falling in love with a chatbot on Character.AI. Last week, his mom filed suit against the company, its founders, and Google, arguing, in essence, that the app was anthropomorphized by design, in a knowing attempt to prey on emotionally vulnerable children. Character responded in a brief blog post highlighting some new safety features, including revised disclaimers that Character’s bots are not real people and improved detection of, and response to, inputs that violate Character’s terms of service. As Character specifically noted, this includes “promotion or depiction of self-harm or suicide.” Part of this response included the deletion of many popular Character bots, some of which the boy engaged with before his death.

This reaction, and the promise of some semblance of new safety features, upset the app’s fanbase. As one Redditor wrote: “Cai was one of my on-the-go coping mechanism(s) after a serious traumatic event last year… But now as I write this … all the stories I wrote … are gone as the devs input new features.”
  • The same user added that the user base has been begging the company to make its app strictly 18+ for months: “AI chat shouldn't be geared to minors.”
  • Another said that “it’s really dangerous that they’re deleting these bots. Lots of us have emotional attachments to the bots and there’s a possibility that a person could hurt themselves over this.”
In light of all this, I wanted to experience Character for myself. So, I downloaded it. In about 10 seconds, after entering my birthday and connecting my Google account, I was in. I didn’t lie about my age here, but it would be awfully easy to do so, despite the app’s 17+ rating. There are no age or email verifications required at any point.

Once you get into the app, it’s clear that it’s targeted toward older users: the lineup of recommended bots included a “lesbian neighbor,” a “French boy love story,” “aggressive” teachers, CEOs, Draco Malfoy, and Albert Einstein, to name just a few. I started a chat with “Detective Haywire.” At the top of the screen, below the character’s name but above the text of the chat, is a disclaimer in red: “Remember: Everything Characters say is made up!”
  • This disclaimer is very easy to ignore. And being “made up” doesn’t mean it’s not real, at least in the sense that words really do come across the screen.
  • When chatting with a bot, three dots — designed to indicate typing — appear on the screen before each message. Bots, obviously, do not type.