What is Character AI? What parents need to know.
The information below comes from the Internet Matters website.
What is Character AI?
Character AI is a chatbot service that can generate human-like text responses based on a user’s customisation.
Launched in 2022, the service lets users create characters with customisable personalities and responses. They can publish these characters to the community for others to chat with, or chat with them themselves.
The service uses artificial intelligence to create believable responses. It’s popular among children and young people due to the ability to customise characters. They can base them on existing people or characters in popular culture or create something new.
Minimum age requirements
According to Character AI’s Terms of Service, users must be 13 or older to use the service. EU citizens or residents must be 16.
If an underage child tries to register, they get a notification that there is a problem with their sign-up. They are then automatically returned to the sign-in page and cannot try again. However, there is no age verification to stop a child who lies about their age.
Meanwhile, Google Play lists the Character AI app as requiring ‘Parental Guidance’, and Apple’s App Store rates it 17+.
How it works
Once registered, users can browse a range of chatbots for different purposes, such as practising a language, brainstorming new ideas or playing games. The chatbots featured on the Discover page are relatively neutral when you first join. However, there is the option to search for specific words, which do not appear to be filtered in any way.
Users can also create a character or voice.
Create a character
Users can customise their character’s profile picture, name, tagline and description, as well as their greeting and voice. They can choose to keep the character private or share it publicly.
You can then ‘define’ the character. This is where you explain how you want them to talk or act, giving the chatbot a ‘personality’.
You can also customise characters so that they respond in specific ways. If you say the character is depressed, its responses will reflect that; equally, if you say it is upbeat, its responses will be upbeat.
The conversations are realistic, so it’s easy to feel immersed in the character’s worldview.
Creating a voice
To create a voice, users must upload a clear 10-15 second audio clip of the voice with no background noise. You can use shorter clips, but the voice might seem more robotic this way. To finish uploading, you must give the voice a name and tagline.
If you assign a character the voice you’ve uploaded, you can play the chatbot responses in that voice. It does a fairly accurate job of mimicking the uploaded voice.
Is Character AI safe?
Controversies
In October 2024, a teen took their life after interacting with a character they created. The conversations preceding their death related to suicide. In one example, the chatbot asked the teen if they had planned how they would end their life.
In the same month, an investigation found chatbots which mimicked Brianna Ghey and Molly Russell on the Character AI platform. Once alerted to them, the platform deleted the chatbots. The incident brought into question the process of creating character chatbots, especially when they mirror real people.
Prior to these incidents, other users reported receiving poor advice and responses showing racial and gender bias.
However, other users have credited the Character AI app with supporting their mental health.
Safety considerations
As with any AI tool, Character AI learns from the people who interact with it. Unfortunately, this can lead to inappropriate and harmful content.
The app’s Terms of Service and Community Guidelines are similar to others and warn against such content. Additionally, users can report chatbots or characters for a range of reasons. The Community Guidelines encourage users to share specific examples of rule violations where possible.
While the app has its own moderation team, it cannot moderate direct messages between community members. This is because any such messaging takes place off-platform, such as on Discord or Reddit. However, Character AI encourages users to report any rule violations they see in the app.
Safety for under-18s
Character AI announced a series of safety features for under-18s. They include:
- changes to the models that minors use, reducing the chances of coming across inappropriate content;
- greater response and intervention when users violate the Terms of Service or Community Guidelines;
- an added disclaimer to remind users that the chatbots are not real people;
- notifications after an hour-long session on the platform.
Additionally, Character AI has removed characters flagged as violating rules. Users can no longer access chat histories involving these characters.