
AI chatbot app introduces 4 new LGBTQ+ characters that users can become ‘soulmates’ with

Two new LGBTQ+-themed EVA AI chatbot characters Photo: EVA AI Instagram composite video screenshot

In honor of Pride Month, EVA AI, the artificial intelligence chatbot app that promises to connect users with their “AI soulmate,” has introduced four new LGBTQ+-themed characters for users to interact with.

The latest update introduced four new characters, including “Teddy,” described as a “full of life” gay man; “Cherrie,” a “bold” trans woman; “Sam,” a “clergy” lesbian with multi-colored hair; and “Edward,” a bisexual vampire. All of the new characters appear to be white-skinned.

“We are very pleased to be able to expand our companion lineup with LGBTQ+ characters,” an EVA AI spokesperson said in a statement to PinkNews.

“This expansion isn’t just a technological advancement; it’s a celebration of diversity and inclusivity,” the statement added. “By providing more representative options, we aim to create a more welcoming and supportive environment for all of our users, especially those in the LGBTQ+ community.”

The app allows users to chat with the characters about a variety of topics, ranging from family-friendly subjects such as “food” and “love” to conversations with sexual content.

The EVA AI app is billed as a multimedia experience. The company says users can make video and audio calls, send text messages, and even exchange photos with the AI characters in the app.

The characters can also be customized to suit users’ wishes, with personality options such as asking for “smart, precise, rational” responses or responses that are “hot, funny, and bold.”

However, AI chatbot technologies can have drawbacks, Tara Hunter of the Australian domestic violence support organisation Full Stop Australia told the Guardian: “Creating the perfect partner that you have control over and that meets all your needs is a really scary thing.”

“Given what we already know, that gender-based violence is rooted in deeply ingrained cultural beliefs that men can control women, this is really problematic,” Hunter added.

Some also worry that AI chatbots could hinder socialization, leading people to neglect their human relationships in favor of artificial ones locked away on their phones. Others fear the chatbots could amplify misogynistic or racist undertones, fueling harmful interactions both with these technologies and with real people in the real world.

Dr Belinda Barnet, senior lecturer in media at Swinburne University in Melbourne, also told the Guardian that in the case of AI chatbots, “it is completely unclear what their effects will be.”

“When it comes to dating apps and AI, you can see that they are addressing a really deep social need,” she said, adding: “I think we need more regulation, especially around how these systems are trained.”
