AI girlfriends harvest ‘creepy’ personal data, study finds

Romantic AI chatbots violate users’ privacy ‘in disturbing new ways’, researchers claim

Anthony Cuthbertson
Wednesday 14 February 2024 12:54 GMT
AI girlfriends, like Replika’s, have become increasingly popular with the rise of human-sounding generative artificial intelligence chatbots (Replika)


Popular AI girlfriends and boyfriends harvest “creepy” information and fail to meet basic privacy standards, according to new research.

Of the 11 AI chatbots reviewed by researchers at the Mozilla Foundation – including Replika and Eva AI – none met the organisation’s safety requirements. This put them “on par with the worst categories of products” that the organisation had ever reviewed for privacy.

AI chatbots offering users a romantic relationship have seen huge growth over the last year, with more than 3 billion search results for ‘AI girlfriend’ on Google. Their popularity follows the release of advanced generative artificial intelligence models like ChatGPT that are capable of generating human-like responses.

Mozilla noted a couple of “red flags” with the popular chatbots, such as a failure to encrypt personal information to minimum security standards.

“To be perfectly blunt, AI girlfriends are not your friends,” said Misha Rykov, a researcher at Mozilla’s Privacy Not Included project.

“Although they are marketed as something that will enhance your mental health and well-being, they specialise in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you.”

The research was detailed in a blog post on Wednesday, published to coincide with Valentine’s Day, which warned that romantic AI chatbots violate users’ privacy “in disturbing new ways”.

The report on Eva AI Chat Bot & Soulmate, which costs around $17 per month, noted that although the app had a good privacy policy, it was still “pushy” in soliciting personal information.

“Eva AI chatbot feels pretty creepy with how it really pushes users to share tonnes of personal information, even if their privacy policy seems to be one of the better ones we reviewed,” a blog post on the Mozilla Foundation’s website stated.

“And just because their privacy policy says they aren’t sharing or selling that information far and wide now, doesn’t mean that privacy policy couldn’t change in the future.”

The Independent has reached out to Eva AI and Replika for comment.

The researchers advised users of AI chatbots not to share any sensitive information with them, and to request that their data be deleted once they stop using the app.

People are also advised not to give AI chatbot apps consent to constant geolocation tracking, or access to a device’s photos, videos or camera.
