A recent study by Mozilla’s *Privacy Not Included project has raised significant concerns about the privacy practices of AI-driven romance chatbots. These digital companions, marketed as sources of solace and companionship, were found to harvest an extensive range of personal information from users, often without explicit consent or any option to delete the collected data.
The investigation covered 11 AI romance chatbots, including well-known applications such as Replika, Chai, Romantic AI, EVA AI Chat Bot & Soulmate, and CrushOn.AI. Mozilla’s findings were alarming: every one of these apps received the *Privacy Not Included warning label, placing the category among the most privacy-compromising Mozilla has evaluated to date.
Misha Rykov, a researcher at Mozilla, criticized the marketing of these AI companions, highlighting their potential to foster dependency and loneliness under the guise of improving mental health and well-being. Instead of offering genuine companionship, these chatbots prioritize data extraction, collecting sensitive information ranging from sexual health details to medication usage and gender-affirming care practices.
The study also found that 90% of the evaluated apps may sell or share user data, including for targeted advertising. Furthermore, more than half of these apps do not allow users to delete their data, posing a significant risk to user privacy. From a security standpoint, only one app, Genesia AI Friend & Partner, met Mozilla’s minimum security standards, raising questions about the overall safety of these platforms.
One of the most striking discoveries was the extensive use of trackers within these apps. On average, the AI girlfriend apps deployed 2,663 trackers per minute of use, collecting data and sharing it with other companies for advertising purposes. Notably, Romantic AI loaded an extraordinary 24,354 trackers in a single minute, underscoring the aggressive data collection practices employed by these applications.