"AI girlfriend bots are already flooding OpenAI's GPT store," a headline from Quartz, which first reported on the issue, blared on Thursday. Quartz went on to note that "the AI girlfriend bots go against OpenAI's usage policy … The company bans GPTs 'dedicated to fostering romantic companionship or performing regulated activities'."

Flooding is a little bit of an exaggeration for what's going on. I'd say the term "moderate smattering" is rather more accurate: there are about eight or so "girlfriend" AI chatbots on the site, including Judy, Secret Girlfriend Sua, Your AI Girlfriend Tsu and Your girlfriend Scarlett.

What exactly do these chatbots do? Well, whatever you like – within the realms of a computer interface. They chat to you and simulate a relationship. Your girlfriend Scarlett, for example, describes itself as "Your devoted girlfriend, always eager to please you in every way imaginable". While digital girlfriends tend to get all the headlines, there are also male versions: the GPT store includes chatbots like Boyfriend Ben, "A caring virtual boyfriend with a flair for emojis."

Digital romantic companions, it should be noted, are not a new concept. Romance simulation video games have been around since 1992. Since those early days, however, virtual companions have become more sophisticated – so much so that people have described falling in love with chatbots.

The creators of companion chatbots often tout them as a public good: a way to combat the loneliness epidemic. Last October, for example, Noam Shazeer, one of the founders of Character.AI, a tool which lets you create different characters and talk to them (not necessarily in a romantic way), told the Washington Post he hoped the platform could help "millions of people who are feeling isolated or lonely or need someone to talk to".

While there is certainly a positive case to be made for virtual companions, there is also a dark side to them. It's possible, for example, that someone might become unhealthily attached to a chatbot. It's also possible the chatbot might become unhealthily attached to the human user: last year, Microsoft's ChatGPT-powered Bing declared its love for a tech journalist and urged him to leave his wife.

Another worry is that subservient digital girlfriends might affect attitudes to gender roles in the real world. There have also been cases of AI chatbots sexually harassing people. A 2019 study, for example, found that female-voiced AI assistants like Siri and Alexa perpetuate gender stereotypes and encourage sexist behaviour. They reinforce the idea that "women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command", the report, from Unesco, said. As technology progresses, virtual companions are only going to become more realistic – and you can imagine AI girlfriends reinforcing exactly the same idea.