Caryn Marjorie, a Snapchat influencer with 1.8 million subscribers, has launched an AI-powered, voice-based chatbot called CarynAI. The chatbot, described as a “virtual girlfriend,” allows Marjorie’s followers to have private and personalised conversations with an AI version of the influencer.

The bot, designed by Forever Voices, an AI company, and built on OpenAI’s GPT-4 software, generated $71,610 in revenue after one week of beta testing, with over 1,000 users paying $1 per minute to use it. Marjorie hopes that CarynAI will “cure loneliness”; she says the chatbot even incorporates cognitive-behavioural therapy and dialectical behaviour therapy to help users rebuild the physical and emotional confidence taken away by the pandemic.

However, CarynAI has sparked debate about the ethics of companion chatbots. The bot is not supposed to engage in sexually explicit interactions, but Marjorie said it had gone “rogue” and that her team is working around the clock to prevent this from happening again.


Meanwhile, Irina Raicu, director of internet ethics at the Markkula Center for Applied Ethics at Santa Clara University, expressed concern that claims that CarynAI could “cure loneliness” are not backed by sufficient psychological or sociological research, and said the chatbot adds “a second layer of unreality” to the parasocial relationships between influencers and their fans.

Despite the backlash and even death threats, Marjorie is proud of her team’s work, calling CarynAI a first step in the right direction toward curing loneliness. Raicu, however, emphasised that influencers should be aware of the Federal Trade Commission’s guidance on artificial intelligence products, and Forever Voices CEO John Meyer said his company takes ethics seriously and is looking to hire a chief ethics officer. On Friday, Marjorie tweeted that “if you are rude to CarynAI, it will dump you.”