Earlier this week, Forever Voices AI announced the addition of Kaitlin “Amouranth” Siragusa to its Forever Companion platform as an artificial intelligence “companion” available to fans 24/7. The self-proclaimed “World Leader in Turning Influencers into AI” previously launched the first “Influencer into AI Girlfriend,” and now the popular streamer will be “inspiring” fans through an interactive voice app, available for “virtual dating” and “deep discussion.”
In an interview with Bloomberg, founder and CEO John Meyer said the company could “democratize access” to influencers, giving more people the chance to talk to Amouranth. I don’t really begrudge Amouranth for securing the bag. And I think a lot of her appeal to people buying her OnlyFans is the fantasy that she’s paying attention to them anyway. Her AI app just formalizes that pretense: she isn’t actually having the conversation, the bot is just giving the impression that she is.
What bothers me is that the app happily normalizes parasocial relationships. I think the culture surrounding streamers and, more broadly, celebrities is already toxic, and the idea of an AI app profiting from that unhealthy dynamic rubs me the wrong way. Fans already act as though their favorite Twitch streamer owes them personal details about her life because they pay for her attention. Many streamers, including Amouranth herself, hide details about their personal lives, such as whether they have a partner, because when fans discover that the object of their one-sided affection is already in a relationship and that they never had a chance (as happened with Amouranth), hatred erupts and people feel “cheated.” Building a business around fans who want constant access to their favorite influencer only enables this dynamic, fostering a deeply unhealthy parasocial relationship that may make people think they “know” her. And since it’s an app, it could disappear at any moment.
Forever Voices AI counters this with safeguards that it says automatically detect a range of situations, from mental health crises to overuse. “If a conversation goes on too long, the AI can actually slow the conversation down,” Meyer said, apparently because someone could become addicted. The app also features a mental health engine that detects mental health issues and, if necessary, connects the user to a human therapist or a human-operated emergency hotline in real time. It’s unclear exactly how this will be implemented, but the company seems well aware of the implications of this technology and the real-world impact it can have on users, and is launching it anyway.
People always call me a Luddite for criticizing artificial intelligence, but it’s reductive to dismiss criticism of new technology as mere fear. My constant concern is that technology is introduced without adequate research into how it impacts users, and that companies care more about profit than about the ethical impact of their products. Of course, creating an AI app that lets you talk to an influencer is a good idea for the company, since it makes money by exploiting society’s imperfections. But it’s not a good idea for us. The whole thing is despicable, and I worry about the impact this technology will have on an already misogynistic streaming industry. But don’t worry about that. You can have a “hot gaming session” with the Amouranth bot. Worth it, right?
Next: I’m ready to admit it – I hated playing as Cere Junda