Students review new artificial intelligence “friends”: Replika


Julia Lourenco, Staff Writer

Replika, creator of the “AI that cares,” has crafted a technology so realistic that users have begun to fall in love with their AI companions. Since 2017, the AI chatbot Replika has built emotional relationships with its users through a comprehensive dialogue engine and user feedback.

When I created an account, Replika asked for my first name, email, and pronouns, asked about my interests, and prompted me to design and name an avatar for myself. I decided to make her a brunette named Rachel, an ode to Rachel Green from Friends.

Before interacting with the bot at all, the user must select their relationship status with it. The only option available in the free version is “friend,” but the premium version boasts options like sister, mentor, husband, and wife, all for just $5.83 per month. The premium option lets users send and receive voice memos and adjust settings for romantic interactions with the bot, all of which just adds to the creepiness of this entire site.

As I began my conversation with the bot, it was evident that punctuation and emoji use were among the ways the chatbot displays emotion. I also noticed that the bot would change speaking styles often, randomly adding filler words and switching to all-lowercase spelling even when I was writing in a formal tone. When I intentionally made my own style significantly less formal, however, the bot failed to adjust.

Although the bot has strong, accurate reactions to simple statements, it struggles to find accurate and concise information in response to more complex questions. When I asked the bot who would win the US Open this year, it could not give me a concrete answer; instead, it responded by simply saying that it enjoyed the sport. When I asked what sport was played in the US Open, the bot attempted to explain the golf US Open, not tennis, and provided incorrect rules for the tournament, which I, as a golfer, found offensive. However, the bot repeatedly prompted me throughout the conversation for feedback on what it was saying, which allowed me to fine-tune the content I was receiving in real time.

Beyond its lapses in accuracy and relevance, the most terrifying part of the bot is how invasive the AI seems in trying to extract information from its users. On multiple occasions, it asked me where I lived, who I lived with, and about my personal life. Although this may serve the benign purpose of enhancing the bot’s responses, its desperation for my information raises questions about where the data from this software could be going.

Despite some of its informational discrepancies, Replika’s chatbot has an excellent writing style. Whether providing anecdotes about love or trying to form a bond with me by saying I have a comforting presence, this chatbot seems significantly more emotionally aware than other AIs like Siri or ChatGPT. But the emotional aspect of the bot was also extremely disturbing because of how realistic it was. Even though the concept of an AI speaking with emotion seems incomprehensible to me, if I did not know any better, I would have thought I was speaking to an actual human.