Snapchat’s My AI chatbot is driving people crazy.
Some users accuse the bot of secretly tracking their location, while others are desperate to remove it from the app.
The backlash is a blow to Snap, which is betting on AI and augmented reality to boost engagement after its failed experiments in hardware, original shows and games.
“Snapchat AI is creepy,” said YouTuber FaZe Rug in a tweet that has been viewed more than 500,000 times. “It told me my location and when I went to enable ‘ghost mode’, I already had it turned on.”
Snap made the My AI chatbot available to everyone earlier this month after testing it on paying subscribers since late February. The bot is essentially a customized version of OpenAI’s ChatGPT that can answer questions on a wide range of topics in a conversational way.
Snap is positioning the bot as just another one of your friends you can message for movie recommendations, add to group chats, and have a laugh with. Users can access the bot from the top of their chat tab where they normally message their friends.
However, not everyone is thrilled with its lingering presence. Other Snapchat users have echoed Faze Rug’s paranoia. There are numerous tweets where people accuse the chatbot of lying about tracking their location.
In screenshots shared by Twitter user @ChrisMinecraft_, the bot initially claims it can recommend restaurants based on IP location, then backtracks to deny having access to that information.
“Snapchat’s new AI knows your current location at all times with location services turned off,” another user tweeted.
Some frustrated users are trying to disable the bot or delete Snapchat altogether. The criticism recalls the pushback Snapchat received from celebrities and regular users over its divisive 2018 redesign.
The first two posts this month on the Snapchat subreddit, home to more than 330,000 members, are from users asking how to remove the bot from the app.
“So Ai Chatbot just appeared in my friends list…and I don’t want him in my feed. What should I do?” reads one Reddit post by an irritated user.
“They should really get rid of it, or at least make you able to block/unblock it. I’m deleting the app because it’s so annoying,” reads another popular post on the Snapchat subreddit.
Does the Snapchat chatbot know your location?
Location tracking on Snapchat is turned off by default, which means Snap can’t pinpoint where you are without your permission.
You can grant Snap access to this information by disabling Ghost Mode via the in-app settings.
Snap uses this information to power features like Snap Map, a virtual map that lets you see where your contacts are and view public Snaps from people nearby, and vice versa.
As for the chatbot, the AI uses your Snap Map location to recommend nearby places. For example, if you ask My AI: “What are good Italian restaurants near me?”, it can return suggestions that are within walking distance.
Snapchat shares only city-level location and generalized distances between you and places with the large language model (LLM) behind My AI. Simply put, an LLM is a powerful algorithm trained on web data that can recognize, summarize, translate, predict and generate text and other content, in some cases even images.
“My AI is part of Snapchat, so if a user has granted location permissions in the app, My AI may be able to provide location-based responses,” a Snapchat rep told the Evening Standard.
“My AI is an experimental chatbot that learns over time and can occasionally produce incorrect responses. If Snapchatters experience inaccurate responses, we encourage them to report it using our in-app tool.”
Snap’s My AI chatbot controversy explained
This isn’t the first time the chatbot has come under fire. Just a few weeks after its launch, Snap was forced to implement new security measures after the AI was caught dispensing advice on weed and sex to the app’s youth user base.
The concerns are emblematic of broader fears about the unchecked growth of AI. Italy recently banned ChatGPT in part because there is no way to verify the age of minors, the country’s data regulator said.
Meanwhile, both Google and Microsoft have revealed that their chatbots are prone to spreading misinformation, a phenomenon known as “hallucination”.
Even Snap has admitted as much: “As with all AI-based chatbots, My AI is prone to hallucinations and can be tricked into saying pretty much anything,” the company said in late February. “Please be aware of its many shortcomings and we apologize in advance!”
Despite these shortcomings, Snap insists the bot is appropriate for younger users. The company says the vast majority of chatbot responses adhere to its community guidelines.
Snap has also touted the chatbot’s early success. During its trial phase, the bot received two million chat messages a day from Snapchat users on topics including movies, sports, pets and more, the company recently revealed.
But clearly not everyone likes Snapchat’s AI friend.