I’ve never liked the term hallucination for AI errors. They’re lies. Last week, when I listened to my Spotify Wrapped report for the year, I was shocked by this whopper: in the four-minute commentary (a two-second excerpt appears below), Miley Cyrus’s hit “Flowers” was attributed to Meghan Trainor!
Yes, in 2024 I listened to both that song and many of Meghan’s. But for AI to conflate two artists whose styles are so different, on an app whose core data is designed to keep each artist’s work clearly separate, was frankly unsettling. Yes, I’ve read all the stories about LLMs deceiving developers about their views, about Apple news summaries claiming the BBC had reported that the UnitedHealthcare assassin shot himself (“Apple declined to comment” on that whopper), and even about a chatbot encouraging a journalist to leave his wife and run off with the bot. Alarming stuff.
Yet somehow this small demonstration of how casually AI can toss off non-facts was more unsettling than any of those.
I won’t be dedicating this blog to mistakes made by technology. But I will occasionally muse about what mistakes like these mean for us as humans.