
In Douglas Adams’s 1979 novel, The Hitchhiker’s Guide to the Galaxy, the reluctant galactic traveler Arthur Dent discovers that, once inserted into the ear, a small device known as the Babel fish provides instantaneous translation of every imaginable language. Arthur likens the experience, after a momentary bit of discomfort that leaves him “gasping with horror,” to “the aural equivalent of looking at a picture of two black silhouetted faces and suddenly seeing it as a picture of a white candlestick. Or of looking at a lot of coloured dots on a piece of paper which suddenly resolve themselves into the figure six and mean that your optician is going to charge you a lot of money for a new pair of glasses.” What had previously sounded like garbled nonsense became clear and precise.
I’m sure I wasn’t the only one to think of the Babel fish when Apple announced that the AirPods Pro 3 would allow users to hear translations of foreign languages in something approaching real time. And almost as soon as I thought of the Babel fish, I found myself thinking of a throwaway line in So Long, and Thanks for All the Fish, the fourth book in what Adams would later refer to as “the increasingly inaccurately named Hitchhiker’s Guide to the Galaxy trilogy.” After Arthur returns to Earth, he removes the Babel fish and places it in a bowl. “He wouldn’t be needing it anymore,” Adams writes, “except for watching foreign movies.”
What an intriguing idea. Surely any piece of technology that could translate language in real time would be a huge breakthrough that could be wonderful in all sorts of ways. It could also make subtitles irrelevant.
Do the AirPods Pro 3 accomplish that? In a word, no, for many reasons. But attempting to use them to watch a variety of foreign-language films proved revelatory in ways I did not expect.
I started this experiment by watching a film with only a few lines of simple dialogue, Albert Lamorisse’s short 1956 feature, The Red Balloon (in French, Le Ballon Rouge). If you don’t know The Red Balloon, it’s a whimsical, ultimately bittersweet fantasy in which a boy (played by Lamorisse’s son, Pascal) traipses around Paris’s Ménilmontant neighborhood accompanied by a helium-filled red balloon that often seems to have a life of its own. No one says much in the movie, and characters rarely talk over one another when they do. So I opened the translation app on my iPhone, pressed the “live translation” option, chose to translate from French to English, and popped the AirPods in my ears. (Unlike the Babel fish, they did not burrow into my brain.)
It worked, mostly. When a teacher beckons the protagonist to leave his balloon behind and enter school, his words are subtitled as “Let’s go!” Siri’s voice in my AirPods supplied the translation “Come on” a few moments later, in a tone that didn’t quite capture the intended emotions but couldn’t be called affectless, either. This seemed promising.
Later, in a scene in which bullies pursued the protagonist and his balloon, the app even picked up some insignificant phrases left untranslated by the subtitles. Other times, the app’s translation seemed to provide more literal renderings than the subtitles, failing to understand an idiomatic phrase but more or less getting the words right. When the boy said a phrase the subtitles translated as “Obey me and be good,” the app offered, “And that you are wise.” Close enough? Not really, but the translation at least seemed to be in the same ballpark as the original words (picture an MLB-sized ballpark).
Unfortunately, another word consistently tripped up the app, and it was kind of an important one: ballon, which the app interpreted alternately as “ball” or “mom.” The first is understandable: In French, a ballon can also refer to certain types of balls. But “mom” didn’t work at all, unless we’ve been looking at The Red Balloon wrong all these years. Uh-oh.
A subsequent attempt to watch The 400 Blows didn’t fare much better. Could French be the issue? Wanting to save the more layered soundtracks of contemporary films until later, I decided to move on to Vittorio De Sica’s Italian neorealist classic Bicycle Thieves. Sadly, Bicycle Thieves only clarified the translation app’s limitations—I was not, unfortunately, able to lose myself in De Sica’s struggling world of post–World War II Rome.
Here, for instance, are some lines of dialogue as translated by the subtitles:
Look at this guy. I ought to run him over. Stupid jerk! They jump out in front of you and you end up in jail before you even know what’s happened! Look at this rain!
And here’s the same passage per the Apple translation:
Look at the water that.
It’s not like the app didn’t warn me. As Alessandro Cicognini’s heartrending score swelled, Siri’s voice in my ear warned: “Ambient sound levels are high. Try moving iPhone closer to the audio source to continue translation.” Since I was watching the film on my computer, I placed the phone directly next to my laptop’s speaker. This didn’t help. When the dialogue rolled out at a rapid pace, the app sometimes seemed to give up and translate a few words here and there. “Here, Bruno, an egg sandwich” became simply “Omelet.” When Antonio (Lamberto Maggiorani), the film’s protagonist, confronts the man who stole the bike, the heated exchange was rendered as “But I’ll kill her if she doesn’t want to, don’t. But look a little. I move up to.”
I’m not a tech expert. That a little device can hear words and render them in another language with even some accuracy feels borderline magical to me. But I suspect it’s the ambient noise Siri warned me about that made this still-new technology so buggy. The scene corresponding with the above transcription involves multiple characters talking passionately while off-screen music plays softly in the background. And if the relatively simple, monophonic soundtracks of classic films made the translator break down, surely a more recent film would destroy it.
Unexpectedly, the results were much the same, and occasionally better, when I watched scenes from Parasite. Sure, the famous “Jessica, only child, Illinois, Chicago” moment became “Jessica’s only daughter, Elimo Ishika, the senior is Kim-Ji-mo,” but some of the dialogue was pretty close. As a first-generation attempt at live electronic translation, it felt like the earbuds-and-app combo was on the right path, if not quite there.
But, ready or not, I felt like I had to put the AirPods Pro 3 through their paces by attempting to watch a movie in a theater. I decided to try it out on a film I’d probably find confusing under any circumstances (unlike The Red Balloon, Bicycle Thieves, and Parasite, each of which I’d seen several times before): the anime hit Demon Slayer: Kimetsu no Yaiba—The Movie: Infinity Castle, an entry in the long-running multimedia Demon Slayer franchise.
Going in, I wasn’t even sure I understood the film’s title, a situation the movie itself did little to change. And that’s on me. Infinity Castle continues a long-running story with which I didn’t even try to familiarize myself before buying a ticket. What, to me, looked like one animated battle after another with lots of shouted references to different forms of Thunder Breathing undoubtedly played differently to the many Demon Slayer fans who turned the film into a global hit. Still, while any hope that wearing the AirPods would have a Babel fish–like effect on the experience vanished pretty quickly, the app’s efforts again seemed to have at least approached the Japanese dialogue’s original meaning. “I’m going to kill everyone tonight, the annoying demon hunters” was how the app translated one character’s promise, words that certainly matched the actions that followed.
Each new experiment confirmed the hypothesis I’d formed even before hearing the app stumble over the word ballon. However impressive this gadgetry might be, it isn’t yet reliably accurate, much less able to understand and address linguistic nuance. Translating directions to Gare du Nord isn’t the same as expressing the despair of a man trying to feed his family in the wreckage of World War II. Translation isn’t the same as transcription. It’s an act of interpretation that requires understanding how both the original language and the target language work, as well as what needs to change and what needs to remain the same when turning one into the other.
It’s an imperfect process—as anyone who, say, watched John Woo movies on VHS tapes in the ’90s knows—just as the act of shifting one’s eyes from the primary image to the bottom of the screen can sometimes feel less than ideal. But it works. Carefully translated subtitles provide the best sense of the film’s meaning, even—and sometimes especially—when the subtitles favor interpretive choices over a word-for-word translation. Reading them becomes almost reflexive in ways that can minimize the distance between the viewer and the performances, even if it usually doesn’t take as long to read the subtitles as it does for an actor to deliver a line. For Adams’s fish or a device like Star Trek’s universal translator to function, they would have to eliminate any lag between the moment the words are said and the moment they’re rendered into speech—and they would also need to capture any linguistic quirks and emotional subtleties and instantaneously render them in an exact replication of the original speaker’s voice, rather than a computerized approximation with a limited dramatic range. (Sorry, Siri.)
That future, if it even exists, seems far removed from what technology currently allows. The hope is that everyone recognizes that: One of the most troubling elements of our current moment, as evidenced by the sudden inescapability of AI images, is that many seem eager to leap past the jagged road bumps of technological progress and settle for what’s being deemed good enough while casting aside what’s already, well, pretty good. Whatever subtitles might demand of viewers, they still seem far preferable to even a less clunky bit of technology. So take it from someone who tried: When it comes to movies, it’s OK to leave the Babel fish at home.