Link: Live translations on Meta’s smart glasses work well — until they don’t
I recently tested the Ray-Ban Meta smart glasses, which can translate Spanish, French, or Italian directly into your ears in real time. The live translation feature arrived in a feature drop last month, alongside AI and Shazam integration, and also provides conversation transcripts on your phone.
The device successfully handled a simple conversation about K-pop, translating phrases shortly after they were spoken. However, its performance drops when faced with faster or more complex speech patterns.
During a test with mixed-language dialogue, the glasses handled short switches between languages well but struggled with longer interspersed English phrases. At times, the AI would repeat phrases, causing confusion and distraction.
It also had trouble with slang and subtle nuances, often choosing literal translations over contextual ones. For instance, "no manches" was rendered literally rather than as its colloquial meaning, "no way!"
Despite some shortcomings, the glasses proved useful for basic tourist interactions, such as ordering in restaurants or asking for directions. However, they are not yet suitable for understanding quick, whispered, or lyrical conversations in foreign languages.
While the glasses are a commendable start, truly instantaneous, flawless translation reminiscent of the fictional Babel fish remains in the realm of science fiction.
--
Yoooo, this is a quick note on a link that made me go, WTF? Find all past links here.