Why you’ll speak 20 languages by Christmas
Thursday, March 20, 2025, 11:00, by ComputerWorld
I live in the future, at least as far as language translation technology is concerned.
During the past couple of months, I’ve spent most of my time in Italy and Mexico. During all that time, I understood Italian and Spanish — thanks to the Live Translation feature of my Ray-Ban Meta glasses.

Announced in September, Live Translation is based on Meta’s Llama 3.2 AI model and is currently limited to US and Canada users enrolled in Meta’s Early Access Program. The feature translates audible French, Spanish, and Italian into audible English in the glasses and typed English in the app — and shows the wearer’s English translated into the selected language.

When I first arrived at the Catania airport in Sicily, I turned on Live Translation by saying, “Hey, Meta: Start Live Translation.” The first thing I heard using this feature was airport employees directing travelers. They spoke in Sicilian-accented Italian, but I heard: “European passport holders please enter this line; all others go here.”

From that point on, I turned on Live Translation from time to time and was able to understand simple things people might be telling me. In a few cases, I translated my own words into Italian (first speaking in English, then reading the Italian translation in the app).

It’s not perfect. It also translates English into English (and sometimes mistranslates English to English). It can fail to translate words spoken nearby. At other times, it will translate words spoken across the room when people are talking to each other, not to me.

Ray-Ban Meta glasses also do another neat translation trick. While using Live AI, another Early Access feature, you can look at a sign in a foreign language and ask what it means in English, and the glasses will speak the English translation.

Despite the language glitches, this is a clear glimpse of the future for all of us — the very near future.

Apple AirPods

Bloomberg reported on March 13 that Apple will add live language translation to iOS 19 for AirPods users.
According to the report, the user’s AirPods capture foreign-language speech and speak the English translation into the wearer’s ears. Then, when the user speaks English, the iPhone speaker plays the translation into the foreign language via Apple’s Translate app.

The feature is expected to be announced at Apple’s Worldwide Developers Conference (WWDC) in June and released in the fall. The languages to be supported have not been reported, but Apple’s Translate app supports 20. And Apple is by no means first to market with language translation earbuds.

Google Pixel Buds

Google has included live translation through its Pixel Buds and Pixel Buds Pro earbuds since October 2017. The feature does what I described for the Apple AirPods: It delivers translated foreign-language speech through the Pixel Buds while outputting translated English words through the phone speaker. That’s what happens in Conversation Mode. When users switch to Transcribe Mode, they get a live transcription of the translated foreign language, which is useful for listening to business presentations, attending speeches, or watching movies.

The Pixel Buds’ language translation feature works via the excellent Google Translate app. In Conversation Mode, it supports more than 100 languages; Transcribe Mode, however, supports only four: French, German, Italian, and Spanish.

Language translation requires a Google Assistant-enabled device running Android 6.0 or later, including non-Pixel phones. If you have an advanced Pixel phone, however, the translation gets much better. Compatible Pixel phones (especially models with a Tensor processor) offer Live Translate with text messages, through the camera, in videos, and even during phone calls.

A world of translation products

Language translation features that go in the ears come in many varieties. The TimeKettle WT2 Edge/W3 is highly rated.
It supports 40 online languages and 13 offline language pairs, enabling two-way simultaneous translation that eliminates the need for alternating speech patterns. The system achieves up to 95% translation accuracy through its AI platform, according to the company.

The Vasco Translator E1 supports an impressive 51 languages and uses 10 different AI-powered translation engines. The system allows up to 10 people to join conversations using the mobile app. The Pilot by Waverly Labs translates the wearer’s words to others and also translates replies back to the wearer’s language.

Smart glasses that translate are also available. The Solos AirGo 3 Smart Glasses perform real-time language translation via the SolosTranslate platform and OpenAI’s ChatGPT. Brilliant Labs’ Frame AI Glasses are open-source AR glasses that can translate languages seen in the environment, recognize images and provide information about them, and search the internet for results. The glasses use augmented reality to display translations directly in the user’s field of vision. They integrate with OpenAI, Whisper, and Perplexity technologies. TCL AR Glasses can live-translate conversations, offering an integrated heads-up display for showing the translation.

Other form factors exist, too, including the TimeKettle X1, K&F Concept Language Translator Device, ili Wearable Translator, and Timekettle ZERO Language Translator.

All these products demonstrate that the technology for traveling the world and being able to hold conversations, read signs, and understand people in foreign languages is already here, and has been for a while.

Going mainstream

What’s about to change is the arrival of this feature in totally mainstream products. Something like 100 million people use their Apple AirPods almost every day.
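Under the hood, all of these conversation-mode products share the same basic flow: capture speech, detect which language it is, translate it, and route the result to the right speaker (the wearer’s earbuds for inbound speech, the phone speaker for outbound replies). The sketch below illustrates that routing logic only; it is hypothetical glue code, not any vendor’s actual API, and the `translate` function is a stand-in for whatever speech-recognition and machine-translation services a real product would call.

```python
# Rough sketch of a two-way "conversation mode" translation loop.
# All names here are hypothetical stand-ins, not a real vendor API.

from dataclasses import dataclass


@dataclass
class Utterance:
    text: str
    lang: str  # ISO-style code, e.g. "en", "it"


def translate(utt: Utterance, target_lang: str) -> Utterance:
    """Stand-in for a machine-translation call.

    A real system would invoke an MT model here; this stub just tags
    the text so the data flow is visible.
    """
    return Utterance(f"[{utt.lang}->{target_lang}] {utt.text}", target_lang)


def route(utt: Utterance, home_lang: str = "en", other_lang: str = "it"):
    """Decide which speaker plays the translated audio.

    Foreign speech is translated into the wearer's language and sent to
    the earbuds; the wearer's own speech is translated outward and
    played on the phone speaker for the other party.
    """
    if utt.lang == home_lang:
        return "phone_speaker", translate(utt, other_lang)
    return "earbuds", translate(utt, home_lang)


# An Italian speaker addresses the wearer, then the wearer replies.
device, heard = route(Utterance("Dove si trova il binario 3?", "it"))
print(device, "->", heard.text)

device, spoken = route(Utterance("Where is platform 3?", "en"))
print(device, "->", spoken.text)
```

The point of the split routing is the one the article describes for AirPods and Pixel Buds: each participant hears only their own language, so neither has to handle the other person’s audio.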
Meta expects to sell more than 10 million Ray-Ban Meta glasses by the end of 2026, by which time Live Translation and Live AI will be offered to all users globally.

What’s really happening is that we’re heading for a world in which every wearable speaker — earbuds, headphones, smart glasses, and more — will give us live language translation on command or even automatically.

The worst thing about this emerging trend is that, in the future, far fewer people will bother to learn foreign languages, relying instead on AI. But the upside is that language barriers between people on our planet will be essentially erased, and people will more easily understand one another. That’s got to be a good thing.

In the meantime, live translation tech has been a radical and welcome game-changer for me as I travel the world as a digital nomad. Partnering with AI, I can speak foreign languages I never learned.
https://www.computerworld.com/article/3849236/why-youll-speak-20-languages-by-christmas.html