Apple’s emotional lamp and the future of robots
Friday, February 14, 2025, 12:00, by ComputerWorld
Pixar Animation Studios has an unusual logo. The basic logo is the word “Pixar.” But sometimes, an animated lamp named Luxo Jr. hops into the frame and jumps on the letter “i.” The lamp exudes personality and represents Pixar’s ability to turn any object into a compelling character.
Inspired by Luxo Jr., Apple’s Machine Learning Research division decided to create a personality-expressive lamp of its own. Apple’s ELEGNT research project explores what’s possible with an expressive physical user interface for non-humanoid robots. Based on the user’s situation and context, as well as voice interaction, gestures, and touch, the lamp can appear to express itself through a variety of movements:

- nodding or shaking its “head”
- lowering its head to convey sadness
- “tail wagging” to signify excitement
- “sitting down” to imply relaxation
- tilting its head to show curiosity
- leaning forward to show interest
- gazing to direct attention
- adjusting speed and pausing to communicate attitudes and emotions
- moving toward or away from the user to show interest or disinterest

It can do some of the things smartphone apps can do, but with a greater sense of fun. A smartphone app can remind you to drink water; ELEGNT can do it by physically pushing a cup of water toward you. As you can see in this video, Apple’s project is fascinating.

But as with all robot makers in Silicon Valley, as far as I can tell, the company loses the plot when dealing with any robot designed to simulate human communication. In their paper, the researchers write: “The framework integrates function-driven and expression-driven utilities, where the former focuses on finding an optimal path to achieve a physical goal state, and the latter motivates the robot to take paths that convey its internal states — such as intention, attention, attitude, and emotion — during human-robot interactions.” (I’ll sketch the mechanics of that framework in code below.)

Did you catch the lie (or, worse, a possibly self-delusional claim)? They’re falsely saying that their expression-driven utilities “motivate” the lamp to convey its “internal states,” and that among those internal states is “emotion.” They toss out the falsehood with shocking casualness, considering how big the statement is and how formal the research paper is. If Apple had actually invented a lamp that can feel emotions, that would be the computer science event of the century, a singularity of world-historic import. It would challenge our laws and our definition of sentience, reopening religious and philosophical questions that have been settled for 10,000 years. (I’ve reached out to Apple for comment on this point, but haven’t heard back.)

It’s clear that Apple’s lamp is programmed to move in a way that deludes users into believing it has internal states it doesn’t actually have. (I admire Apple’s research; I don’t understand why companies lie about humanoid robotics and play make-believe in their research papers about what’s going on with their robots. In the future, it will be hard enough for people to understand the nature of AI and robotics without researchers lying in formal, technical research papers.)

But if you ignore the lie, Apple’s lamp research definitely sheds light on where our interaction with robots may be heading: a new category of appliance that might well be called the “emotional robot.”
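Before getting to what users made of it, it’s worth seeing how little machinery that framework actually requires. Strip away the talk of internal states and it reduces to something any robotics programmer would recognize: score each candidate movement path on two utilities, one for getting the job done and one for looking expressive, then pick the best weighted combination. Here is a minimal sketch of that idea in Python; the pose layout, the “nod” heuristic, and the weights are my own hypothetical illustrations, not anything taken from Apple’s paper.

```python
# Minimal sketch of the dual-utility idea described in Apple's ELEGNT paper.
# Everything here (the pose layout, the "nod" heuristic, the weights) is a
# hypothetical illustration, not Apple's implementation.

from dataclasses import dataclass
from typing import List, Sequence, Tuple

Pose = Tuple[float, float, float]   # e.g. (base angle, elbow angle, head tilt)
Trajectory = Sequence[Pose]         # a candidate path through pose space


@dataclass
class UtilityWeights:
    task: float = 1.0        # weight on reaching the physical goal efficiently
    expression: float = 0.5  # weight on movement that reads as expressive


def task_utility(traj: Trajectory, goal: Pose) -> float:
    """Function-driven term: prefer paths that end near the goal and stay short."""
    goal_error = sum((a - b) ** 2 for a, b in zip(traj[-1], goal))
    path_length = sum(
        sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
        for p, q in zip(traj, traj[1:])
    )
    return -(goal_error + 0.1 * path_length)


def expression_utility(traj: Trajectory) -> float:
    """Expression-driven term (hypothetical): reward a visible excursion in the
    head-tilt channel, the kind of dip-and-recover motion that reads as a nod."""
    tilts = [pose[2] for pose in traj]
    return max(tilts) - min(tilts)


def choose_trajectory(candidates: List[Trajectory], goal: Pose,
                      weights: UtilityWeights) -> Trajectory:
    """Pick the candidate that maximizes the weighted sum of both utilities."""
    return max(
        candidates,
        key=lambda t: weights.task * task_utility(t, goal)
                      + weights.expression * expression_utility(t),
    )
```

Note that setting the expression weight to zero recovers a purely functional planner, which is essentially the baseline condition in the user study described next; the study amounts to asking how much expression weight users will tolerate for a given task.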
A key component of the research was a user study comparing how people perceived a robot using both functional and expressive movements versus one using only functional movements. The study found that movements incorporating expressive qualities boosted user ratings, especially during socially oriented tasks. But when users wanted some specific useful action to take place, for example having the lamp shine light on an object so they could photograph it, participants found the lamp’s “personality” distracting.

The researchers drew upon the concept of Theory of Mind, the human ability to attribute mental states to others, to help design the lamp’s movements. Those movements were intended to simulate intention, attention, attitude, and emotion. The movements aren’t specifically human but rather the body language of a person, a monkey, or a dog: a sentient mammal generally.

The biggest takeaway from Apple’s ELEGNT research is likely that neither a human-like voice nor a human-like body, head, or face is required for a robot to successfully trick a human into relating to it as a sentient being with internal thoughts, feelings, and emotions. ELEGNT is not a prototype product; it is a lab and social experiment. But that doesn’t mean a product based on this research won’t soon be available on a desktop near you.

Apple’s emotional robot

Apple is developing a desktop robot, codenamed J595, and is targeting a launch within two years. According to reports based on leaks, the robot might look a little like Apple’s iMac G4, a lamp-like form factor featuring a screen at the end of a movable “arm.” The device would function like an Apple HomePod with a screen, but with additional intelligence courtesy of large language model-based generative AI. The estimated $1,000 robot would provide a user interface for smart home products and doorbell cams, answer questions, display photos and incoming messages, and function as a camera and screen for FaceTime calls.

But here’s the most interesting part. Although there’s no direct evidence for this claim, it makes sense for Apple to incorporate the ELEGNT research into the desktop robot project. The robot is expected to move, lean, and tilt as part of its interaction with users. Apple’s next appliance might be an emotional robot.

The consumer market for emotional robots

The idea of a consumer electronics product advertising “personality” through physical movements isn’t new. Among others, there’s:

- Jibo: A social robot with expressive movements and a rotating body.
- Anki’s Cozmo: A small robot toy with a movable arm and LED eyes for emotional expression.
- Sony Aibo: A robotic dog that uses its entire body to express emotions.
- Kuri: A home robot that uses head tilts, eye expressions, and sounds for communication.
- Lovot: A companion robot from Japan that expresses affection through body movements.
- Amazon Astro: A home robot with a periscope camera and digital eyes for engagement.

That last product is worth an update since I first mentioned it in 2021. Amazon discontinued its Astro for Business program on July 3, 2024, less than a year after launch. The business robots were remotely deactivated by Amazon last Sept. 25, and Amazon is now focusing exclusively on Astro for consumers. The $1,599 consumer version of Astro, introduced in 2021, is still available (by invitation only).

The business market for emotional robots

No major company other than Amazon has tried emotional robots for business, and Amazon killed that program. Meanwhile, the European Union’s AI Act prohibits the use of AI systems for emotion recognition in workplaces and educational settings, except in cases of medical or safety necessity. That ban took effect on Feb. 2.
So, from a business, legal, and cultural standpoint, it appears that appliances that can read your emotions and respond with gestures expressing fake emotions are not imminent. We’ll see whether users bring their emoting Apple desktop robots or other emotional robots to the office. We could be facing a bring-your-own-emotional-robot movement in the workplace. BYOER beware!
https://www.computerworld.com/article/3822098/apples-emotional-lamp-and-the-future-of-robots.html