
Why ambient robots beat humanoid robots

Monday, January 5, 2026, 08:00, by ComputerWorld
Two competing visions for the future of robots are taking shape at Silicon Valley hardware giants, in university research labs, and, of course, at Nvidia.

One is the stuff of science fiction: humanoid robots (also known in science fiction as Gynoids, fembots, Stepford wives, droids, replicants, synthetics, Cylons, Autons, hosts, hubots, fabricants, bioroids, boomers, persocons, reploids, artificial persons, skinjobs, clankers, and toasters). 

The idea is to use advanced technology and AI to simulate a person, in all its primate physiology and awkwardness of gait. Once we have these fake persons, their creators can feel like gods and their owners can feel like, what, slave owners?

I don’t get it. And if you’re a frequent reader of my column, you know I find the humanoid robot idea creepy, suspicious and problematic.

But if you think I’m a Neo-Luddite, I can assure you I’m super excited about the possibilities for our robotic future. I think everything that can be robotic should be robotic. 

That’s why I was so excited about what Carnegie Mellon University researchers call “unobtrusive physical AI.” (I prefer to describe it as “ambient robotics.”)

The rise of environmental computer integration

The idea is that sensors monitor human activity, AI divines intention, and robotics helps people do what they’re trying to do seamlessly, invisibly, and intuitively. The concept is the embodied intelligence version of “ambient computing” — a computer so integrated into your environment that it acts without your direct command. Instead of typing or tapping screens, devices sense what you need and do it automatically. 

Mark Weiser of Xerox PARC first described this idea in 1996. He predicted a time when technology would fade into the background rather than demanding constant attention. In the late 1990s, Eli Zelkha coined the related term “ambient intelligence.” 

Grant Aldrich, founder of Preppy, explains that this system mixes artificial intelligence, the Internet of Things, and real-time data. Sensors track motion and voice to understand human intent. When you walk into a room, the system might adjust the lights and temperature instantly based on your habits. 
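
To make that concrete, here is a minimal sketch in Python of what a habit-based rule like that might look like. The room name, hour threshold, and temperature preference are hypothetical illustrations, not details from Aldrich or any particular product.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class HabitProfile:
    # Learned preferences for one occupant (illustrative values).
    preferred_temp_c: float = 21.5
    lights_on_after_hour: int = 17  # lights on for arrivals after 5 p.m.

def on_presence_detected(room: str, profile: HabitProfile, now: datetime) -> list[str]:
    # Turn a motion-sensor event into device commands based on stored habits.
    commands = []
    if now.hour >= profile.lights_on_after_hour:
        commands.append(f"lights/{room}/on")
    commands.append(f"thermostat/{room}/set/{profile.preferred_temp_c}")
    return commands

# Example: someone walks into the living room at 6:30 p.m.
print(on_presence_detected("living_room", HabitProfile(), datetime(2026, 1, 5, 18, 30)))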

Another vision for “ambient computing” is wearable AI. AI glasses, most notably Google’s Project Astra, are the leading contenders for the ultimate “ambient computing” platform.

Carnegie Mellon’s “ambient robotics” system, developed by Violet Yinuo Han and colleagues, involves devices called “Object Agents” that aim to make everyday objects intelligent and capable of proactive assistance through robotic movement. In other words, rather than purpose-built robots, the “Object Agents” are ordinary objects made capable of movement as part of an overall system.

Multimodal AI uses cameras to watch people in their homes and what they’re doing, then comes up with ways to help, using what are essentially wheeled platforms of various sizes. Items to be moved are placed on these platforms, and the system moves them around to where (and when) they’re needed.

The researchers imagine that with such a system, when a person arrives home with a bag of groceries, a shelf automatically folds out from the wall, offering a place to set the bag down while the person takes off their coat. If someone is about to lean on a kitchen knife, the knife moves itself out of the way. Or when the system sees a person about to staple something, the stapler comes rolling across the desk. 
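
The researchers’ actual pipeline isn’t spelled out here, but the shape of the loop is easy to imagine: a multimodal model labels what the person is doing, and a small policy maps that label to which platform should move and how. A rough Python sketch, with entirely hypothetical activity labels and platform names:

# Hypothetical intent-to-action loop for "Object Agent"-style ambient robotics.
ACTIVITY_TO_AGENT = {
    "carrying_groceries": ("wall_shelf", "fold_out"),
    "about_to_staple": ("stapler_platform", "drive_to_user"),
    "leaning_toward_knife": ("knife_tray", "retract"),
}

def infer_activity(camera_frame: bytes) -> str:
    # Placeholder for the multimodal model that labels the person's current activity.
    raise NotImplementedError("swap in a real vision/multimodal model here")

def dispatch(activity: str):
    # Map an inferred activity to (platform, motion); None means do nothing.
    return ACTIVITY_TO_AGENT.get(activity)

# Example: the model reports someone walking in with a bag of groceries.
print(dispatch("carrying_groceries"))  # ('wall_shelf', 'fold_out')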

Carnegie Mellon isn’t the only place thinking about ambient robotics.

Self-driving houseplants?

Another brilliant design concept has emerged over the last few years: potted houseplants empowered to find their own sunlight. With AI, motorized wheels, and sensors, they autonomously navigate the interior of a home hunting for light. One could also imagine programming a plant to find its own water, eliminating nearly all the work of owning a plant.
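
A light-seeking loop for such a plant could be very simple. Here’s a hypothetical Python sketch, assuming four light sensors around the pot and a wheeled base that can roll in four directions (the sensor readings and lux threshold are made up):

import random

def read_light_levels(sensors: int = 4) -> list[float]:
    # Placeholder: read ambient light (lux) from sensors ringing the pot.
    return [random.uniform(0.0, 1000.0) for _ in range(sensors)]

def step_toward_light(readings: list[float], enough_lux: float = 600.0) -> str:
    # Stay put if the pot is already bright enough; otherwise roll toward the brightest side.
    directions = ["north", "east", "south", "west"]
    brightest = max(range(len(readings)), key=lambda i: readings[i])
    if readings[brightest] >= enough_lux:
        return "stay"
    return "roll_" + directions[brightest]

print(step_toward_light(read_light_levels()))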

Robots everywhere…and nowhere

The idea of ambient robotics is to use the power of artificial intelligence, cameras, sensors, and movable parts to chip away at annoying drudge work, potentially unsafe scenarios, and inconvenient everyday actions, until we live in homes where we’re helped and looked after by robotic devices we barely notice.

Imagine a hamper that rolls into the bedroom when you’re changing your clothes, then rolls into the laundry room when it sees you firing up the washing machine. Cat owners might like a litter box that appears when it notices the cat needs to do its business, then sifts out and even throws away the waste on its own. And who wouldn’t want a garage that functions like an Amazon warehouse, where all your stuff is neatly filed away to the rafters and retrieved when the sensors in the garage notice that you need something?

This vision of the ambient robotics household seems to me far more compelling than the more prevalent vision of the humanoid robot who does all the chores. One vision is invisible, effortless robotics, where chores just get done and things simply happen, everywhere, all the time, automatically. The other involves a fake human servant walking around the house, obtrusively and conspicuously.

Personally, despite the initial novelty of a sci-fi-like humanoid robot (a novelty that might last 10 minutes), I think the ambient robotics vision will prove, for most people and most households, to be the more compelling one.
https://www.computerworld.com/article/4112412/why-ambient-robots-beat-humanoid-robots.html
