DIY Robots: Adorable $299 Reachy Mini is Leading an Open-Source Revolution
Saturday, July 12, 2025, 01:09, by eWeek
Something wild is happening in robotics right now. While big companies chase $100,000 humanoids that most people will never see in person, a grassroots movement is quietly building the “Android of robotics” — and you can join from your kitchen table.
Meet the tiny Reachy Mini robot

Enter Reachy Mini, Hugging Face's adorable $299 desktop robot that just became the latest face of a DIY robot revolution that's been brewing all year. This 11-inch companion isn't just another gadget: it's your ticket into a world where anyone can code, build, and train their own AI-powered robots at home.

Here's the real deal on Reachy

Reachy Mini is an 11-inch tall, open-source robot that you can program in Python right out of the box. Think of it as the friendly cousin of those intimidating industrial robots, but one that actually wants to hang out on your desk and maybe help with your coding projects. You can get two versions:

- The Lite version ($299), which needs to be plugged into your computer.
- The full version ($449), which has its own Raspberry Pi brain, WiFi, and battery so it can roam free (well, as free as an 11-inch desk-bot can be).

What makes this little guy special:

- Expressive personality: Motorized head that can look around, a rotating body, and animated antennas that react to conversations.
- AI-powered senses: Built-in camera, microphones, and speakers for visual and audio interactions.
- Plug-and-play behaviors: Ships with 15+ pre-built robot behaviors you can download from Hugging Face.
- Community-driven: Upload and share new robot behaviors with Hugging Face's 10M+ users.
- DIY assembly: Comes as a kit you build yourself (a great weekend project with kids).

The coolest part? It integrates directly with Hugging Face's ecosystem, meaning you can tap into state-of-the-art speech, vision, and personality models. Want your robot to have conversations? There's a model for that. Want it to recognize objects? Yep, model for that too.
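To make that concrete: the same off-the-shelf Hugging Face models you'd use in any Python app can drive a desk robot's perception. Here's a minimal sketch using the transformers pipeline API for object detection. The image file is a stand-in for a camera frame, and the default checkpoint the library picks may change over time; the robot wiring is left out entirely.

```python
# Minimal sketch: pulling an off-the-shelf vision model from Hugging Face.
# Assumes `pip install transformers torch timm pillow` and network access;
# the default object-detection checkpoint is chosen by the library.
from transformers import pipeline
from PIL import Image

detector = pipeline("object-detection")  # downloads a default DETR-style model
image = Image.open("desk_snapshot.jpg")  # stand-in for a robot camera frame

for hit in detector(image):
    # Each hit is a dict like {"label": "cup", "score": 0.98, "box": {...}}
    print(f'{hit["label"]}: {hit["score"]:.2f}')
```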
The bigger picture

Plus, it's open source everything: hardware designs, software, even simulation environments. That means the community can improve and expand on it faster than any single company could.

Here's what's actually happening: Over the past 18 months, Hugging Face has been building LeRobot, an open-source toolkit that's become the "Transformers library for robotics." Think of it as the foundation that lets everyday developers train robots the same way they'd train a language model, except now the robot can actually pick up your coffee cup.

The number of community-contributed robot datasets on Hugging Face is growing rapidly, creating what researchers call the "ImageNet moment for robotics." Just as massive image datasets revolutionized computer vision, these shared robot training videos are democratizing physical AI.

One example: SmolVLA, Hugging Face's breakthrough robotics AI model, is so efficient it runs on a MacBook, and it uses an asynchronous inference stack that decouples perception from action execution, making robots faster and more robust (paper).
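If you want to see what those shared datasets look like, LeRobot exposes them through a single dataset class. A minimal sketch, assuming the lerobot package and the public lerobot/pusht community dataset; the import path and attribute names match recent versions of the library but may drift.

```python
# Minimal sketch of browsing a community robot dataset with LeRobot.
# Assumes `pip install lerobot`; "lerobot/pusht" is one of the public
# community datasets on the Hugging Face Hub. APIs may drift across versions.
from lerobot.common.datasets.lerobot_dataset import LeRobotDataset

dataset = LeRobotDataset("lerobot/pusht")  # downloads from the Hub
print(dataset.num_episodes, "episodes")

# Each frame pairs camera images with robot state and the action that was
# taken, so a policy can be trained on it like any other PyTorch dataset.
frame = dataset[0]
print(list(frame.keys()))  # e.g. observation.image, observation.state, action
```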
The hardware revolution is equally impressive

For example, K-Scale Labs went from "we have an idea" to a full humanoid platform in just 12 months, with software, hardware, and training stack all open-sourced and priced affordably. Its K-Bot is 4'7″ tall and 77 lbs, with 26 high-torque motors and a 3D-printed carbon fiber frame, and it ships with a dev kit and Python SDK for $8,999. It runs on K-OS (an operating system written in Rust) and looks like a toy but works like lab-grade research hardware. The Z-Bot desktop humanoid for $999 sold out fast on Kickstarter; built for students, hackers, and hobbyists, it's a test run for what's coming next.

HopeJr, another full humanoid with 66 actuated degrees of freedom, costs under $3,000 and can walk and manipulate objects.

What makes this a revolution and not just expensive toys?

Three game-changing factors have aligned.

First, the software stack is finally unified. LeRobot offers comprehensive tools for data collection, model training, and simulation environments, while K-Scale ships the entire stack: K-OS for hardware abstraction and real-time control, PyKOS as a dev-friendly Python library, K-Sim as a 100k+ samples/sec RL training engine, and EdgeVLA for onboard vision-language-action processing. It's plug-and-play robot intelligence, with sim-to-real workflows from day one.

Second, AI models got small enough to run locally. SmolVLA can follow natural language instructions for real-world manipulation tasks and is pretrained on around 500 community datasets. Despite using a small vision-language model, it skips half the VLM layers without sacrificing performance. Its async inference stack decouples perception and action prediction from execution, making robots faster to adapt and more robust to environmental changes (the pattern is sketched in code just below).

Third, the community flywheel is spinning. In just 12 months, LeRobot's GitHub repository has grown from zero to over 15,000 stars, with a thriving Discord of DIY robot builders sharing training data, designs, and breakthrough moments.
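That second factor deserves a closer look, because the async trick is easy to picture in code. Below is a toy, self-contained sketch of the decoupling pattern, not SmolVLA's actual implementation: a worker thread keeps predicting the next chunk of actions from the freshest observation while the control loop executes the chunk it already has.

```python
import queue
import threading
import time

def predict_chunk(observation):
    """Stand-in for a vision-language-action model that emits a chunk of actions."""
    time.sleep(0.05)  # pretend inference latency
    return [f"action(from={observation}, step={i})" for i in range(4)]

obs_q = queue.Queue(maxsize=1)    # control loop -> inference worker
chunk_q = queue.Queue(maxsize=1)  # inference worker -> control loop

def inference_worker():
    # Perception and action prediction run here, off the control loop's clock.
    while (obs := obs_q.get()) is not None:
        chunk_q.put(predict_chunk(obs))

threading.Thread(target=inference_worker, daemon=True).start()

obs_q.put("obs_0")            # prime the pipeline with a first observation
for t in range(1, 4):
    chunk = chunk_q.get()     # chunk predicted from the *previous* observation
    obs_q.put(f"obs_{t}")     # immediately request the next chunk...
    for action in chunk:      # ...and execute this one while it computes
        print(action)
obs_q.put(None)               # tell the worker to shut down
```

The robot never idles waiting on the model: execution and inference overlap, which is exactly the speed-and-robustness win the paper describes.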
Enter Reachy Mini as the perfect gateway drug. At $299 for the basic version or $449 for the wireless model, it's cheaper than most smartphones but gives you a full robotics development platform. It's programmable in Python (JavaScript and Scratch coming soon), integrates directly with Hugging Face's AI models, and comes with 15+ pre-built behaviors you can download and modify. The robot itself is surprisingly expressive: motorized head movement, rotating body, animated antennas, plus camera, microphones, and speakers for multimodal AI interaction. You build it yourself as a kit, making it perfect for weekend projects with kids or late-night coding sessions.

Now here's where it gets really interesting: Benjamin Bolte, the K-Scale Labs founder who helped ship the first Tesla Optimus perception model, believes we're witnessing something unprecedented. His team, coming from Meta AI, Cambridge, MILA, and General Dynamics, built Figure-level hardware with just 5 engineers on less than a $500K R&D budget. As Ilir Aliu wrote, there was no VC war chest. No secrecy. Just raw execution at open-source speed. In an interview on Aliu's podcast, Bolte explained it all. Their philosophy is brutal and simple: "Open-source everything. Let devs build the apps. Push updates weekly. Ship real hardware. Get robots in the wild. Learn faster than the incumbents." They're not trying to be Tesla; they're trying to be Linux for embodied AI. Because, as Bolte puts it, "robotics is broken. Proprietary. Expensive. Slow. You either work at Tesla, Figure, or 1X. Or you don't work on humanoids at all." K-Scale flips that script.

Translation: While companies like Figure AI chase $100,000 proprietary robots behind closed doors, the open-source community is building the foundation for robots that everyone can understand, modify, and afford. We're seeing Figure-level hardware capabilities democratized at a fraction of the cost, with full transparency about how everything works.

Speaking of closed-source robotics, how is that going? Here's a brief landscape of the top players and their latest moves.

- Figure AI dropped its OpenAI partnership in February 2025 for in-house AI, announced its Helix VLA model that can control a full humanoid upper body, and is already shipping Figure 02 robots to paying customers. The company is targeting 12,000 humanoids per year from its new BotQ facility.
- 1X Technologies unveiled NEO Gamma in February 2025 and plans to test it in "a few hundred to a few thousand" homes this year. Its lightweight (55 lb) robot uses a soft knit suit and a tendon-driven system for safety, backed by OpenAI.
- Tesla Optimus hit major roadblocks: production halted in July 2025 due to overheating motors, limited battery life, and other hardware issues. The head of the program departed, and its current robots only move batteries in Tesla workshops with "less than half" human efficiency.
- Boston Dynamics Atlas went fully electric in 2024 and partnered with Toyota Research Institute for advanced AI, but remains focused on research and industrial demonstrations rather than consumer deployment.
- Agility Robotics' Digit leads in commercial deployment, already working at GXO Logistics warehouses and being tested by Amazon, with a mass production facility targeting global enterprise customers.

Besides Boston Dynamics, the longtime leader in humanoid robotics, and Tesla, which can throw ungodly amounts of money at the problem, we here at The Neuron see Figure as the de facto leader in this space. Put simply, they just seem to have the most momentum (and these days, it's looking increasingly likely that momentum is the moat...). Figure most recently "3x'd" the number of robots it will produce over the next three months. And if you want to get a sense of what CEO Brett Adcock and Figure AI are thinking about right now, check out Brett's recent tweet. Speech to speech, huh? Sounds like he's looking for a model (or a bucket of engineers) to let his robots talk to humans (or each other??).

The contrast here is stark: While these companies chase hundred-thousand-dollar price points and closed ecosystems, the open-source movement is democratizing robotics for everyone.

The timing couldn't be better

Goldman Sachs projects the robotics market will surpass $50 billion by the mid-2020s, and Hugging Face's co-founder predicts "at least 100k personal robots will be pre-ordered" this year. We're entering the phase where robots transition from research labs to kitchen counters.

Why this matters

We're entering the era where AI is becoming physically embodied. While everyone's been focused on chatbots and image generators, companies like Hugging Face are quietly building the infrastructure for AI that can actually interact with the real world.

This feels like the Raspberry Pi moment for robotics. Just as that $35 computer sparked a maker revolution and created a generation of builders who grew up thinking "of course I can build my own computer," this DIY robot wave is creating builders who'll think "of course my robot can understand what I'm saying and help around the house."

Reachy Mini could be the gateway that gets thousands of developers, students, and curious tinkerers into physical AI. These are the training wheels for the next wave of AI applications. The kids building with Reachy Mini today (or in a few days, when they get their pre-orders) will be the ones deploying warehouse robots, home assistants, and care robots tomorrow.

And better yet, the more the DIY ecosystem grows, the more people can build their own custom robots for their actual needs (building trust in bots, because you understand how the sausage is made).

Ready to join the revolution?

Reachy Mini orders start shipping late summer 2025, but you can start learning now by exploring the LeRobot community and downloading robot behaviors in simulation. Put simply, building workplace tools with AI on your computer can be fun, but building tangible, real-world AI you can interact with? That's another level of fascinating. AI is coming to your living room, and this time, you get to build it yourself.
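While you wait on a kit, here's the kind of script that "programmable in Python" points at. Everything below is invented for illustration; there is no public Reachy Mini SDK to quote here, so the robot is stubbed out in plain Python to show the shape of a desk-robot behavior, not any real API.

```python
# Purely illustrative: the class and method names below are invented
# stand-ins, not the real Reachy Mini SDK. The stub just shows what a
# "behavior" script for an expressive desk robot tends to look like.
class DeskBot:
    def wiggle_antennas(self):
        print("antennas: wiggle!")

    def look_at(self, x: float, y: float):
        print(f"head -> ({x:.1f}, {y:.1f})")

    def say(self, text: str):
        print(f"speaker: {text}")

def greet_behavior(robot: DeskBot, name: str):
    """A tiny shareable 'behavior': look up, wiggle, and say hello."""
    robot.look_at(0.2, 0.4)
    robot.wiggle_antennas()
    robot.say(f"Hi {name}, ready to hack?")

greet_behavior(DeskBot(), "builder")
```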
Editor's note: This content originally ran in our sister publication, The Neuron. To read more from The Neuron, sign up for its newsletter.

https://www.eweek.com/news/hugging-face-reachy-mini-diy-robots/