
I gave Reachy Mini a custom 3D-printed outfit, then built and deployed a live object-detection app on her camera.
https://www.youtube.com/watch?v=2D_EAcDgPEI
Reachy Mini is a collaboration between Pollen Robotics, Hugging Face, and Seeed Studio. All open source, including the body files. I got a beta developer unit through the Rerun office and have been playing with it for the past few weeks.
A few things I didn't expect going in:
- Multicolor 3D printing for something like text on a curved surface is genuinely tricky to get right
- The app ecosystem is more interesting than I thought. The constraint of no hands and no legs forces creative solutions
- Running a local model vs. connecting to a cloud LLM is a real tradeoff for a home robot, especially if kids are involved
The full code walkthrough (TensorFlow + PyCharm setup) is coming to the PyCharm channel as a companion video.
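Until that walkthrough is out, here is a rough sketch of what a per-frame detection loop for an app like this tends to look like: grab a frame, run the model, keep the confident hits. Everything here is a placeholder of my own (`Detection`, `run_detector`, `process_frame`), not the real Reachy Mini or TensorFlow API; the detector is stubbed with canned output so the shape of the loop is visible without a model.

```python
# Hypothetical sketch of a camera-frame detection loop.
# None of these names come from the Reachy Mini SDK or TensorFlow;
# the detector is a stand-in that returns canned results.
from dataclasses import dataclass


@dataclass
class Detection:
    label: str          # class name, e.g. "person"
    score: float        # model confidence in [0, 1]
    box: tuple          # (x, y, w, h) in pixels


def run_detector(frame) -> list[Detection]:
    # Stand-in for the real model call (e.g. a TensorFlow inference);
    # here it just returns a fixed detection for illustration.
    return [Detection("person", 0.91, (40, 30, 120, 200)),
            Detection("cup", 0.32, (200, 110, 40, 50))]


def process_frame(frame, threshold: float = 0.5) -> list[str]:
    # Keep only confident detections and report their labels.
    return [d.label for d in run_detector(frame) if d.score >= threshold]


labels = process_frame(frame=None)
print(labels)  # the low-confidence "cup" is filtered out
```

In a real app this loop would run continuously against camera frames, and the confidence threshold is the knob you'd tune to trade missed objects against false positives.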