OmniRobotHome: A Multi-Camera Platform for Real-Time Multiadic Human-Robot Interaction
2026-04-30 • Robotics
Robotics · Computer Vision and Pattern Recognition
AI summary
The authors address the challenge of having multiple humans and robots work together at the same time in a shared home environment, which is hard because people and objects often block each other from view. They created OmniRobotHome, a system that uses 48 synchronized cameras to track humans and objects in 3D without markers, even when views are blocked, combined with robot arms that react in real time. This setup supports studying how robots can safely interact with people and anticipate their needs by remembering human behavior over time. Their work makes it easier to study complex multi-agent collaboration in realistic spaces.
Human-robot collaboration · Multiadic interaction · Real-time 3D tracking · Occlusion-robust perception · RGB cameras · Franka robot arms · Behavior modeling · Multi-robot actuation · Shared workspace · Human-anticipatory assistance
Authors
Junyoung Lee, Sookwan Han, Jeonghwan Kim, Inhee Lee, Mingi Choi, Jisoo Kim, Wonjung Woo, Hanbyul Joo
Abstract
Human-robot collaboration has been studied primarily in dyadic or sequential settings. However, real homes require multiadic collaboration, where multiple humans and robots share a workspace, acting concurrently on interleaved subtasks with tight spatial and temporal coupling. This regime remains underexplored because close-proximity interaction between humans, robots, and objects creates persistent occlusion and rapid state changes, making reliable real-time 3D tracking the central bottleneck. No existing platform provides the real-time, occlusion-robust, room-scale perception needed to make this regime experimentally tractable. We present OmniRobotHome, the first room-scale residential platform that unifies wide-area real-time 3D human and object perception with coordinated multi-robot actuation in a shared world frame. The system instruments a natural home environment with 48 hardware-synchronized RGB cameras for markerless, occlusion-robust tracking of multiple humans and objects, temporally aligned with two Franka arms that act on live scene state. Continuous capture within this consistent frame further supports long-horizon human behavior modeling from accumulated trajectories. The platform makes the multiadic collaboration regime experimentally tractable. We focus on two central problems, safety in shared human-robot environments and human-anticipatory robotic assistance, and show that real-time perception and accumulated behavior memory each yield measurable gains in both.
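The abstract's core perception idea is that many hardware-synchronized cameras observe the same instant, so a point occluded in some views can still be localized from the views that do see it, all expressed in one shared world frame. The paper does not give implementation details, but the standard building block for this is linear (DLT) triangulation from calibrated cameras; the sketch below is a minimal illustration of that technique, with toy projection matrices invented for the example (not the authors' calibration or pipeline):

```python
import numpy as np

def triangulate_point(proj_mats, pixels):
    """DLT triangulation of one 3D point in the shared world frame
    from its 2D observations in multiple synchronized cameras.

    proj_mats: list of 3x4 projection matrices (world -> image).
    pixels:    list of (u, v) observations, one per visible camera.
    Occluded cameras are simply left out of the lists.
    """
    rows = []
    for P, (u, v) in zip(proj_mats, pixels):
        # Each observation contributes two linear constraints on X.
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    # Homogeneous least squares: the right singular vector for the
    # smallest singular value minimizes ||A X|| with ||X|| = 1.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # dehomogenize

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two toy cameras: one at the world origin, one shifted 1 m along x.
K = np.eye(3)
P0 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P1 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.3, -0.2, 4.0])
X_hat = triangulate_point(
    [P0, P1], [project(P0, X_true), project(P1, X_true)]
)
print(np.round(X_hat, 6))  # recovers the ground-truth point
```

With 48 cameras the same least-squares system simply gains more rows, which is what makes the estimate robust when any subset of views is occluded; in practice one would also weight or reject outlier detections before solving.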