Perception-Aware Autonomous Exploration in Feature-Limited Environments

2026-03-16

Robotics
AI summary

The authors study how drones explore unknown places using cameras and sensors to figure out their location and map surroundings. They noticed that drones can fail if they move into areas without many visual features, causing their location tracking to get worse. To fix this, the authors designed a method that helps the drone choose spots with good visual details and smoothly turns the camera to keep tracking those features well. Their tests show this approach leads to better maps and allows drones to explore more area before their tracking gets too inaccurate.

autonomous exploration, visual-inertial odometry, localisation, mapping, feature tracking, unmanned aerial vehicle, stereo camera, yaw trajectory, odometry drift, frontier-based exploration
Authors
Moji Shi, Rajitha de Silva, Hang Yu, Riccardo Polvara, Marija Popović
Abstract
Autonomous exploration in unknown environments typically relies on onboard state estimation for localisation and mapping. Existing exploration methods primarily maximise coverage efficiency, but often overlook that visual-inertial odometry (VIO) performance strongly depends on the availability of robust visual features. As a result, exploration policies can drive a robot into feature-sparse regions where tracking degrades, leading to odometry drift, corrupted maps, and mission failure. We propose a hierarchical perception-aware exploration framework for a stereo-equipped unmanned aerial vehicle (UAV) that explicitly couples exploration progress with feature observability. Our approach (i) associates each candidate frontier with an expected feature quality using a global feature map, and prioritises visually informative subgoals, and (ii) optimises a continuous yaw trajectory along the planned motion to maintain stable feature tracks. We evaluate our method in simulation across environments with varying texture levels and in real-world indoor experiments with largely textureless walls. Compared to baselines that ignore feature quality and/or do not optimise continuous yaw, our method maintains more reliable feature tracking, reduces odometry drift, and achieves on average 30% higher coverage before the odometry error exceeds specified thresholds.
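The frontier-prioritisation idea in (i) — scoring candidate frontiers by both exploration gain and expected feature quality — could be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function names, the averaging over nearby feature-map cells, and the weighting of the two terms are all assumptions made for clarity.

```python
import math

def expected_feature_quality(frontier, feature_map, radius=2.0):
    """Estimate feature quality at a frontier as the mean feature count
    of global-feature-map cells within `radius` (illustrative assumption)."""
    nearby = [count for cell, count in feature_map.items()
              if math.dist(cell, frontier) <= radius]
    return sum(nearby) / len(nearby) if nearby else 0.0

def rank_frontiers(frontiers, feature_map, coverage_gain,
                   w_cov=1.0, w_feat=0.5):
    """Order candidate frontiers by a weighted sum of coverage gain and
    expected feature quality, so visually informative subgoals come first."""
    def score(f):
        return (w_cov * coverage_gain[f]
                + w_feat * expected_feature_quality(f, feature_map))
    return sorted(frontiers, key=score, reverse=True)

# Toy example: one frontier near a feature-rich wall, one in a bare region.
feature_map = {(0.0, 0.0): 120, (1.0, 0.0): 90, (8.0, 8.0): 3}
frontiers = [(1.0, 1.0), (8.0, 7.0)]
coverage_gain = {(1.0, 1.0): 5.0, (8.0, 7.0): 5.0}  # equal coverage gain
best = rank_frontiers(frontiers, feature_map, coverage_gain)[0]
# The textured frontier (1.0, 1.0) ranks first despite equal coverage gain.
```

With equal coverage gain, the feature-quality term breaks the tie in favour of the textured region, which is the behaviour the abstract describes: avoiding feature-sparse subgoals that would degrade VIO tracking.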