ABSTRACT
Real-time relative pose (RP) estimation is a cornerstone of effective multi-agent collaboration. When conventional global positioning infrastructure such as GPS is unavailable, equipping each agent with an Ultra-Wideband (UWB) device provides a practical means of measuring inter-agent range. Because UWB offers precise range measurements and robust communication capabilities, no external hardware installation is needed. However, when only a single UWB device per agent is used, the relative pose between the agents can be unobservable, resulting in a complex solution space with multiple possible RPs. This letter proposes a novel method based on an Unscented Particle Filter (UPF) that fuses single UWB ranges with visual-inertial odometry (VIO). The proposed decentralized method resolves the multi-modal solution space of the 3D (4-DoF) RP when it is unobservable. Moreover, a pseudo-state is introduced to correct the rotational drift of the agents. Through simulations and experiments involving two robots, the proposed solution was shown to be competitive with, and less computationally expensive than, state-of-the-art algorithms. Additionally, the proposed solution provides all possible relative poses from the first measurement.