Astrobotic and CMU will develop a real-time distributed localization method named DALEC that combines local visual-inertial odometry with ultra-wideband (UWB) range measurements between rovers to improve each vehicle's localization and inform each rover of the others' locations. Lacking global constraints, visual-inertial odometry (VIO) drifts over time in four degrees of freedom; range measurements between rovers provide additional constraints that limit this drift. To perform distributed localization, each rover will estimate its own trajectory in a factor graph of poses and receive additional condensed information from the other rovers. Because the range and VIO measurements are not tightly coupled, each rover can navigate on its own, making the method inherently robust to communication outages. A key advantage of this approach is that constraints from other sensors, such as sun angle, bearing to visually observed rovers, terrain matching, and point features, can easily be added to the factor graph formulation.
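The idea of combining odometry factors with inter-rover range factors in one least-squares problem can be illustrated with a toy example. The following is a minimal sketch, not the DALEC implementation: it assumes a 2D two-rover problem, Gaussian noise, anchoring priors on the first poses, and a numeric-Jacobian Gauss-Newton solver; all names and numbers are illustrative.

```python
# Toy illustration of the factor-graph idea: two rovers estimate 2D
# positions from noisy odometry, and inter-rover UWB ranges add
# constraints between the two trajectories. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
T = 5  # poses per rover

# Ground-truth trajectories: rover A drives along +x, rover B in parallel.
truth_a = np.stack([np.arange(T, dtype=float), np.zeros(T)], axis=1)
truth_b = np.stack([np.arange(T, dtype=float), np.full(T, 3.0)], axis=1)

# Noisy odometry (relative steps) and noisy UWB ranges at each epoch.
odo_a = np.diff(truth_a, axis=0) + rng.normal(0, 0.05, (T - 1, 2))
odo_b = np.diff(truth_b, axis=0) + rng.normal(0, 0.05, (T - 1, 2))
ranges = np.linalg.norm(truth_a - truth_b, axis=1) + rng.normal(0, 0.02, T)

def residuals(x):
    """Stack prior, odometry, and range factor residuals."""
    pa, pb = x[:2 * T].reshape(T, 2), x[2 * T:].reshape(T, 2)
    r = [pa[0] - truth_a[0], pb[0] - truth_b[0]]   # priors anchor the graph
    r += list((pa[1:] - pa[:-1]) - odo_a)          # rover A odometry factors
    r += list((pb[1:] - pb[:-1]) - odo_b)          # rover B odometry factors
    d = np.linalg.norm(pa - pb, axis=1) - ranges   # inter-rover range factors
    return np.concatenate([np.concatenate(r), d])

# Initialize by dead reckoning, then refine with Gauss-Newton.
x = np.concatenate([np.cumsum(np.vstack([truth_a[0], odo_a]), axis=0).ravel(),
                    np.cumsum(np.vstack([truth_b[0], odo_b]), axis=0).ravel()])
for _ in range(10):
    r0 = residuals(x)
    J = np.zeros((r0.size, x.size))
    eps = 1e-6
    for i in range(x.size):  # numeric Jacobian; fine for a toy problem
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (residuals(x + dx) - r0) / eps
    x = x + np.linalg.lstsq(J, -r0, rcond=None)[0]

print("final residual norm:", np.linalg.norm(residuals(x)))
```

Adding another factor type (e.g., sun angle or bearing) would simply append more rows to the residual vector, which is the extensibility property noted above.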
The proposed technical approach is most closely related to the Decentralized Data Fusion Smoothing and Mapping algorithm (specifically DDF-SAM2), though DDF-SAM2 assumes landmark (point) feature observations and, by default, shares the relative positions of those landmarks. It can also be extended to share the relative positions and orientations of objects rather than just the positions of landmarks. Our problem differs in that only range measurements are available from the UWB sensor, but these measurements have much simpler error and drift models and require no data association or loop-closure solution.
A simulation environment will be developed to support development and testing of the DALEC algorithm. The simulation will produce synthetic IMU, UWB, and imagery data from 4–15 rovers operating simultaneously and will then be used to test and tune DALEC.
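As a sketch of what the UWB portion of such a simulator might look like, the function below generates noisy inter-rover ranges from ground-truth positions. The Gaussian noise model, dropout probability, and parameter values are assumptions for illustration, not the proposed simulator's actual design.

```python
# Illustrative sketch of simulated inter-rover UWB range generation.
# Noise model and parameters are assumptions, not the real simulator.
import numpy as np

def simulate_uwb_ranges(positions, sigma=0.05, dropout=0.1, seed=0):
    """positions: (n_rovers, 3) true rover positions at one epoch.
    Returns {(i, j): noisy_range} for each rover pair whose ranging
    exchange succeeded; 'dropout' models missed measurements."""
    rng = np.random.default_rng(seed)
    n = positions.shape[0]
    ranges = {}
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < dropout:  # simulate a dropped exchange
                continue
            true_d = np.linalg.norm(positions[i] - positions[j])
            ranges[(i, j)] = true_d + rng.normal(0.0, sigma)
    return ranges

# Example: four rovers on a mostly flat patch of terrain.
pos = np.array([[0.0, 0.0, 0.0], [5.0, 0.0, 0.0],
                [0.0, 5.0, 0.0], [5.0, 5.0, 0.2]])
meas = simulate_uwb_ranges(pos)
```

Pairing this with a standard IMU error model and rendered imagery would cover the three sensor streams the simulation is intended to produce.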
The proposed system has utility for applications such as JPL's A-PUFFER robots and NeBula autonomy architecture. This technology could also support teams of heterogeneous robots, e.g., the Perseverance rover and Ingenuity helicopter. The localization system could further be extended to support non-planetary spacecraft such as orbiting satellites: in this formulation, relative-navigation cameras and RF ranging cross-links could be used to localize satellite constellations for applications such as distributed-aperture telescopes and long-baseline interferometry.
Higher-accuracy location estimates for Astrobotic's CubeRovers would enable a larger suite of commercial swarm-based missions. The technology is also applicable to markets such as search and rescue and underwater exploration, which often involve inhospitable environments, limited communications, and no GPS; offshore underwater inspection and repair robots face similar constraints.