Which sensor fusion methods improve autonomous rendezvous in GNSS-denied space?

Autonomous rendezvous in GNSS-denied space depends on robust multi-sensor processing to estimate relative pose and velocity when satellite navigation is unavailable. Causes of GNSS denial include intentional jamming, signal obstruction, and operations in cislunar space or near other bodies. The consequences affect mission safety, collision risk, and the feasibility of on-orbit servicing and debris removal. Published research and operational practice emphasize combining complementary sensors with probabilistic estimators to reduce drift and increase fault tolerance.

Sensor fusion architectures that work

Two architectural approaches dominate: loosely coupled fusion, where each sensor pipeline produces its own pose estimate that is then fused, and tightly coupled fusion, where raw measurements are integrated in a single estimator. Davide Scaramuzza's group at the University of Zurich has shown in terrestrial and aerial systems that visual-inertial odometry benefits markedly from tight coupling between camera imagery and high-rate inertial measurement units, because the IMU constrains short-term motion while the camera corrects long-term drift. In space rendezvous, cameras detect relative features and IMUs propagate the state through brief occlusions.
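To make the distinction concrete, here is a minimal sketch of the loosely coupled case in plain NumPy (all sensor values, covariances, and function names are illustrative): each pipeline emits its own position estimate with a covariance, and the fusion step combines them by inverse-covariance weighting. A tightly coupled system would instead feed raw pixel and inertial measurements into a single estimator.

```python
import numpy as np

def fuse_estimates(x_cam, P_cam, x_imu, P_imu):
    """Loosely coupled fusion: combine two independent estimates of the
    same state by inverse-covariance (information) weighting."""
    W_cam = np.linalg.inv(P_cam)  # information matrix of camera estimate
    W_imu = np.linalg.inv(P_imu)  # information matrix of IMU-propagated estimate
    P_fused = np.linalg.inv(W_cam + W_imu)
    x_fused = P_fused @ (W_cam @ x_cam + W_imu @ x_imu)
    return x_fused, P_fused

# Hypothetical relative positions in meters: the camera-derived estimate is
# accurate but low-rate; the IMU-propagated estimate is high-rate but drifting.
x_cam = np.array([10.2, -0.4, 1.1]); P_cam = np.diag([0.05, 0.05, 0.20])
x_imu = np.array([10.5, -0.3, 0.9]); P_imu = np.diag([0.50, 0.50, 0.50])
x_fused, P_fused = fuse_estimates(x_cam, P_cam, x_imu, P_imu)
```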

Estimators and optimization methods

Probabilistic estimators adapted from terrestrial SLAM are central. Hugh Durrant-Whyte and Tim Bailey (University of Sydney) laid foundational work on simultaneous localization and mapping that informs filter design. The Extended Kalman Filter (EKF) and Unscented Kalman Filter (UKF) remain common for real-time filtering, while particle filters address highly non-Gaussian uncertainties during fast maneuvers. For batch or smoothing-based solutions, factor graph optimization, implemented for example in GTSAM by Frank Dellaert's group at the Georgia Institute of Technology, produces globally consistent trajectories by relinearizing over many observations and is attractive for post-contact or cooperative maneuvers where computational latency is acceptable.
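As a sketch of the filtering side, the EKF below (Python/NumPy; all class names and noise values are hypothetical) uses a deliberately simplified constant-velocity motion model in place of the linearized Clohessy-Wiltshire relative-orbit dynamics a flight filter would use, and a nonlinear range measurement to show where the measurement Jacobian enters.

```python
import numpy as np

class RelativeNavEKF:
    """Minimal EKF sketch for relative position/velocity during proximity
    operations. State is [px, py, pz, vx, vy, vz] in a target-fixed frame."""

    def __init__(self, x0, P0, q, r):
        self.x, self.P = x0, P0   # state estimate and covariance
        self.q, self.r = q, r     # process / measurement noise levels

    def predict(self, dt):
        F = np.eye(6)
        F[:3, 3:] = dt * np.eye(3)        # position integrates velocity
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.q * np.eye(6)

    def update_range(self, z):
        p = self.x[:3]
        rng = np.linalg.norm(p)           # predicted range h(x)
        H = np.zeros((1, 6))
        H[0, :3] = p / rng                # Jacobian of range w.r.t. position
        S = H @ self.P @ H.T + self.r     # innovation covariance (1x1)
        K = self.P @ H.T / S              # Kalman gain
        self.x = self.x + (K * (z - rng)).ravel()
        self.P = (np.eye(6) - K @ H) @ self.P

# Illustrative usage: slow approach from 25 m, one LiDAR range update.
ekf = RelativeNavEKF(x0=np.array([25.0, 0, 0, -0.01, 0, 0]),
                     P0=np.eye(6), q=1e-6, r=0.01)
ekf.predict(dt=0.1)
ekf.update_range(z=24.9)
```

A factor graph solution would instead retain many such range and bearing factors and relinearize over the whole window, trading latency for global consistency, as the section above notes.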

LiDAR and radar sensors add resilience in low-light or feature-poor scenes. Work at NASA's Jet Propulsion Laboratory has demonstrated that LiDAR-vision fusion provides precise range and geometry that cameras alone cannot recover, particularly from specular or textureless spacecraft surfaces. Radar can further supply coarse ranging and Doppler velocity when optical sensors are degraded, improving robustness to environmental conditions and extending operational envelopes.
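One common fusion pattern, shown in this minimal sketch (hypothetical frame and values), is to take the bearing to the target from camera feature detection and the range from LiDAR, each sensor supplying the component the other measures poorly. Both sensors are assumed extrinsically calibrated into a common frame.

```python
import numpy as np

def lidar_vision_position(bearing_cam, range_lidar):
    """Combine a camera bearing (direction to the target from feature
    detection) with a LiDAR range to recover the full 3D relative
    position: position = range * unit direction."""
    u = bearing_cam / np.linalg.norm(bearing_cam)  # normalize direction
    return range_lidar * u

# Camera sees the target 2 degrees off boresight; LiDAR measures 25.0 m.
bearing = np.array([np.sin(np.radians(2.0)), 0.0, np.cos(np.radians(2.0))])
p_rel = lidar_vision_position(bearing, 25.0)
```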

Practical considerations include computational limits on small satellites, illumination variability in cislunar space, and certification requirements for safety-critical proximity operations. Policy implications also arise: advanced rendezvous capabilities enable commercial servicing and debris mitigation but raise dual-use concerns for national security. Effective systems therefore combine redundant sensors, tightly coupled algorithms for low latency, and smoothing or batch optimization for global consistency.