Geometric Control for Autonomous Landing on Asteroid Itokawa using Visual Localization

Abstract

This paper considers the coupled orbit and attitude dynamics of a dumbbell spacecraft around an asteroid. Geometric methods are used to derive the coupled equations of motion, defined on the configuration space of the special Euclidean group, and a nonlinear controller is then designed to enable tracking of desired landing trajectories. Rather than relying on sliding mode control or optimization-based methods, the proposed approach avoids the increased control utilization and computational complexity inherent in those techniques. The nonlinear controller is used to track a desired landing trajectory to the asteroid surface. A monocular imaging sensor provides relative position and attitude estimates via visual odometry. We demonstrate this control scheme with a landing simulation about asteroid Itokawa.

Equations of motion of a dumbbell

The first step in landing on an asteroid is to determine a good model for the motion. In this paper, we explicitly treat the coupled problem, handling the rotational and translational motion of the spacecraft simultaneously. Typically, these dynamics are decoupled or treated separately. However, for motion around an asteroid, where the spacecraft is not small compared to its orbital radius, this is a poor assumption.

We model the spacecraft as a rigid dumbbell: two rigid, spherical masses connected by a massless rod. While very simple, this model captures the essential coupling between the rotational and translational dynamics. Applying the standard techniques of Lagrangian mechanics yields the coupled equations of motion.
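
As a rough sketch only (the exact expressions depend on the dumbbell geometry and the asteroid gravity model, so the symbols below are placeholders rather than the paper's final result), the coupled dynamics take the familiar rigid-body form, with x the position of the center of mass, v its velocity, R in SO(3) the attitude, Omega the body angular velocity, m the total mass, J the inertia tensor, U the gravitational potential, and (u_f, u_m) the control force and moment:

% Hedged sketch of the coupled translational/rotational dynamics
\begin{align}
  \dot{x} &= v, &
  m\,\dot{v} &= \frac{\partial U}{\partial x}(x, R) + u_f, \\
  \dot{R} &= R\,\hat{\Omega}, &
  J\,\dot{\Omega} + \Omega \times J\Omega &= M_{\mathrm{grav}}(x, R) + u_m,
\end{align}

where M_grav is the gravity-gradient moment that the non-uniform gravity field exerts on the two masses of the dumbbell. When the equations are written in a frame rotating with the asteroid, additional Coriolis and centripetal terms appear.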

Nonlinear controller

Using these coupled dynamics, we then design a nonlinear controller.

This controller allows us to track any desired trajectory using the control force and moment as inputs.
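
For concreteness, here is a minimal Python sketch of a geometric PD-type tracking law on SE(3) in the spirit of the controller used here. It is not the exact feedback law from the paper; the gains kx, kv, kR, kW, the desired trajectory (xd, vd, ad, Rd, Wd, dWd), and the gravity-gradient callable grad_U are illustrative assumptions.

import numpy as np

def hat(w):
    """Map a 3-vector to its skew-symmetric matrix."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def vee(S):
    """Inverse of hat: extract the 3-vector from a skew-symmetric matrix."""
    return np.array([S[2, 1], S[0, 2], S[1, 0]])

def tracking_control(x, v, R, W, xd, vd, ad, Rd, Wd, dWd,
                     m, J, grad_U,
                     kx=1.0, kv=2.0, kR=1.0, kW=2.0):
    """PD-type geometric tracking law on SE(3) with illustrative gains.

    Returns the control force (inertial frame) and moment (body frame).
    """
    # Translational errors; cancel gravity and track (xd, vd, ad)
    ex = x - xd
    ev = v - vd
    u_f = -kx * ex - kv * ev + m * ad - grad_U(x)

    # Attitude and angular velocity errors defined directly on SO(3)
    eR = 0.5 * vee(Rd.T @ R - R.T @ Rd)
    eW = W - R.T @ Rd @ Wd

    # Moment: PD feedback plus feedforward/cancellation terms
    u_m = (-kR * eR - kW * eW + np.cross(W, J @ W)
           - J @ (hat(W) @ R.T @ Rd @ Wd - R.T @ Rd @ dWd))
    return u_f, u_m

Computing the attitude feedback directly on SO(3) avoids the singularities and unwinding issues associated with Euler angles or quaternions, which is one of the main attractions of the geometric approach.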

Simulating images

One big challenge was the capability to generate our own images of an asteroid as seen from a simulated spacecraft. While there are a variety of actual images of asteroids, we want a way to generate arbitrary, yet realistic, views along any trajectory. For this work, we use Blender to render the images. One nice benefit of Blender is the ability to drive the rendering through its Python API, which lets us incorporate image generation directly into our simulation.
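
As a rough illustration of that workflow (assuming a Blender scene that already contains the Itokawa shape model, lighting, and a camera object named "Camera"; the pose values and output path below are placeholders):

import bpy  # Blender's Python API; run inside Blender or its bundled interpreter

def render_view(position, rotation_euler, filepath):
    """Place the camera at a desired pose and render a still image.

    position and rotation_euler are given in Blender's world frame; the
    scene (Itokawa mesh, lighting, camera) is assumed to be set up already.
    """
    cam = bpy.data.objects["Camera"]          # assumed camera object name
    cam.location = position
    cam.rotation_euler = rotation_euler

    scene = bpy.context.scene
    scene.camera = cam
    scene.render.filepath = filepath
    bpy.ops.render.render(write_still=True)   # write the image to disk

# Example: render one frame along a simulated trajectory (placeholder pose)
render_view((2.0, 0.0, 0.5), (1.2, 0.0, 0.3), "/tmp/itokawa_frame_0000.png")

Calling a function like this at each simulation step produces an image sequence tied directly to the spacecraft trajectory, which is what feeds the localization step below.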

Itokawa

Localization using ORB-SLAM2

Using these simulated images, we can then estimate the relative motion of the spacecraft. The ORB-SLAM2 framework allows us to both build a map and estimate the motion of the spacecraft from the simulated images. There is some difficulty when the spacecraft moves very close to the body; addressing this is a focus of future research.
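
For intuition, the sketch below estimates the relative camera motion between two consecutive rendered frames using OpenCV's ORB features and the essential matrix. This mimics the feature-based front end of a system like ORB-SLAM2 but is not its actual interface, and the camera intrinsics K are placeholders for the rendered camera.

import cv2
import numpy as np

def relative_pose(img1, img2, K):
    """Estimate rotation and unit-scale translation between two frames.

    img1, img2: grayscale images; K: 3x3 camera intrinsic matrix.
    Monocular vision only recovers translation up to an unknown scale.
    """
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    # Brute-force Hamming matching for binary ORB descriptors
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Essential matrix with RANSAC, then recover R and t (t has unit norm)
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t

# Placeholder intrinsics (focal length and principal point of the rendered camera)
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])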

Map points

Downloads

You can find the compiled version of this paper using the PDF icon at the top of this page.

BibTeX citation

@InProceedings{kulumani2017f,
  author       = {Kulumani, Shankar and Takami, Kuya and Lee, Taeyoung},
  title        = {Geometric Control for Autonomous Landing on Asteroid Itokawa using Visual Localization},
  booktitle    = {Proceedings of the AAS/AIAA Astrodynamics Specialist Conference, Stevenson, Washington},
  year         = {2017},
}