One Giant Leap for Lunar Landing Navigation



When Apollo 11’s lunar module, Eagle, landed on the Moon on July 20, 1969, it first flew over an area littered with boulders before touching down at the Sea of Tranquility. The site had been selected based on photos collected over two years as part of the Lunar Orbiter program.

But the “sensors” that ensured Eagle was in a safe spot before touching down were the eyes of NASA astronaut Neil Armstrong.

“Eagle’s computer didn’t have a vision-aided system to navigate relative to the lunar terrain, so Armstrong was literally looking out the window to figure out where to touch down,” said Matthew Fritz, principal investigator for a terrain relative navigation system being developed by Draper of Cambridge, Massachusetts. “Now, our system could become the ‘eyes’ for the next lunar lander module to help target the desired landing location.”

This week, that system will be tested in the desert near Mojave, California, on a launch and landing of Masten Space Systems’ Xodiac rocket. The rocket is scheduled to take off Wednesday, Sept. 11.

The rocket flight is made possible with support from NASA’s Flight Opportunities program, managed by NASA’s Armstrong Flight Research Center in Edwards, California, and the Game Changing Development program, overseen by NASA’s Langley Research Center in Hampton, Virginia. It marks the first test of the system with both a descent altitude and a landing trajectory similar to what is expected on a lunar mission.

But what is terrain relative navigation? And why is it so important to NASA’s Artemis program, which aims to return American astronauts to the Moon by 2024, and to future human missions to Mars?

Without a capability like GPS, which is designed to help us navigate on Earth, determining a lander’s location is much like driving a car and comparing visual cues along the route (road signs, prominent buildings, notable landmarks) with the same cues marked on a road map.

“We have onboard satellite maps loaded onto the flight computer and a camera acts as our sensor,” explained Fritz. “The camera captures images as the lander flies along a trajectory and those images are overlaid onto the preloaded satellite maps that include unique terrain features. Then by matching the features in the live images, we’re able to know where the vehicle is relative to the features on the map.”
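Conceptually, that image-to-map matching step can be sketched with off-the-shelf computer vision tools. The snippet below is only a minimal illustration of the general idea using OpenCV feature matching, assuming grayscale images and roughly flat terrain; it is not Draper’s flight software, and the function and parameter names are hypothetical.

    # Minimal sketch of camera-to-map terrain relative navigation.
    # Illustrative only -- not Draper's implementation.
    import cv2
    import numpy as np

    def estimate_map_position(map_img, camera_img, min_matches=10):
        """Locate a live camera frame within a preloaded satellite map.

        Both inputs are 8-bit grayscale images (e.g., loaded with
        cv2.imread(path, cv2.IMREAD_GRAYSCALE)). Returns the (x, y) pixel
        position of the camera frame's center in map coordinates, or None.
        """
        # Detect and describe distinctive terrain features in both images.
        orb = cv2.ORB_create(nfeatures=2000)
        kp_map, des_map = orb.detectAndCompute(map_img, None)
        kp_cam, des_cam = orb.detectAndCompute(camera_img, None)
        if des_map is None or des_cam is None:
            return None

        # Match binary descriptors, keeping only mutual best matches.
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des_cam, des_map),
                         key=lambda m: m.distance)
        if len(matches) < min_matches:
            return None

        src = np.float32([kp_cam[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([kp_map[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

        # A RANSAC homography rejects outlier matches (e.g., repetitive craters).
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        if H is None:
            return None

        # Project the center of the camera frame into map coordinates.
        h, w = camera_img.shape[:2]
        center = np.float32([[[w / 2.0, h / 2.0]]])
        x, y = cv2.perspectiveTransform(center, H)[0, 0]
        return float(x), float(y)

A real system would fold such position fixes into a navigation filter along with inertial data; this sketch stops at the raw fix.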


When astronauts return to the Moon by 2024, a camera-aided terrain relative navigation system will provide real-time, precise mapping of the lunar surface with images laid over preloaded satellite maps on the lander’s onboard computer. The image on the left, taken during a 2019 drone flight over California's Mojave Desert, shows terrain features identified by the navigation system's camera. These are matched to known features identified in satellite images on the onboard computer. Credits: Draper

While the Apollo Guidance Computer was a revolutionary feat of engineering for its time, today’s technology would certainly have been welcome assistance. With the computer sounding alarms and Eagle quickly running out of fuel, Armstrong was doing his best to find a safe parking spot.

So, it’s no surprise that NASA and its commercial partners are relying on the most advanced technology available to upgrade navigation for future robotic and crewed missions to the Moon. The agency is developing a suite of precision landing technologies for possible use on future commercial lunar landers. NASA is already buying services for robotic Moon deliveries and plans to ask American companies to build the next-generation human landing systems.

The agency’s work to develop navigation sensors and related technologies falls under a larger effort now referred to as SPLICE, or the Safe and Precise Landing – Integrated Capabilities Evolution project. SPLICE has evolved out of other NASA projects dating back to the early 2000s, all created to develop an integrated suite of landing and hazard avoidance capabilities for planetary missions. Contributions hail from several commercial efforts and multiple NASA centers.

Terrain relative navigation is key to the overall SPLICE effort, which also includes navigation Doppler lidar, hazard detection lidar, and a high-performance onboard computer. Working together, the full suite of capabilities promises to give future crewed missions much safer and more precise descents and landings on the lunar surface.
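To see why those capabilities complement one another, consider a deliberately simplified, one-dimensional sketch: Doppler lidar supplies velocity for dead reckoning, while terrain relative navigation occasionally supplies an absolute position fix that corrects the accumulated drift. This toy Kalman-style filter is only an assumed illustration of that interplay, not SPLICE’s actual fusion algorithm; every name and noise value here is hypothetical.

    # Toy 1-D illustration of fusing two sensor types, loosely inspired
    # by the SPLICE suite (not its actual algorithms): Doppler lidar
    # gives velocity, terrain relative navigation gives position fixes.
    import numpy as np

    def fuse_descent(pos0, lidar_vel, trn_fixes, dt, q=0.05, r=4.0):
        """Toy one-dimensional position estimate during descent.

        lidar_vel: velocity measurement per step (m/s), used to propagate.
        trn_fixes: absolute position fix per step (m); NaN when no fix.
        q, r: assumed process and measurement noise variances.
        """
        x, p = pos0, 10.0              # state estimate and its variance
        track = []
        for v, z in zip(lidar_vel, trn_fixes):
            # Predict: dead-reckon forward using the lidar velocity.
            x += v * dt
            p += q
            # Update: fold in a TRN position fix whenever one arrives.
            if not np.isnan(z):
                k = p / (p + r)        # Kalman gain
                x += k * (z - x)
                p *= 1.0 - k
            track.append(x)
        return track

Between fixes the estimate drifts with the integrated velocity error; each TRN fix pulls it back toward truth. That interplay, generalized to a full 3-D state and real sensor models, is the kind of complementarity the SPLICE suite is built around.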





