Neural networks are making it possible to view parts of the Moon that are perpetually shrouded by darkness.
What’s new: Valentin Bickel at ETH Zürich and colleagues devised a method called Hyper-effective Noise Removal U-net Software (HORUS) to remove noise from images of the Moon’s south pole, where direct sunlight never falls. The National Aeronautics and Space Administration (NASA) is using the denoised images to plan lunar missions that will put humans on the Moon for the first time in decades.
The challenge: The only light that strikes the lunar south pole’s craters, boulders, mounds, and crevasses comes from scant photons that reflect off Earth or nearby lunar landforms or arrive from faraway stars. An imaging system aboard NASA’s Lunar Reconnaissance Orbiter can capture features lit this way, but it tends to register photons where none exist. Transmitting and processing the images introduces further noise, obscuring details in the already dim scenes. Removing the noise makes the most of the scarce light, making it possible to see the landscape.
How it works: The authors trained two neural networks to remove the noise from lunar images.
- Using 70,000 calibration images collected during the Lunar Reconnaissance Orbiter’s mission, a convolutional neural network (CNN) called DeStripeNet learned to generate an array of pixels that simulates the camera-produced noise in a given image when fed that image’s metadata, such as the temperatures of the camera and other hardware components. It then removed this noise by subtracting the generated pixel values from the original image.
- A U-Net CNN called PhotonNet was trained on pairs of images of sunlit lunar regions. The images were artificially darkened, and one in each pair was further corrupted with noise generated by a mathematical model, representing errors from sources such as the data compression applied when transmitting images to Earth. PhotonNet learned to remove this noise, cleaning up the output of DeStripeNet to produce a sharper image (see the code sketch after this list).
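The two-stage pipeline lends itself to a compact illustration. Below is a minimal PyTorch sketch of how such a denoiser could be wired together; the module names, layer sizes, darkening factor, and noise model are illustrative assumptions, not the published HORUS architecture.

```python
import torch
import torch.nn as nn


class DeStripeNetSketch(nn.Module):
    """Predicts a full-frame map of camera-produced noise from capture
    metadata (e.g., sensor temperatures). Layer sizes are illustrative."""

    def __init__(self, n_metadata: int, height: int, width: int):
        super().__init__()
        self.h, self.w = height // 8, width // 8
        self.fc = nn.Linear(n_metadata, 64 * self.h * self.w)
        self.upsample = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1),
        )

    def forward(self, metadata: torch.Tensor) -> torch.Tensor:
        x = self.fc(metadata).view(-1, 64, self.h, self.w)
        return self.upsample(x)  # predicted noise map, same size as the image


class PhotonNetSketch(nn.Module):
    """A toy U-Net-style encoder/decoder that maps a dark, residually noisy
    image to a cleaner one."""

    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),
        )

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(image))


def make_photonnet_pair(sunlit: torch.Tensor, darken: float = 0.01):
    """Builds one (noisy, clean) training pair for PhotonNet: darken a sunlit
    image, then corrupt a copy with shot noise plus a crude stand-in for
    compression error. The darkening factor and noise levels are assumptions."""
    clean = sunlit * darken
    shot_noisy = torch.poisson(clean * 255.0) / 255.0  # photon (shot) noise
    compression = 0.005 * torch.randn_like(clean)      # transmission/compression error stand-in
    noisy = (shot_noisy + compression).clamp(0.0, 1.0)
    return noisy, clean


def denoise(image, metadata, destripe: DeStripeNetSketch, photon: PhotonNetSketch):
    """Stage 1: subtract the metadata-predicted camera noise.
    Stage 2: let the U-Net-style model remove the remaining noise."""
    stage1 = image - destripe(metadata)
    return photon(stage1)
```

The split mirrors the division of labor described above: the first network needs only capture metadata to predict the systematic camera noise, while the second operates on pixels to remove the remaining photon and compression noise.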
Results: HORUS removed noise from 200,000 images of the lunar surface. The authors identified possible landing sites, hazards to avoid, and evidence that some areas may contain water ice beneath the surface.
Behind the news: The Moon’s south pole is the target for NASA’s upcoming Artemis program. Artemis 1, scheduled to launch in late September, will be fully automated. Artemis 2, scheduled for 2024, will carry a crew around the Moon, and Artemis 3 aims to land humans on the lunar surface for the first time since NASA’s final Apollo mission in 1972.
Why it matters: NASA chose the Moon’s south pole as the target for future missions because water may be frozen at the bottoms of craters there. Water on the Moon could provide clues about the heavenly body’s origin as well as hydration, radiation shielding, and propellant for missions farther out in the solar system.
We’re thinking: This AI project is out of this world!