Artificial intelligence in space exploration, which may have seemed a distant endeavour only years ago, is now a reality.
Recent launches from SpaceX and NASA, and the landing of the Perseverance rover on Mars (source), highlight that multi-planetary travel for humans may soon become more than just a dream or something we see in movies.
Planetary Transportation Systems (PTS), formerly known as Part-Time Scientists (a name I enjoy more), is a company I’ve been following since my time at Vodafone, when they were working to put the first LTE base station on the Moon (https://pts.space/partners/vodafone/).
So how, then, are AI and robotics in space affecting our advancements and reshaping our capabilities?
Despite potentially sounding like pure sci-fi, the use of AI systems to observe, analyse, and explore outer space is nothing new. AI programmes have been used to schedule Hubble Space Telescope observations since 1993, and in 2017, deep neural networks were trained to detect and classify radio signals with 95% accuracy to aid in the search for extra-terrestrial intelligence (Technology.org).
NASA has used machine learning to classify planets and solar systems similar to our own by identifying the elements present within a planet’s atmosphere. Experts are also now exploring how AI and ML can be used for interstellar navigation. In 2017, Space reported that NASA awarded a $330,000 research grant to an exciting team developing AI and blockchain technology to guide a ship amid space debris, mitigating the delay time for deep-space travel.
Much of this has been realised with NASA’s Mars rover Perseverance and its recent upgrades. Using Terrain-Relative Navigation, the Mars 2020 mission team can consider more and more interesting landing sites with far less risk (https://mars.nasa.gov/mars2020/mission/technology/).
Avoiding Hazards During Landing: This animation depicts the Terrain-Relative Navigation technique incorporated into the entry, descent, and landing for the Mars 2020 rover. By taking images of the surface during its descent, the rover can quickly determine whether it is headed toward an area of its landing zone that the mission team has determined is hazardous. If necessary, a divert manoeuvre can send it toward safer terrain. Credit: NASA/JPL-Caltech
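To make the idea concrete (this is a toy sketch, not NASA's actual landing software), Terrain-Relative Navigation can be pictured as matching a descent image against an onboard orbital map via normalised cross-correlation, then checking a hazard mask at the matched location to decide whether to divert:

```python
import numpy as np

def locate_and_check(descent_img, orbital_map, hazard_mask):
    """Slide the descent image over the orbital map, find the
    best-matching offset, and report whether that spot falls in a
    region the mission team has marked hazardous.
    (Toy normalised cross-correlation; real TRN matches landmarks.)"""
    h, w = descent_img.shape
    H, W = orbital_map.shape
    d = (descent_img - descent_img.mean()) / (descent_img.std() + 1e-9)
    best, best_score = (0, 0), -np.inf
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            patch = orbital_map[r:r + h, c:c + w]
            p = (patch - patch.mean()) / (patch.std() + 1e-9)
            score = float((d * p).mean())  # correlation of z-scored images
            if score > best_score:
                best_score, best = score, (r, c)
    r, c = best
    hazardous = bool(hazard_mask[r:r + h, c:c + w].any())
    return best, hazardous
```

If the matched window overlaps any hazardous cell, the caller would trigger a divert manoeuvre toward safer terrain, as the animation above shows.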
AI systems are also used to monitor craft and robots for predictive maintenance. Reducing machine downtime and decreasing the risk of emergency repairs by identifying potential failures ahead of time will be even more important in outer space.
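A minimal flavour of predictive maintenance — not any agency's actual system — is a rolling anomaly check over telemetry that flags readings drifting well outside the recent baseline, so an issue can be investigated before it becomes an emergency:

```python
import statistics

def flag_anomalies(readings, window=20, z_thresh=3.0):
    """Flag telemetry readings more than z_thresh standard deviations
    away from the mean of the preceding `window` readings."""
    flags = []
    for i, x in enumerate(readings):
        baseline = readings[max(0, i - window):i]
        if len(baseline) < 5:          # not enough history yet
            flags.append(False)
            continue
        mu = statistics.fmean(baseline)
        sd = statistics.pstdev(baseline) or 1e-9
        flags.append(abs(x - mu) / sd > z_thresh)
    return flags
```

Real systems use far richer models (learned from sensor histories), but the principle — compare current behaviour against an expected baseline — is the same.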
Beyond the growing list of applications above, what else is AI being used for in space exploration, and what are other space agencies doing in this area? The rest of this article will home in on other important areas where AI has been developed to aid space exploration, as well as how other space agencies are adapting their technology to compete in this…space (no pun intended!)
Currently, astronauts communicate with Earth through radio waves. While this works well enough for talking to the International Space Station, thanks to its low Earth orbit, it will not support real-time communication over greater distances. To put this into perspective, radio messages from astronauts on the Moon take about 1.27 seconds to reach Earth, whereas messages from Mars and beyond can take upwards of 13 minutes one way (ESA). In the case of Perseverance above, messages could take anywhere from 5 to 40 minutes to be relayed.
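These delays follow directly from the speed of light. A quick back-of-the-envelope check (the distances below are approximate textbook figures, not mission data):

```python
C_KM_S = 299_792.458  # speed of light in km/s

def one_way_delay_s(distance_km):
    """One-way light-time for a radio signal over distance_km."""
    return distance_km / C_KM_S

moon_delay = one_way_delay_s(384_400)      # Earth-Moon average: ~1.28 s
mars_near = one_way_delay_s(54_600_000)    # Mars at closest approach: ~3 min
mars_far = one_way_delay_s(401_000_000)    # Mars near conjunction: ~22 min
```

Doubling the Mars figures for a round trip gives roughly 6 to 45 minutes, which matches the 5-to-40-minute range quoted for relayed messages.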
Knowing this, NASA has been developing AI to help close the gap in message delivery times as space exploration leads astronauts further out into the cosmos, allowing communication systems to adapt without human intervention (NASA). A cognitive radio network could suggest alternate data paths to the ground, prioritising and routing data through multiple paths simultaneously to avoid interference and speed up transmissions. Cognitive radio’s AI could also allocate ground station downlinks hours in advance rather than weeks, leading to more efficient scheduling (NASA).
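The “alternate data paths” idea can be pictured as ordinary shortest-path routing over link costs that reflect current interference, rerun whenever conditions change. A simplified sketch (the relay names and costs are invented for illustration):

```python
import heapq

def best_path(links, src, dst):
    """Dijkstra over an undirected link graph. Each link is
    (node_a, node_b, cost), where cost models current interference
    or congestion; rerunning with updated costs picks a new route."""
    graph = {}
    for a, b, cost in links:
        graph.setdefault(a, []).append((b, cost))
        graph.setdefault(b, []).append((a, cost))
    dist, prev = {src: 0.0}, {}
    pq = [(0.0, src)]
    while pq:
        d, node = heapq.heappop(pq)
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nxt, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(nxt, float("inf")):
                dist[nxt], prev[nxt] = nd, node
                heapq.heappush(pq, (nd, nxt))
    if dst not in dist:
        return None
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path))
```

With a clear orbiter relay, traffic flows rover → orbiter → ground; if interference raises that link's cost, the same routine falls back to the direct link — a crude stand-in for what a cognitive radio does continuously.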
Finding your way through space isn’t yet as easy as pulling up a GPS-powered navigation app, but researchers from the Frontier Development Lab (FDL) and Intel are working to change that. Their planetary navigation research, presented at an Intel-hosted event in 2016, centred on one big question for an AI-based space navigation system: if we could feed an AI enough pictures of the surface of a celestial body, could a person simply take a photo of their surroundings and have the system figure out where they were, ultimately directing them to where they wanted to be?
To see if this kind of system could work, the researchers built a virtual moon. The team generated 2.4 million images of the lunar surface as a hypothetical rover would capture them, then fed those images to a neural network to construct the virtual moon. According to the team, this was enough to effectively enable navigation on the virtual moon’s surface. In theory, a person standing on the Moon should be able to localise themselves by taking pictures of their surroundings and having the AI compare those real images with the simulated ones.
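At its core, that matching step amounts to nearest-neighbour search over image features. The sketch below uses raw pixels and cosine similarity as a stand-in for the team's learned neural-network features (the function names are mine, not theirs):

```python
import numpy as np

def build_index(sim_images, positions):
    """Flatten each simulated surface view into a unit-length feature
    vector, paired with the location it was rendered from."""
    feats = np.stack([img.ravel() / (np.linalg.norm(img) + 1e-9)
                      for img in sim_images])
    return feats, positions

def localise(photo, feats, positions):
    """Return the stored position whose simulated view best matches
    the photo (cosine-similarity nearest neighbour)."""
    q = photo.ravel() / (np.linalg.norm(photo) + 1e-9)
    return positions[int(np.argmax(feats @ q))]
```

A real system would embed both the simulated renders and the astronaut's photo with the same trained network, so that views of the same spot land close together even under different lighting and viewing angles.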
The next goal for these researchers is to do the same thing with a real celestial body: Mars. This rests on their assumption that enough satellite imagery exists to make it happen. If it works, astronauts should be able to find their way on the Martian surface in much the same way.
The Perseverance rover takes navigation into its own hands using AI in its Autonomous Exploration for Gathering Increased Science (AEGIS) system. The intelligent targeting software enables improved autonomous navigation using AI and automated sample collection using its new and improved SuperCam (https://www.enterpriseai.news/2021/02/19/perseverance-rover-lands-on-mars-heres-how-it-will-use-ai/).
The German Aerospace Center: AI assistants in space
In 2018, the German Aerospace Center (DLR) launched the Crew Interactive Mobile Companion (CIMON), the world’s first flying, autonomous astronaut assistant featuring AI. It also became the newest ‘crew member’ on the ISS, demonstrating the cooperation between humans and intelligent machines.
CIMON was a collaborative effort between Airbus, DLR, and IBM, and was created using 3D printing technology. It features a screen that displays a ‘face’ with human-like expressions as the astronauts communicate with it. In December 2019, a new and improved CIMON 2 was launched into space with some significant improvements over its predecessor. For instance, the newer version has been updated with the “Watson Tone Analyser” from the IBM Cloud, giving CIMON 2 the ability to assess and react to astronauts’ emotions.
Above all, this technological innovation was a means of demonstrating how humans and robots can collaborate in a space environment. CIMON 2 has been described as an empathetic conversational partner. In addition, CIMON 2 could make work more efficient on the space station, helping pass on instructions for repairs, documenting experiments and offering voice-controlled access to reference material.
A number of tests have also been carried out on CIMON 2, like its autonomous flight capabilities, voice-controlled navigation, and its ability to understand and complete various tasks. It also managed to fly to a specific point in the ISS Columbus module for the first time. Thanks to absolute navigation capabilities, CIMON 2 was able to follow verbal commands to move to a particular location, regardless of where it was to begin with.
In addition, this project aims to research whether intelligent assistants such as CIMON could help monitor astronauts’ mental wellbeing. As an assistant, CIMON could support astronauts with their high workloads, thereby reducing their exposure to stress. It is safe to say that this project is laying the foundations for social assistance systems that could reduce stress resulting from isolation or group dynamics during long-term missions.
Artificial Intelligence in Space
Clearly, the concept of AI in space is no longer foreign to us, and when we talk about AI applications or the applications of its subsets, in particular machine learning and deep learning, the scope is far beyond what humans might have imagined. Human intelligence needs complementary technology to further understand the intricacies of space, and right now, AI seems to be the best model for serving that purpose.
Microsoft are even joining the mission, extending Azure cloud capabilities to the ISS to enable advanced AI and machine learning models that support new insights and research advancements. So, if moving to the cloud is still in doubt, consider the advantage you have over the ISS’s 2 Mbps connection when delivering data-driven insights for your business.
For more insights into the applications of AI in our lives, visit our blog here or get in touch below to discuss your project!