
Microsoft and HPE put AI to the test on International Space Station … with gloves

An image of a spacewalker’s gloves is analyzed for signs of wear using AI. (Microsoft Photo)

“Check my spacewalking gloves, HAL.”

The HAL 9000 computer that starred in “2001: A Space Odyssey” — and made such a mess of maintenance issues on the Discovery One spaceship — isn’t on the job on the International Space Station. But Microsoft and Hewlett Packard Enterprise are teaming up with NASA to put artificial intelligence to work on mundane orbital tasks, starting with the chore of checking spacewalkers’ gloves for wear and tear.

That’s just one of two dozen experiments in AI, cloud and edge computing that have been run on HPE’s Spaceborne Computer-2 since the hardware was sent to the space station a year ago.

“We’re bringing AI to space and empowering space developers off the planet with Azure, and it’s enabling the ability to build in the cloud and then deploy in space,” Steve Kitay, senior director of Azure Space at Microsoft, told GeekWire.

The glove-checking experiment adds a new twist to a decades-old spacewalk safety procedure. After each spacewalk, astronauts have to capture images of their gloves and downlink them to Mission Control, just to make sure there are no signs of damage or contamination that might pose a hazard.

Astronauts also do visual inspections of their gloves during spacewalks — and what those checks turn up can have serious consequences. In 2007, for example, NASA had to cut a shuttle damage assessment short when a spacewalker spotted a tear in his glove.

Microsoft worked out an AI-based way to streamline the inspection process.

“What we did in partnership with HPE and NASA is, we used Custom Vision, which is part of the Azure Cognitive Services suite,” Kitay said. “What it enables is the ability to develop AI models without necessarily having a Ph.D.”

An AI model was trained on the ground by clicking through pictures showing damaged and undamaged gloves. The model was uploaded to HPE’s computer on the space station, and then put to the test with the photos and videos that are recorded after spacewalks.
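In practice, that kind of model can be built by tagging images in the Custom Vision web portal, with no code at all. For readers who want a feel for the equivalent programmatic workflow, here's a minimal sketch using the Custom Vision Python SDK — the project name, tag names, file paths and keys are illustrative placeholders, not details of the actual NASA/HPE project:

```python
# Minimal sketch: train a two-class Custom Vision classifier
# (damaged vs. undamaged gloves). All names, paths and keys below
# are placeholders, not details of the actual flight project.
from msrest.authentication import ApiKeyCredentials
from azure.cognitiveservices.vision.customvision.training import CustomVisionTrainingClient

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
credentials = ApiKeyCredentials(in_headers={"Training-key": "<training-key>"})
trainer = CustomVisionTrainingClient(ENDPOINT, credentials)

# Create a classification project with two tags.
project = trainer.create_project("glove-wear-demo")
damaged = trainer.create_tag(project.id, "damaged")
undamaged = trainer.create_tag(project.id, "undamaged")

# Upload labeled example images for each class.
for path, tag in [("images/damaged_01.jpg", damaged),
                  ("images/undamaged_01.jpg", undamaged)]:
    with open(path, "rb") as image:
        trainer.create_images_from_data(project.id, image.read(), [tag.id])

# Kick off training; Custom Vision handles model selection and tuning.
iteration = trainer.train_project(project.id)
```

The appeal of the service, as Kitay notes, is that the heavy lifting of model architecture and tuning happens behind the scenes — the developer's job is mostly supplying labeled examples.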

“Before they’re even sent to the ground, they’re sent to the computer,” Kitay said. “The computer runs the AI algorithm very, very rapidly and is able to identify where there are areas of potential wear or damage. And it sends those concerns down to the ground, where the analysts can do further review.”
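Kitay's description amounts to a simple on-board triage loop: score each post-spacewalk image locally, and flag only the suspicious ones for the analysts on the ground. The sketch below shows that pattern under the assumption that the trained classifier has been exported to ONNX for offline use; the model file name, input shape and class ordering are assumptions rather than details of the actual deployment.

```python
# Sketch of on-board triage: score each post-spacewalk image locally and
# queue only the suspicious ones for downlink. Model file, input size and
# class order are illustrative assumptions, not the actual flight setup.
import numpy as np
import onnxruntime as ort
from PIL import Image

session = ort.InferenceSession("glove_wear_classifier.onnx")
input_name = session.get_inputs()[0].name

def damage_score(path, size=(224, 224)):
    """Return the model's 'damaged' score for one image."""
    img = Image.open(path).convert("RGB").resize(size)
    x = np.asarray(img, dtype=np.float32)[np.newaxis, ...]
    # NHWC layout assumed here; an exported model may expect NCHW or
    # different normalization -- adjust preprocessing to match.
    scores = session.run(None, {input_name: x})[0].ravel()
    return float(scores[0])  # assumes index 0 is the 'damaged' class

def triage(image_paths, threshold=0.5):
    """Split images into those worth downlinking for further review."""
    return [p for p in image_paths if damage_score(p) >= threshold]

if __name__ == "__main__":
    suspicious = triage(["eva_glove_001.jpg", "eva_glove_002.jpg"])
    print("Images flagged for downlink:", suspicious)
```

Running the model next to the data is the point of edge computing here: only the flagged images and their scores need to compete for the station's limited downlink bandwidth.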

It’s up to NASA’s mission planners to decide whether they want to work the AI inspections into their space station procedures on a permanent basis, but in the longer term, Kitay said AI tools are likely to become more important as NASA sends crews beyond Earth orbit.

Astronauts may have to rely on AI for life-and-death decisions “when you’re going to the moon and Mars and beyond — where there’s limited bandwidth and high latency, yet there needs to be an understanding of damage to any of the systems or capabilities,” Kitay said.

So what about HAL? Kitay said that unlike the rogue AI in “2001,” the systems developed by NASA and its partners are designed to keep humans in the loop.

“There’s usually a big difference between what you see in the movies vs. real life,” he said. “But what is impressive is how much this technology is advancing, how much we’re building capabilities that developers can use and really empowering them to do more.”

The NASA-HPE partnership is just one of the Microsoft initiatives highlighted during this week’s 37th Space Symposium in Colorado. Here are a few others:

  • Microsoft is partnering with Thales Alenia Space on a sensor system designed for Earth observation from the International Space Station, with a focus on climate data processing.
  • In partnership with Loft Orbital, Microsoft will develop new tools for developing, testing and validating software applications that can be executed by satellites in orbit. The new capabilities for software deployment are to be brought to market on a jointly used satellite launching in 2023, Microsoft said.
  • Ball Aerospace is planning to build a series of on-orbit testbed satellites that will allow for the agile implementation of new software and hardware for U.S. government applications. The satellites will demonstrate reconfigurable on-orbit processing technologies that leverage Microsoft’s Azure Cloud.
  • Azure Space has released a reference architecture with accompanying code samples, showing how to apply AI to satellite imagery at scale. Blackshark.ai’s geospatial analytics service, known as Orca, is now available via Microsoft Azure Synapse.
  • Microsoft and Intelsat have collaborated on a technology demonstration that combines Intelsat’s satellite-based FlexEnterprise service with services provided by Microsoft’s Azure Private 5G and Azure Orbital to deliver secure, high-speed 5G networks virtually anywhere on Earth.
  • Microsoft, SES and Nokia have successfully demonstrated secure access to the Azure cloud platform over private 5G and satellite communication networks for the Australian Defence Force.
  • Omnispace and Microsoft have partnered to create a joint architecture for an Azure-centric network that could supply 5G coverage to underserved areas around the world. The hybrid network would make use of terrestrial infrastructure as well as the satellite constellation that Omnispace is building in low Earth orbit.


from GeekWire https://www.geekwire.com/2022/microsoft-and-hpe-put-ai-to-the-test-on-international-space-station-with-gloves/
