A new program currently being tested by NASA would make it possible for an astronaut onboard an orbiting spacecraft to remotely operate a robot on a planet’s surface.
The exploration concept was tested on June 17 and July 26, the US space agency announced on Friday, and is similar in nature to the way scientists are able to explore the deepest parts of the ocean from the land above. NASA believes this Surface Telerobotics technology could one day be used to allow astronauts to orbit the moon, asteroids or Mars while performing work on the surface using robotic avatars.
“The initial test was notable for achieving a number of firsts for NASA and the field of human-robotic exploration,” said Terry Fong, director of the Intelligent Robotics Group (IRG) at NASA’s Ames Research Center and the project manager for the Human Exploration Telerobotics program. “Specifically, this project represents the first fully-interactive remote operation of a planetary rover by an astronaut in space.”
During the first test, Expedition 36 Flight Engineer Chris Cassidy — who was stationed on the International Space Station (ISS) at the time — remotely operated the K10 planetary rover in an outdoor robotic test area known as the Roverscape. The Roverscape, which is approximately the size of two football fields, is located at the Ames facility at Moffett Field, California.
That initial test lasted for over three hours, and during that time Cassidy successfully used the rover to complete a survey of the Roverscape’s rocky, lunar-like terrain, NASA said. The second test, completed on Friday, saw European Space Agency (ESA) astronaut and Expedition 36 Flight Engineer Luca Parmitano control the robot and begin deployment of a simulated Kapton film-based radio antenna.
According to NASA, these tests represent the first time that the agency’s open-source Robot Application Programming Interface Delegate (RAPID) robot data messaging system was used to control a robot from space. RAPID, the agency noted, was first developed by NASA’s Human-Robotic Systems project and is a group of software data structures and routines designed to simplify the process of communicating information between various robots and their command and control systems.
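The core idea of such a messaging layer is that the commanding station and the robot only need to agree on a shared message format, not on each other's internals. The following sketch illustrates that idea in miniature; the class name, fields, and JSON encoding are purely illustrative assumptions and do not reflect the actual RAPID data structures, which are published in NASA's open-source release.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical command message for a planetary rover. The field names and
# the JSON wire format here are illustrative only, not the real RAPID API.
@dataclass
class RobotCommand:
    robot_id: str    # which robot the command targets, e.g. "K10"
    command: str     # a named operation, e.g. "drive_to_waypoint"
    arguments: dict  # command-specific parameters

    def serialize(self) -> str:
        """Encode the command for transmission over the space-to-ground link."""
        return json.dumps(asdict(self))

    @staticmethod
    def deserialize(payload: str) -> "RobotCommand":
        """Decode a received command on the robot side."""
        return RobotCommand(**json.loads(payload))

# Round trip: the operator's station serializes, the rover deserializes.
cmd = RobotCommand("K10", "drive_to_waypoint", {"x": 12.0, "y": 4.5})
wire = cmd.serialize()
received = RobotCommand.deserialize(wire)
```

Because both ends share only the message definition, the same command stream could in principle drive different robots, which is the interoperability goal the RAPID description above points at.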
Furthermore, the tests mark the first time that NASA’s Ensemble-based software was used in space for telerobotics. Ensemble, which was jointly developed by researchers at Ames and the California-based Jet Propulsion Laboratory (JPL), is “an open architecture for the development, integration and deployment of mission operations software,” the agency explained. It was adapted from the Eclipse Rich Client Platform (RCP) framework and has been used to support mission operations software development over the past nine years, the agency added.
The primary objective of the Surface Telerobotics testing is to gather engineering data from astronauts onboard the ISS, the K10 robot and the data communication links, NASA said. Doing so will allow engineers to characterize the system and validate previous ground tests. During a third and final test session next month, engineers and astronauts will inspect the deployed antenna and study human-robot interaction.
“Whereas it is common practice in undersea exploration to use a joystick and have direct control of remote submarines, the K10 robots are more intelligent,” Fong said. “Astronauts interact with the robots at a higher level, telling them where to go, and then the robot itself independently and intelligently figures out how to safely get there.
“During future missions beyond low-Earth orbit, some work will not be feasible for humans to do manually. Robots will complement human explorers, allowing astronauts to perform work via remote control from a space station, spacecraft or other habitat,” he added. “This work really tests the notion that robots can project human presence to other planetary surfaces. Ultimately, this will allow us to discover and explore dangerous and remote places, whether they’re at the bottom of the ocean or at the far reaches of our solar system.”
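The supervisory-control approach Fong describes — the astronaut names a destination, the robot works out a safe route on its own — can be sketched with a toy path planner. Everything below is a minimal illustration under assumed simplifications (a small occupancy grid, breadth-first search), not the K10's actual navigation software.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Find a safe route from start to goal on an occupancy grid.

    grid: 2D list where 0 = traversable and 1 = obstacle.
    Returns a list of (row, col) cells, or None if no route exists.
    Uses breadth-first search, so the route is shortest in step count.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk the breadcrumb trail back from goal to start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None  # no safe route exists

# The "astronaut" supplies only the goal; the planner routes around the
# obstacle wall in the middle row.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = plan_path(grid, (0, 0), (2, 0))
```

The contrast with joystick teleoperation is that the operator never steers around the obstacles: over a high-latency space-to-ground link, sending one goal and letting the robot plan locally is far more practical than streaming continuous manual inputs.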