Stereo Vision for SPHERES-based Navigation and Monitoring
Status: Completed
Start Date: 2012-02-13
End Date: 2012-08-13
Description: Maintenance operations and scientific research on the International Space Station (ISS) require active monitoring. Currently the majority of monitoring and recording of data is performed by the ISS crew. These tasks, albeit relatively passive, often consume large blocks of a crew member's time. In the future, it would be desirable to offload much of this observational work onto experts and technicians on the ground, enabling the ISS crew members to focus on setup, control, and other tasks requiring greater dexterity. In addition, as recent events have shown, there exists a possibility that the ISS will be uncrewed for a period of time. Flight controllers will want views of the ISS when there is no crew aboard. Such a remote monitoring system must be capable of providing a wide variety of camera perspectives, covering the majority of the ISS's interior. It would be impractical to achieve adequate coverage using a network of mounted camera systems. The MIT Space Systems Laboratory developed SPHERES (Synchronized Position Hold Engage and Reorient Experimental Satellites) to provide a platform for conducting experiments with free-flying satellites in space. We propose to develop stereo-based visual navigation and human interaction algorithms that will increase the capabilities of SPHERES, and to demonstrate those algorithms using a ground-based simulator. The result will be more efficient and safer operation of space vehicles, freeing up crew and ground control resources.
Benefits: NASA relies on crew members to monitor and maintain the ISS. If the ISS should need to be evacuated for even a short time, flight controllers will not have sufficient on-board cameras to maintain monitoring capabilities. Remotely operated, free-flying satellites on board the ISS can offer those monitoring capabilities. Our technology will provide vision-based navigation for these free-flying satellites. The same vision-based navigation algorithms could also be used by Robonaut when it becomes mobile in the future. Our algorithms are also applicable to free-flying inspection robots outside a spacecraft, which would be useful even for robotic missions: imagine being able to inspect the stuck antenna of a probe while it is on its way to Jupiter. The same vision-based navigation algorithms are also applicable to NASA surface exploration robots such as SEV, Centaur, and MSL.
The Department of Defense (DOD) is investing heavily in remote robotic operations, including unmanned ground and aerial vehicles, and is beginning to equip these vehicles with sophisticated sensing systems. These sensing systems are used for Explosive Ordnance Disposal (EOD), medical operations, entering and clearing buildings, moving supplies, and unloading pallets. Our technology will greatly increase the usefulness of these robots in military environments, and we expect substantial interest from the DOD in these kinds of technologies. We are also working with the US Army on remote medical robotics applications and have connections with Mr. Michael Beebe, the Medical Robotics and Unmanned Systems R&D manager for the Telemedicine and Advanced Technology Research Center (TATRC) of the US Army. We are also investigating remote operation of robots on oil drilling platforms to reduce manpower and allow continued operation in the face of storms that require evacuation of platform personnel. Finally, we are investigating the automation of remotely operated underwater vehicles, such as those produced by Oceaneering, many of which need vision-based navigation technologies. This application is particularly timely after the Deepwater Horizon incident.
Lead Organization: TRACLabs, Inc.