NASA SBIR 2011 Solicitation


PROPOSAL NUMBER: 11-1 O3.01-8173
SUBTOPIC TITLE: Remotely Operated Mobile Sensing Technologies for inside ISS
PROPOSAL TITLE: Stereo Vision for SPHERES-based Navigation and Monitoring

SMALL BUSINESS CONCERN (Firm Name, Mail Address, City/State/Zip, Phone)
TRACLabs, Inc.
100 N.E. Loop 410, Suite 520
San Antonio, TX 78216 - 1234
(281) 461-7886

PRINCIPAL INVESTIGATOR/PROJECT MANAGER (Name, E-mail, Mail Address, City/State/Zip, Phone)
Eric Huber
16969 N. Texas Ave. Suite 300
Webster, TX 77598 - 1234
(281) 461-7886 Extension: 705

Estimated Technology Readiness Level (TRL) at beginning and end of contract:
Begin: 3
End: 4

TECHNICAL ABSTRACT (Limit 2000 characters, approximately 200 words)
Maintenance operations and scientific research on the International Space Station (ISS) require active monitoring. Currently, the majority of monitoring and data recording is performed by the ISS crew. These tasks, albeit relatively passive, often consume large blocks of a crew member's time. In the future, it would be desirable to offload much of this observational work onto experts and technicians on the ground, enabling ISS crew members to focus on setup, control, and other tasks requiring greater dexterity. In addition, as recent events have shown, there exists a possibility that the ISS will be uncrewed for a period of time. Flight controllers will want views of the ISS interior when no crew are aboard. Such a remote monitoring system must be capable of providing a wide variety of camera perspectives covering the majority of the ISS's interior, and it would be impractical to gain adequate coverage using a network of fixed, mounted cameras. The MIT Space Systems Laboratory developed SPHERES (Synchronized Position Hold Engage and Reorient Experimental Satellites) to provide a platform for conducting experiments with free-flying satellites in space. We propose to develop stereo-based visual navigation and human interaction algorithms that will increase the capabilities of SPHERES, and to demonstrate those algorithms using a ground-based simulator. The result is more efficient and safer operation of space vehicles, freeing up crew and ground control resources.
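Stereo-based visual navigation ultimately rests on recovering depth from the disparity between matched pixels in two calibrated cameras. As a minimal illustrative sketch (the focal length, baseline, and disparity values below are hypothetical, not SPHERES camera parameters), the core relation for a rectified stereo pair is Z = f * B / d:

```python
# Illustrative stereo depth-from-disparity sketch. All numeric values are
# hypothetical examples, not actual SPHERES camera parameters.

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth Z = f * B / d for a rectified stereo pair.

    disparity_px : horizontal pixel offset of a feature between left/right images
    focal_px     : camera focal length expressed in pixels
    baseline_m   : distance between the two camera centers, in meters
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    return focal_px * baseline_m / disparity_px

# Example: 500 px focal length, 10 cm baseline, 25 px disparity -> 2.0 m depth
print(depth_from_disparity(25.0, 500.0, 0.10))
```

Note that depth resolution degrades quadratically with range, which is one reason a free-flying platform that can reposition its cameras close to a worksite is attractive for interior monitoring.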

POTENTIAL NASA COMMERCIAL APPLICATIONS (Limit 1500 characters, approximately 150 words)
NASA relies on crew members to monitor and maintain the ISS. If the ISS should need to be evacuated for even a short time, flight controllers will not have sufficient on-board cameras to maintain monitoring capabilities. Remotely operated, free-flying satellites aboard the ISS can fill this gap, and our technology will provide vision-based navigation for them. These same vision-based navigation algorithms could also be used by Robonaut when it becomes mobile in the future, and they are applicable to free-flying inspection robots outside a spacecraft. Such robots would be useful even for robotic missions: imagine being able to inspect the stuck antenna of a probe while it is en route to Jupiter. The same vision-based navigation algorithms also apply to NASA surface exploration robots such as SEV, Centaur, and MSL.

POTENTIAL NON-NASA COMMERCIAL APPLICATIONS (Limit 1500 characters, approximately 150 words)
The Department of Defense (DOD) is investing heavily in remote robotic operations, including unmanned ground and aerial vehicles, and is beginning to equip these vehicles with sophisticated sensing systems. These sensing systems are used for Explosive Ordnance Disposal (EOD), medical operations, entering and clearing buildings, moving supplies, and unloading pallets. Our technology will greatly increase the usefulness of these robots in military environments, and we expect substantial DOD interest in these kinds of technologies. We are also working with the US Army on remote medical robotics applications and have connections with Mr. Michael Beebe, the Medical Robotics and Unmanned Systems R&D manager for the Telemedicine and Advanced Technology Research Center (TATRC) of the US Army. We are also investigating remote operation of robots on oil drilling platforms to reduce manpower and allow continued operation during storms that require evacuation of platform personnel. Finally, we are investigating the automation of remotely operated underwater vehicles, such as those produced by Oceaneering, many of which need vision-based navigation technologies; this application is particularly timely after the Deepwater Horizon incident.

TECHNOLOGY TAXONOMY MAPPING (NASA's technology taxonomy has been developed by the SBIR-STTR program to disseminate awareness of proposed and awarded R/R&D in the agency. It is a listing of over 100 technologies, sorted into broad categories, of interest to NASA.)
Autonomous Control (see also Control & Monitoring)
Man-Machine Interaction
Robotics (see also Control & Monitoring; Sensors)

Form Generated on 11-22-11 13:43