NASA STTR 2017 Solicitation


PROPOSAL NUMBER: 17-1 T11.02-9854
RESEARCH SUBTOPIC TITLE: Distributed Spacecraft Missions (DSM) Technology Framework
PROPOSAL TITLE: Vision-Based Navigation for Formation Flight onboard ISS

SMALL BUSINESS CONCERN
NAME: Jaycon Systems
STREET: 801 E Hibiscus
CITY: Melbourne
STATE/ZIP: FL  32901 - 3252
PHONE: (888) 226-4711

RESEARCH INSTITUTION
NAME: Florida Institute of Technology
STREET: 150 W. University Blvd.
CITY: Melbourne
STATE/ZIP: FL  32901 - 6975
PHONE: (321) 674-8000

PRINCIPAL INVESTIGATOR/PROJECT MANAGER (Name, E-mail, Mail Address, City/State/Zip, Phone)
Dr. Hector Gutierrez
150 W. University Blvd.
Melbourne, FL 32901 - 6975
(321) 298-5751

CORPORATE/BUSINESS OFFICIAL (Name, E-mail, Mail Address, City/State/Zip, Phone)
Mr. Jiten Chandiramani
801 E Hibiscus
Melbourne, FL 32901 - 3252
(321) 505-4560

Estimated Technology Readiness Level (TRL) at beginning and end of contract:
Begin: 4
End: 5

Technology Available (TAV) Subtopics
Distributed Spacecraft Missions (DSM) Technology Framework is a Technology Available (TAV) subtopic that includes NASA Intellectual Property (IP). Do you plan to use the NASA IP under the award?

TECHNICAL ABSTRACT (Limit 2000 characters, approximately 200 words)
The RINGS project (Resonant Inductive Near-field Generation Systems) was a DARPA-funded effort to demonstrate Electromagnetic Formation Flight and wireless power transfer in microgravity. Integration inconsistencies in both hardware and software prevented the experiment from achieving its objectives during the planned test sessions. A later project supported by NASA ARC focused on the assessment, diagnostics, correction, and ground testing of RINGS, to understand why RINGS failed to complete its science sessions and to assess whether these errors could be corrected in future missions. The assessment concluded that RINGS can be successfully used in future science sessions provided that a new metrology system is available to navigate RINGS in real time onboard ISS. The proposed study supports the implementation, integration, and ground testing of vision-based navigation of RINGS, using the Smartphone Video Guidance Sensor (SVGS) with SPHERES (Synchronized Position Hold Engage and Reorient Experimental Satellites). SVGS was developed at NASA MSFC for application on cubesats and small satellites to enable autonomous rendezvous and capture, and formation flying. SPHERES are free-flying robots that have been used for numerous experiments onboard ISS. Their metrology system is based on ultrasonic beacons, and does not operate correctly with large flyers due to multi-path signal reflections. The main objective of this study is the integration of SVGS (as a vision-based position and attitude sensor) with the SPHERES GN&C environment. Successful integration will be demonstrated by 3-DOF vision-based guidance, navigation, and motion control experiments on a flat floor using the RINGS ground units available at Florida Tech. Performance will be assessed by an independent vision-based metrology system that fuses data from high-resolution cameras. A path forward for deployment on ISS will be developed in coordination with NASA ARC.
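As a minimal illustration of the vision-based ranging principle underlying sensors such as SVGS (this is not the SVGS algorithm itself, and all numeric parameters below are hypothetical), the range to a target carrying two LED markers with a known physical baseline can be recovered from their apparent pixel separation under a pinhole camera model:

```python
# Illustrative sketch only: pinhole-model range estimation from the
# apparent separation of two markers with a known physical baseline.
# Parameter values (focal length, baseline, pixel separation) are assumed.

def estimate_range(pixel_sep, baseline_m, focal_px):
    """Return range z (meters) such that pixel_sep = focal_px * baseline_m / z."""
    return focal_px * baseline_m / pixel_sep

focal_px = 1000.0   # camera focal length in pixels (assumed)
baseline_m = 0.20   # marker-to-marker spacing in meters (assumed)
pixel_sep = 50.0    # measured separation in the image, pixels (assumed)

z = estimate_range(pixel_sep, baseline_m, focal_px)
print(z)  # 4.0 (meters)
```

A full 6-DOF solution such as SVGS solves the analogous perspective-n-point problem over four or more markers to recover both relative position and attitude; the one-dimensional case above only conveys the geometric idea.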

POTENTIAL NASA COMMERCIAL APPLICATIONS (Limit 1500 characters, approximately 150 words)
(1) The proposed effort will deliver a positioning/metrology system based on smartphones that can be used for navigation and positioning control applications in space robotics.
(2) Orientation and navigation in cubesat and smallsat missions. Cubesats capable of automatic docking and maneuvering can be used for inspection tasks, and cubesats capable of vision-based navigation can perform close-up science missions.
(3) Other applications: orbital debris mitigation, cubesat or smallsat formation flying, spacecraft docking, space robotic systems.

POTENTIAL NON-NASA COMMERCIAL APPLICATIONS (Limit 1500 characters, approximately 150 words)
1. The proposed Phase I effort will deliver a positioning/metrology system well suited for navigation and positioning control applications in robotics wherever vision-based feedback is desirable, such as in automated docking or inspection tasks.
2. The proposed vision-based GN&C sensor would also be well suited for positioning, navigation, and visual inspection tasks in cubesats.

TECHNOLOGY TAXONOMY MAPPING (NASA's technology taxonomy has been developed by the SBIR-STTR program to disseminate awareness of proposed and awarded R/R&D in the agency. It is a listing of over 100 technologies, sorted into broad categories, of interest to NASA.)
Command & Control
Navigation & Guidance
Relative Navigation (Interception, Docking, Formation Flying; see also Control & Monitoring; Planetary Navigation, Tracking, & Telemetry)
Robotics (see also Control & Monitoring; Sensors)

Form Generated on 04-19-17 12:45