NASA SBIR 2015 Solicitation

FORM B - PROPOSAL SUMMARY


PROPOSAL NUMBER: 15-1 H4.03-9054
SUBTOPIC TITLE: EVA Space Suit Power, Avionics, and Software Systems
PROPOSAL TITLE: EVA Space Suit Power, Avionics, and Software Systems

SMALL BUSINESS CONCERN (Firm Name, Mail Address, City/State/Zip, Phone)
Cybernet Systems Corporation
3885 Research Park Drive
Ann Arbor, MI 48108-2217
(734) 668-2567

PRINCIPAL INVESTIGATOR/PROJECT MANAGER (Name, E-mail, Mail Address, City/State/Zip, Phone)
Dr. Charles J. Cohen, PhD, PMP
proposals@cybernet.com
3885 Research Park Drive
Ann Arbor, MI 48108-2217
(734) 668-2567

CORPORATE/BUSINESS OFFICIAL (Name, E-mail, Mail Address, City/State/Zip, Phone)
Norma Heller
proposals@cybernet.com
3885 Research Park Drive
Ann Arbor, MI 48108-2217
(734) 668-2567

Estimated Technology Readiness Level (TRL) at beginning and end of contract:
Begin: 4
End: 6

Technology Available (TAV) Subtopics
EVA Space Suit Power, Avionics, and Software Systems is a Technology Available (TAV) subtopic that includes NASA Intellectual Property (IP). Do you plan to use the NASA IP under the award?
No

TECHNICAL ABSTRACT (Limit 2000 characters, approximately 200 words)
NASA is interested in a reliable, robust, low Size, Weight, and Power (SWaP) input device that will allow EVA astronauts to navigate display menu systems. The input device should provide mouse-like functionality while requiring minimal hand use. Cybernet proposes a solution that does not require any hand or glove control. Instead, we propose an input device that uses purposive eye blinks, eye motions, and limited vocal commands for display menu navigation.
Our reasoning is that the astronaut, especially on EVA, needs a method of accessing display menus in a minimally intrusive way. The astronaut's hands are usually occupied, so using them for mouse-like gestures is impractical. Taking a cue from Google Glass, and building on the eye tracking system and voice interaction system we previously developed for NASA, we are confident we can create a system that interprets purposive eye blinks and eye motions, allowing the astronaut to navigate display menus without interfering with other work.
Specifically, during Phase I we will create a feasibility demonstration of the following three input methods: eye gaze tracking, purposive eye blinks, and limited-vocabulary voice commands.
The combination of these three input methods should be relatively easy to learn and use (i.e., require minimal practice) and should not interfere with normal EVA operations. What is needed, though, is a small camera/microphone located within the astronaut's helmet that maintains a continual view of one or both of the astronaut's eyes. During Phase I we will implement a feasibility proof of the above input methods and research appropriate hardware. During Phase II we will acquire similar hardware for a full prototype system, enabling us to demonstrate low SWaP and measure accuracy and utility.

POTENTIAL NASA COMMERCIAL APPLICATIONS (Limit 1500 characters, approximately 150 words)
The major goal of this project is to research and develop an input device that provides astronauts performing EVAs with mouse-like functionality for navigating display menus. The concept demonstrated in this Phase I is intended to show the sponsor the feasibility of eye gaze detection, eye blink detection, voice recognition, and speech understanding. This technology will then be refined and integrated into a complete prototype system in Phase II that respects size, weight, and power limitations. The main tasks include planning integration into an astronaut's helmet, updating interface controls, and mechanical/hardware integration design. These development tasks will guide us toward a solution that is both practical and useful. The proposed project will expand the capabilities of Cybernet's core gesture technology to support human-computer interaction, especially for the disabled.

POTENTIAL NON-NASA COMMERCIAL APPLICATIONS (Limit 1500 characters, approximately 150 words)
We will leverage the work from this SBIR effort to adapt the NaviGaze product into a home care system for the profoundly disabled, first by satisfying the needs of residents at Beachwood Homes, and then expanding nationally and worldwide. NaviGaze enables the use of Windows-based computers and applications without a mouse, relying instead on head movement and eye blinks to control the cursor. The main customers are those with limited mobility due to disability.

TECHNOLOGY TAXONOMY MAPPING (NASA's technology taxonomy has been developed by the SBIR-STTR program to disseminate awareness of proposed and awarded R/R&D in the agency. It is a listing of over 100 technologies, sorted into broad categories, of interest to NASA.)
Command & Control
Image Processing

Form Generated on 04-23-15 15:37