NASA SBIR 2011 Solicitation
FORM B - PROPOSAL SUMMARY
PROPOSAL NUMBER: 11-1 X15.01-8981
SUBTOPIC TITLE: A New Technique for Automated Analyses of Raw Operational Videos
PROPOSAL TITLE: Perception Engine for Activity Recognition and Logging
SMALL BUSINESS CONCERN (Firm Name, Mail Address, City/State/Zip, Phone)
TRACLabs, Inc.
100 N.E. Loop 410, Suite 520
San Antonio, TX 78216-1234
(281) 461-7886
PRINCIPAL INVESTIGATOR/PROJECT MANAGER (Name, E-mail, Mail Address, City/State/Zip, Phone)
Patrick Beeson
pbeeson@traclabs.com
100 N.E. Loop 410, Suite 520
San Antonio, TX 78216-1234
(281) 461-7884 Extension: 707
Estimated Technology Readiness Level (TRL) at beginning and end of contract:
Begin: 2
End: 3
TECHNICAL ABSTRACT (Limit 2000 characters, approximately 200 words)
Tens of thousands of hours of video footage already exist, and countless more will be logged as spacecraft continue to orbit the Earth and explore the solar system. These video logs contain immense amounts of useful data on crew social interactions, crew task performance, and crew-vehicle interaction. Currently, these videos must be searched and indexed by hand, a slow process that requires many hours of labor.
Automated video processing techniques can be integrated into a comprehensive toolbox that drastically reduces the time needed to search and analyze videos. Such a toolbox would allow specific regions of a video stream to be isolated for monitoring, providing quick indexing for human viewing of all motion-based activity in an area of a vehicle. It would also allow the user to query for specific activities or events that occurred in that region; these could be detected automatically by software and presented directly to the user.
In support of NASA's needs, we propose to design a system that detects and tracks humans, human activity, human-station interaction, and team interactions using existing cameras and videos. Our overall objectives can be achieved by developing a suite of algorithms that can handle several key sub-challenges: 1) Robustly handling unconstrained video content and capture conditions; 2) Extracting functional descriptions of complex human events; 3) Handling ad hoc event queries effectively; 4) Operating efficiently, so the system can keep up with the flood of videos being added to current databases and provide effective interactive search over such databases.
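As a concrete illustration of the region-monitoring idea described above, the sketch below shows one simple way such indexing could work: frame differencing restricted to a user-selected region of interest, flagging frames where pixel change exceeds a threshold. The function name, parameters, and threshold are hypothetical and for illustration only; a production system would use more robust techniques (e.g., background modeling and tracking) than raw frame differencing.

```python
import numpy as np

def detect_region_motion(frames, region, threshold=10.0):
    """Flag frames where the mean absolute pixel change inside `region`
    (x, y, w, h) exceeds `threshold`. Returns a list of frame indices,
    giving a coarse motion-based index over the region of interest."""
    x, y, w, h = region
    events = []
    prev = None
    for i, frame in enumerate(frames):
        roi = frame[y:y + h, x:x + w].astype(np.float32)
        if prev is not None and np.mean(np.abs(roi - prev)) > threshold:
            events.append(i)
        prev = roi
    return events

# Synthetic demo: a static scene where a bright patch appears in
# frames 5-7, then disappears. Changes are flagged at entry and exit.
frames = [np.zeros((100, 100), dtype=np.uint8) for _ in range(10)]
for i in (5, 6, 7):
    frames[i][20:40, 20:40] = 200

print(detect_region_motion(frames, region=(10, 10, 50, 50)))  # [5, 8]
```

In practice, the flagged indices would be converted to timestamps and presented to the analyst as candidate activity segments, so that only a small fraction of the raw footage needs human review.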
POTENTIAL NASA COMMERCIAL APPLICATIONS (Limit 1500 characters, approximately 150 words)
Our immediate NASA application is finding key events in video logs from the space station and from ground testing. Our intended users include the Habitability and Environmental Factors Division at NASA Johnson Space Center. This work could have immediate application aboard the International Space Station (ISS). The system could be used to monitor specific areas of the station that have chronic maintenance issues of unknown cause, to analyze individual patterns in crew members and highlight unusual behaviors, and to monitor crew interaction issues, both with each other and with specific hardware on the station. Our system is also applicable to vehicle/habitat design, by analyzing video of how environments are actually used by crew members.
POTENTIAL NON-NASA COMMERCIAL APPLICATIONS (Limit 1500 characters, approximately 150 words)
The military is a major consumer of video analysis software. We believe the innovations in this project will enable a general-purpose multimedia interpretation system that dramatically improves the productivity of intelligence community analysts working at such places as the National Media Exploitation Center within the Defense Intelligence Agency. While DARPA's Mind's Eye program has funded scene analysis for verbs, it has not focused on a toolbox that puts humans in the loop of the video analysis process. By allowing users to make ad hoc queries using selected processing components on specific regions of video, expert human knowledge that has yet to be automated can be leveraged to detect novel events.
We expect to market our software to military customers. Additional non-NASA applications include activity recognition and configurable video monitoring for airport security, large factories and plants, oil exploration operations, and hospitals. The educational arena is also a potential consumer: student and classroom activity at universities can be monitored to improve facility maintenance and, potentially, instructor performance. We also see civilian applications in searching for critical events in massive unconstrained video databases, such as those on YouTube and Facebook.
TECHNOLOGY TAXONOMY MAPPING (NASA's technology taxonomy has been developed by the SBIR-STTR program to disseminate awareness of proposed and awarded R/R&D in the agency. It is a listing of over 100 technologies, sorted into broad categories, of interest to NASA.)
Diagnostics/Prognostics
Health Monitoring & Sensing (see also Sensors)
Image Analysis
Perception/Vision
Form Generated on 11-22-11 13:43