EVA Space Suit Power, Avionics, and Software Systems
Status: Completed
Start Date: 2015-06-17
End Date: 2015-12-17
Description: NASA is interested in a reliable, robust, low Size, Weight, and Power (SWAP) input device that allows EVA astronauts to navigate display menu systems. The device should provide mouse-like functionality while requiring minimal hand use. Cybernet proposes a solution that does not require any hand or glove control. Instead, we propose an input device that uses purposive eye blinks, eye motions, and limited vocal commands for display menu navigation. Our reasoning is that an astronaut, especially on EVA, needs a minimally intrusive way of accessing display menus: the hands are usually occupied, so using them for mouse-like gestures is impractical. Taking a cue from Google Glass, and building on our eye tracking system and voice interaction system previously developed separately for NASA, we are confident we can create a system that interprets purposive eye blinks and motions, allowing the astronaut to navigate display menus without interfering with other work. Specifically, during Phase I we will create a feasibility demonstration that combines three input methods: eye gaze, purposive eye blinks, and limited-vocabulary voice commands. The combination of these three input methods should be relatively easy to learn and use (i.e., require minimal practice) and should not interfere with normal EVA operations. What is needed, though, is a small camera/microphone located within the astronaut's helmet that continually has a view of one or both of the astronaut's eyes. During Phase I we will implement a feasibility proof of the above input methods and research appropriate hardware. During Phase II we will acquire similar hardware for a full prototype system that will enable us to demonstrate low SWAP, as well as measure accuracy and utility.
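As an illustration only, the fusion of the three input channels described above (gaze to highlight, purposive blinks and voice to act) could be sketched as a small event-driven state machine. The class name, event names, and the 400 ms blink-duration threshold are our assumptions for this sketch, not details from the proposal.

```python
class MenuNavigator:
    """Sketch: navigate a flat menu using gaze, purposive blinks, and voice."""

    def __init__(self, items):
        self.items = list(items)
        self.cursor = 0          # menu item currently highlighted
        self.selected = None     # item confirmed by the user, if any

    def on_gaze(self, item_index):
        # Eye gaze moves the highlight directly to the item being looked at.
        if 0 <= item_index < len(self.items):
            self.cursor = item_index

    def on_blink(self, duration_ms):
        # Only long ("purposive") blinks confirm a selection; short
        # involuntary blinks are ignored. 400 ms is an assumed threshold.
        if duration_ms >= 400:
            self.selected = self.items[self.cursor]

    def on_voice(self, command):
        # Limited vocabulary: "next", "previous", "select".
        if command == "next":
            self.cursor = (self.cursor + 1) % len(self.items)
        elif command == "previous":
            self.cursor = (self.cursor - 1) % len(self.items)
        elif command == "select":
            self.selected = self.items[self.cursor]
```

The key design point this sketch captures is debouncing: involuntary blinks must be filtered out (here by duration) so that only deliberate input changes menu state.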
Benefits: The major goal of this project is to research and develop an input device that gives astronauts performing EVAs mouse-like functionality for navigating display menus. The Phase I concept demonstration is intended to show the sponsor that feasibility has been established through eye gaze detection, eye blink detection, voice recognition, and speech understanding. This technology will then be refined and integrated into a complete prototype system in Phase II that respects size, weight, and power limitations. The main tasks include planning the integration into an astronaut's helmet, updating the interface controls, and the mechanical/hardware integration design. These development tasks will guide us toward a solution that is both practical and useful. The proposed project will expand the capabilities of Cybernet's core gesture technology to support human-computer interaction, especially for the disabled.
We will leverage the work from this SBIR effort to update the NaviGaze product into a home care system for the profoundly disabled, first by satisfying the needs of residents at Beachwood Homes, and then nationally and worldwide. NaviGaze enables the use of Windows-based computers and applications without a mouse, relying instead on head movement and eye blinks to control the cursor. The main customers are people with limited mobility due to disability.
Lead Organization: Cybernet Systems Corporation