OpenGaze: Gaze-Based Human-User Interface

When we think about assistive devices, particularly wheelchairs, we usually assume they are controlled by joysticks or by voice. Those well versed in the field might also think of brain-computer interfaces (BCIs). In some cases, however, none of these options is feasible. For severe disabilities, an alternative is to guide the wheelchair (or any other active device) with our eyes: eye motion can serve as a pointer, and gestures can be associated with mouse clicks, as sketched below.
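As a rough illustration of that idea (not Opengazer's actual control scheme; the command set, thresholds, and dwell time below are all hypothetical), a minimal Python sketch quantizing gaze coordinates into coarse wheelchair direction commands, with a dwell-time fixation standing in for a mouse click:

```python
import time

# Hypothetical screen-normalized gaze point: (0, 0) top-left, (1, 1) bottom-right.
DWELL_SECONDS = 1.0  # how long gaze must stay in one zone to trigger a selection


def gaze_to_command(x: float, y: float) -> str:
    """Quantize a gaze point into a coarse direction command (illustrative zones)."""
    if y < 0.33:
        return "FORWARD"
    if y > 0.66:
        return "BACKWARD"
    if x < 0.33:
        return "LEFT"
    if x > 0.66:
        return "RIGHT"
    return "STOP"  # looking at the center means no motion


def dwell_select(samples):
    """Yield a command once the same zone has been fixated for DWELL_SECONDS."""
    current, since = None, time.monotonic()
    for x, y in samples:
        cmd = gaze_to_command(x, y)
        if cmd != current:
            current, since = cmd, time.monotonic()  # gaze moved: restart the timer
        elif time.monotonic() - since >= DWELL_SECONDS:
            yield cmd
            since = time.monotonic()  # rearm after each selection
```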

Opengazer is an open-source application that uses an ordinary webcam to capture your face and estimate the direction of your gaze. This information can then be passed to other applications: used in conjunction with Dasher, for example, Opengazer lets you write with your eyes; combined with an assistive device controller, it could provide steering commands.
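Opengazer's actual output interface isn't described here, so purely as a sketch, assuming gaze estimates arrive as whitespace-separated `x y` lines over a local TCP socket (a made-up wire format and address), a downstream consumer might look like this:

```python
import socket

HOST, PORT = "127.0.0.1", 5555  # hypothetical address for a gaze stream


def gaze_points():
    """Yield (x, y) gaze estimates parsed from a line-oriented TCP stream."""
    with socket.create_connection((HOST, PORT)) as sock:
        buf = b""
        while True:
            chunk = sock.recv(4096)
            if not chunk:
                return  # producer closed the connection
            buf += chunk
            while b"\n" in buf:
                line, buf = buf.split(b"\n", 1)
                try:
                    x, y = map(float, line.decode("ascii").split())
                except (ValueError, UnicodeDecodeError):
                    continue  # skip malformed lines
                yield x, y


if __name__ == "__main__":
    for x, y in gaze_points():
        print(f"gaze at ({x:.2f}, {y:.2f})")  # hand off to Dasher, a pointer, etc.
```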

At the moment there is only a Linux version, which you have to compile yourself, so this is not for the technologically challenged. The current version is also reported to be sensitive to head motion, which, of course, implicitly changes the estimated gaze direction. The developers are working on a new version that filters out this effect.
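The post doesn't say how the planned filtering will work; as a generic illustration of smoothing noisy gaze estimates, here is an exponential moving average in Python (a standard technique, not necessarily what Opengazer will use):

```python
class GazeSmoother:
    """Exponential moving average over 2-D gaze points.

    alpha near 1.0 tracks the raw input closely; smaller values damp jitter
    (and, to a degree, small head movements) at the cost of added lag.
    """

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha
        self.state = None  # no estimate until the first sample arrives

    def update(self, x: float, y: float) -> tuple[float, float]:
        if self.state is None:
            self.state = (x, y)  # seed the filter with the first sample
        else:
            sx, sy = self.state
            a = self.alpha
            self.state = (a * x + (1 - a) * sx, a * y + (1 - a) * sy)
        return self.state
```

Compensating for genuine head motion (as opposed to jitter) would additionally require tracking the head pose and subtracting its contribution, which is presumably what the new version aims to do.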

The Opengazer project is supported by Samsung and the Gatsby Foundation, and by the European Commission in the context of the AEGIS project (open Accessibility Everywhere: Groundwork, Infrastructure, Standards).
