Tuesday, May 25, 2010

EyePhone Allows Users to Control a Smart Phone with Eye Movement

EyePhone tracks a person’s eye relative to a phone’s screen, letting users activate applications by blinking.
It’s hard to send a text message with your arms full of groceries or while wearing winter gloves. Voice control is one alternative to using your fingers, but researchers are also working on other hands-free ways to control mobile devices. A team at Dartmouth College has now created an eye-tracking system that lets a user operate a smart phone with eye movement.

Eye tracking has been used for years, primarily as a way for people with disabilities to operate computers and as a tool for advertisers to track a person’s focus of interest. “The naturalness of gaze interaction makes eye tracking promising,” says John Hansen, an associate professor at the IT University of Copenhagen in Denmark who works on gaze tracking. “Most of the time we are looking at the information we find most interesting.”
Mobile eye tracking could be useful for all mobile phone users, says Dartmouth professor Andrew Campbell, who led the development of the new system, called EyePhone. But so far, little work has been done on eye tracking for mobile phones. This isn’t surprising: tracking a gaze with a mobile phone is much more challenging than on a desktop computer, because both the user and the phone are moving, and the surrounding environment is constantly changing.
“Existing algorithms were highly inaccurate in mobile conditions; even if you are standing and there’s a small movement in your arm, you’d get a large amount of blurring and error,” says Campbell. The Dartmouth researchers got around this with an algorithm that learns to identify a person’s eye under different conditions. During a learning phase, the system is trained to recognize the eye at varying distances and under different lighting; the user calibrates it by taking a picture of the left or right eye both indoors and outdoors.
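The Dartmouth team hasn’t published its calibration code, but the template-learning idea can be sketched roughly as follows. This is a minimal illustration in Python with OpenCV; the function names, the histogram-equalization step, and the matching threshold are all hypothetical choices, not the actual EyePhone implementation.

```python
import cv2

def build_eye_templates(calibration_photos):
    """Hypothetical calibration step: the user photographs one eye indoors
    and outdoors (and ideally at a few distances); each cropped photo
    becomes a grayscale template."""
    templates = []
    for photo in calibration_photos:  # BGR eye crops
        gray = cv2.cvtColor(photo, cv2.COLOR_BGR2GRAY)
        gray = cv2.equalizeHist(gray)  # damp down lighting differences
        templates.append(gray)
    return templates

def find_eye(frame, templates, threshold=0.7):
    """Search a camera frame for the best match against any stored
    template; return its bounding box (x, y, w, h), or None if nothing
    matches confidently enough (e.g. the eye is closed or out of view)."""
    gray = cv2.equalizeHist(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    best_score, best_box = threshold, None
    for tmpl in templates:
        scores = cv2.matchTemplate(gray, tmpl, cv2.TM_CCOEFF_NORMED)
        _, score, _, top_left = cv2.minMaxLoc(scores)
        if score >= best_score:
            h, w = tmpl.shape
            best_score, best_box = score, (top_left[0], top_left[1], w, h)
    return best_box
```

Matching against several templates taken under different conditions is one simple way to get the robustness to lighting and distance the researchers describe; their actual learning method may differ.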
EyePhone runs on a Nokia N810 smart phone. The program tracks the position of the eye relative to the screen, rather than where the person is actually looking. The user moves the phone slightly until an icon sits directly in front of her eye, then selects the application by blinking. The software divides the camera frame into nine regions and looks for the eye in one of them; it also places a virtual “error box” around the eye, and can keep recognizing the eye as long as it doesn’t move outside this box. While the eye-tracking approach is rudimentary, the researchers hope to develop more sophisticated methods soon. The system is at least 76 percent accurate in daylight while the user is standing still, and 60 percent accurate when the person is walking, says Campbell.
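Again as a sketch only: the nine-region lookup, the error box, and blink-based selection described above might be wired together like this, reusing the hypothetical find_eye helper from the previous snippet. The 3x3 grid mapping, the error-box margin, and the blink heuristic (a frame in which the open-eye template stops matching) are illustrative guesses, not the published algorithm.

```python
def region_of(frame_shape, box):
    """Map the eye's bounding-box center to one of nine regions (a 3x3
    grid over the camera frame), mirroring a 3x3 grid of screen icons."""
    frame_h, frame_w = frame_shape[:2]
    cx = box[0] + box[2] // 2
    cy = box[1] + box[3] // 2
    col = min(cx * 3 // frame_w, 2)
    row = min(cy * 3 // frame_h, 2)
    return row * 3 + col  # region index 0..8

def inside_error_box(box, last_box, margin=20):
    """True while the new detection stays within the tolerance ('error')
    box drawn around the previous detection."""
    return (abs(box[0] - last_box[0]) <= margin and
            abs(box[1] - last_box[1]) <= margin)

def track(frames, templates):
    """Yield per-frame events: ('eye', region) while the eye is visible,
    and ('blink', region) when a matched eye briefly disappears."""
    last_box = None
    for frame in frames:
        box = find_eye(frame, templates)
        if box is None:
            # Open-eye template stopped matching: treat it as a candidate
            # blink over the last region the eye was seen in.
            if last_box is not None:
                yield ("blink", region_of(frame.shape, last_box))
                last_box = None
            continue
        if last_box is None or not inside_error_box(box, last_box):
            last_box = box  # eye moved outside the error box: re-anchor
        yield ("eye", region_of(frame.shape, last_box))
```

The error box here serves the purpose the article describes: small jitters of the hand or head are absorbed rather than registered as movement, which matters on a device that is never perfectly still.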
“This is a good step forward,” says Robert Jacob, a professor of computer science at Tufts University who also works on eye tracking. “One of the problems with the cell phone is that there’s no place for the user interface. Eye tracking seems like a very clever idea.”
However, Jacob points out that tracking a user’s actual gaze, rather than just the eye’s position, will be more challenging on a cell phone, because the eye barely moves when a person’s gaze shifts between items that sit close together on a small screen.
Hansen says the Dartmouth work is interesting, but adds, “It’s a hard problem they are facing here, and I expect a lot of work in this area for years to come.”



via Impact Lab
