
Research Team Develops Eye, Hand Tracking Smartphone Tech


Totally hands-free control of a smartphone may soon be viable, as a research team has developed a method for operating a smartphone with eye movements and hand gestures performed in front of its camera.

The method was developed by a research team in the Future Interfaces Group at Carnegie Mellon University's (CMU) Human-Computer Interaction Institute (HCII), which sought a more natural way of interacting with smartphones.

The researchers treated where the eyes are focused on the screen as a "precursor" to an impending action, and developed the EyeMU tool after finding that relying on a second hand or on voice commands could be unwieldy.

The EyeMU tool allows users to perform actions on a smartphone by combining gaze control with hand gestures.

This was presented in a research paper at the International Conference on Multimodal Interaction in 2021.

While gaze tracking and hand-gesture recognition using the front camera are not new, existing use cases are largely limited to accessibility features and gaming.

The lead author of the paper, Andy Kong, said that a phone might be more useful if it could predict what the user wants.

“Current phones only respond when we ask them for things, whether by speech, taps or button clicks,” Kong said, adding, “If the phone is widely used now, imagine how much more useful it would be if we could predict what the user wanted by analyzing gaze or other biometrics.”

Chris Harrison, an associate professor in the HCII and director of the Future Interfaces Group, who participated in the paper, said that while technologies for gaze prediction already exist, including advancements from Apple and Google, "just staring at something alone doesn't get you there."

He added: "The real innovation in this project is the addition of a second modality, such as flicking the phone left or right, combined with gaze prediction. That's what makes it powerful. It seems so obvious in retrospect, but it's a clever idea that makes EyeMU much more intuitive." (GFB)
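The two-modality idea Harrison describes can be sketched in a few lines: gaze alone selects a target but triggers nothing, and an action fires only when a motion gesture confirms intent. This is a minimal illustrative sketch, not the researchers' actual EyeMU implementation; all class names, gesture labels, and bindings below are assumptions made up for the example.

```python
# Hypothetical sketch of an EyeMU-style multimodal trigger. The gaze target
# (from a camera-based gaze estimator) and the gesture (from motion sensors)
# are assumed to arrive as simple strings; real systems would use richer types.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Event:
    gaze_target: Optional[str]  # e.g. "notification", or None if gaze is elsewhere
    gesture: Optional[str]      # e.g. "flick_left", or None if no gesture seen


# Example (gaze target, gesture) -> action bindings, assumed for this demo.
BINDINGS = {
    ("notification", "flick_left"): "dismiss_notification",
    ("notification", "flick_right"): "save_notification",
    ("photo", "pull_closer"): "zoom_photo",
}


def resolve_action(event: Event) -> Optional[str]:
    """Return an action only when gaze and gesture agree on a binding.

    Gaze by itself does nothing, which avoids acting on every stray
    glance; the motion gesture serves as the confirming second modality.
    """
    if event.gaze_target is None or event.gesture is None:
        return None
    return BINDINGS.get((event.gaze_target, event.gesture))


# Gaze alone is ignored; gaze plus a flick fires the bound action.
print(resolve_action(Event("notification", None)))          # None
print(resolve_action(Event("notification", "flick_left")))  # dismiss_notification
```

The design choice mirrored here is that neither channel is trusted on its own: gaze supplies *what* the user means, and the gesture supplies *that* they mean it.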
