
Mar 7, 2022

Simulated human eye movement aims to train metaverse platforms

Posted in categories: augmented reality, computing, virtual reality

Computer engineers at Duke University have developed virtual eyes that simulate how humans look at the world accurately enough for companies to train virtual reality and augmented reality programs. Called EyeSyn for short, the program will help developers create applications for the rapidly expanding metaverse while protecting user data.

The results have been accepted and will be presented at the International Conference on Information Processing in Sensor Networks (IPSN), May 4–6, 2022, a leading annual forum on research in networked sensing and control.

“If you’re interested in detecting whether a person is reading a comic book or advanced literature by looking at their eyes alone, you can do that,” said Maria Gorlatova, the Nortel Networks Assistant Professor of Electrical and Computer Engineering at Duke.
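As a rough illustration of that kind of gaze-based activity recognition, the sketch below trains a toy classifier on synthetic eye-movement traces. It is not EyeSyn's actual pipeline; the trace generator, feature choices, and parameters are all hypothetical, meant only to show how synthetic gaze data could stand in for real recordings when training such a model.

```python
# Hypothetical illustration (not EyeSyn's code): classify reading activity
# from simple gaze features, training on synthetic eye-movement traces.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def synthesize_trace(dense_text: bool, n_samples: int = 500) -> np.ndarray:
    """Generate a toy (x, y) gaze trace; denser text -> shorter saccades.
    Parameters are purely illustrative."""
    saccade_len = 0.02 if dense_text else 0.05   # normalized screen units
    fixation_noise = 0.002
    x = np.cumsum(rng.normal(saccade_len, saccade_len / 2, n_samples))
    y = np.cumsum(rng.normal(0.0, fixation_noise, n_samples))
    return np.column_stack([x % 1.0, y % 1.0])   # wrap at line/screen edges

def features(trace: np.ndarray) -> np.ndarray:
    """Summarize a trace by saccade-velocity statistics."""
    v = np.linalg.norm(np.diff(trace, axis=0), axis=1)
    return np.array([v.mean(), v.std(), np.percentile(v, 90)])

# Labeled synthetic dataset: 0 = sparse text (e.g. comic), 1 = dense text.
X = np.array([features(synthesize_trace(dense))
              for dense in ([False] * 200 + [True] * 200)])
y = np.array([0] * 200 + [1] * 200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("toy accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

The point of the example is only the workflow: because the training data is generated rather than recorded, no real user's gaze ever needs to leave their device, which is the privacy benefit the researchers describe.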
