
Friday, April 01, 2022

Training the Metaverse with Sim Eye Movements

Simulated Human Eye Movement Aims to Train Metaverse Platforms

Duke University Pratt School of Engineering, Ken Kingery, March 7, 2022

Virtual eyes developed by computer engineers at Duke University could be used to train virtual reality and augmented reality programs for the metaverse. The program, called EyeSyn, accurately simulates how humans look at the world while also protecting user data. The researchers drew on the cognitive science literature on how humans see the world and process visual information to develop the program. Tests showed that EyeSyn closely matched the distinct patterns of actual gaze signals and simulated the different ways people's eyes react. EyeSyn eliminates the need for companies building platforms and software in the metaverse to collect data on how people's eyes move during various activities. "If you give EyeSyn a lot of different inputs and run it enough times, you'll create a data set of synthetic eye movements that is large enough to train a (machine learning) classifier for a new program," said Duke's Maria Gorlatova. ... 
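The quoted workflow, generating many synthetic gaze traces and then training a machine-learning classifier on them instead of collecting real eye-tracking data, can be illustrated with a minimal sketch. Everything below is an assumption made for illustration: the activity labels, the fixation and saccade statistics, and the summary features are invented placeholders, not the actual EyeSyn models, which the article says are built from cognitive-science findings about human vision.

    # Minimal sketch (assumed, not EyeSyn): generate synthetic gaze traces for
    # two made-up activities, then train a classifier on the synthetic data only.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    def synth_gaze_trace(mean_fixation_ms, saccade_amp_deg, n_events=50):
        # One synthetic trace: per-event fixation durations (ms) and saccade amplitudes (deg).
        fixations = rng.exponential(mean_fixation_ms, n_events)
        saccades = np.abs(rng.normal(saccade_amp_deg, 1.0, n_events))
        return np.stack([fixations, saccades], axis=1)

    def features(trace):
        # Simple summary statistics used as classifier input.
        return np.concatenate([trace.mean(axis=0), trace.std(axis=0)])

    # Hypothetical activities with different eye-movement statistics:
    # "reading" (shorter fixations, smaller saccades) vs. "video" (longer, larger).
    activities = {"reading": (225, 2.0), "video": (400, 5.0)}

    X, y = [], []
    for label, (fix_ms, amp_deg) in activities.items():
        for _ in range(500):  # "run it enough times": many synthetic traces per activity
            X.append(features(synth_gaze_trace(fix_ms, amp_deg)))
            y.append(label)

    X_train, X_test, y_train, y_test = train_test_split(np.array(X), np.array(y), random_state=0)
    clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
    print("held-out accuracy on synthetic data:", clf.score(X_test, y_test))

The point of the sketch is only the data flow: no real users are recorded, yet the classifier still gets a large labeled training set because the simulator can be run as many times as needed.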
