TALK: Toward Human-Centered XR: Bridging Cognition and Computation
Virtual and Augmented Reality enable unprecedented possibilities for displaying virtual content, sensing physical surroundings, and tracking human behaviors with high fidelity. However, we have yet to create "superhumans" who can outperform their physical-world abilities, or a "perfect" XR system that delivers unlimited battery life and fully realistic sensation.
Prof. Qi Sun of NYU will discuss some of his recent research on leveraging eye and muscular sensing, together with machine learning, to model our perception, reaction, and sensation in virtual environments. The talk takes place Sept. 27 from 10:20 to 11:30 a.m. in the Dreese Labs 395N conference room or via Zoom (https://go.osu.edu/jianchen).
Based on this knowledge, his team creates just-in-time visual content that jointly optimizes human performance (such as reaction speed to events) and system performance (such as reduced display power consumption) in XR.
Qi Sun is an assistant professor at New York University. Before joining NYU, he was a research scientist at Adobe Research. He received his PhD from Stony Brook University. His research interests lie in perceptual computer graphics, VR/AR, computational cognition, and computational displays. He is a recipient of the IEEE Virtual Reality Best Dissertation Award, and his research has been recognized with several best paper and honorable mention awards at ACM SIGGRAPH, IEEE ISMAR, and IEEE VIS.