TALK: Introducing Project Aria: A New Tool for Egocentric Multi-Modal AI Research
Kiran Somasundaram, a Systems Architect at Meta Reality Labs, will speak about egocentric, multi-modal data available on future augmented reality (AR) devices, and the unique challenges and opportunities it presents for machine perception, on Friday, Oct. 6 from 10am to 11am EST in Dreese Laboratories Room 480 and also online via Zoom.
Meeting ID: 982 4773 6070
These future devices will need to be all-day wearable in a socially acceptable form factor to support always-available, context-aware, and personalized AI applications. The team at Meta Reality Labs Research built the Aria device, an egocentric, multi-modal data recording and streaming device, with the goal of fostering and accelerating research in this area.
In this talk, Somasundaram will introduce the Aria device hardware, including its sensor configuration, and the corresponding software tools that enable recording and processing of such data. He will also show live demos of research applications enabled by this device platform.
Kiran Somasundaram is a Systems Architect at Meta Reality Labs developing machine perception technologies to enable all-day wearable AR smart glasses. He received his Ph.D. in Electrical and Computer Engineering from the University of Maryland, College Park, in 2010. Prior to joining Meta, Kiran worked at Qualcomm Research on projects spanning robotics and mobile AR technologies.