Project Aria, LiveMaps & Contextual AI


Research Design / Egocentric Perception / Multi-modal AI / Contextual AI / Robotics



Establishing large-scale egocentric machine perception and real-world data foundations for AI and Robotics, and defining adaptive interaction models for contextual AI assistance in the physical world.

Meta (Reality Labs Research), Redmond

Project Aria is Meta's research platform for egocentric AI: a sensor-rich wearable device designed to capture first-person, multi-modal data at scale. LiveMaps builds on this data to create persistent, shared spatial representations of the world. As Design Lead for Contextual AI, I use Aria glasses to define how AI systems understand and respond to a user's physical environment, activities, and intentions in real time.

Role

Research Design Lead, Project Aria and Contextual AI

Timeline

2019–Present

Scope

Hardware, wearables, egocentric perception, contextual AI, spatial computing, multi-modal AI, robotics

Deliverables

Strategy and vision, interaction models, speculative design, storytelling, prototypes, research presentations

A

About the project

Led research design for Project Aria and LiveMaps, defining interaction paradigms for egocentric machine perception and multi-modal AI systems. Worked across hardware, sensing, spatial computing, and machine perception teams to establish real-world data foundations for AI and Robotics.

Co-authored "Project Aria: A new tool for egocentric multi-modal AI research" (arXiv:2308.13561).

B

Contextual AI

Defining adaptive, personalized systems and interaction models for AI-driven assistance in the physical world. The work explores how AI can understand and respond to a user's environment, activities, and intentions in real time.
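As a thought sketch of this adaptive pattern: fuse a few context signals into a snapshot, then gate proactive assistance on both relevance and user state. Everything below is hypothetical and illustrative; names, signals, and the rule itself are assumptions, not a description of any shipped Meta system.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of one adaptive-assistance pattern.

@dataclass
class ContextSnapshot:
    place_label: str            # from spatial localization, e.g. "kitchen"
    activity: str               # from egocentric activity recognition
    in_conversation: bool       # from voice activity detection
    gaze_target: Optional[str]  # object the user is looking at, if known

def suggest_assistance(ctx: ContextSnapshot) -> Optional[str]:
    """Return a proactive prompt only when it is likely relevant and non-intrusive."""
    if ctx.in_conversation:
        return None  # adaptive systems should yield to human interaction
    if ctx.place_label == "kitchen" and ctx.activity == "cooking":
        if ctx.gaze_target == "oven":
            return "Timer is at 4 minutes. Want me to extend it?"
        return "I can read the next recipe step when you're ready."
    return None

print(suggest_assistance(
    ContextSnapshot("kitchen", "cooking", in_conversation=False, gaze_target="oven")
))
```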

C

The Glasses

Project Aria glasses are sensor-rich wearable devices equipped with RGB cameras, SLAM cameras, eye-tracking cameras, spatialized microphones, a barometer, and connectivity modules. Since the platform's debut in 2020, it has enabled researchers worldwide to advance machine perception and AI.

Project Aria Gen 2 - Sensor layout with 12MP RGB camera, 4x computer vision cameras, 7x spatial mics, eye tracking, health sensor (PPG), and more.
Project Aria Gen 1 - Sensor layout showing RGB camera, SLAM cameras, eye tracking, spatialized microphones, and connectivity modules.
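For researchers, recordings from these sensors land in a single time-aligned VRS file that can be read with the open-source projectaria_tools Python package. The sketch below shows the typical access pattern; the call names follow the published API as I recall it, so verify them against the current documentation, and "sample.vrs" is a placeholder path.

```python
# Minimal sketch of loading one Aria recording with projectaria_tools
# (pip install projectaria-tools). API names recalled from the public
# docs; verify before relying on them.
from projectaria_tools.core import data_provider

provider = data_provider.create_vrs_data_provider("sample.vrs")

# Every sensor (RGB camera, SLAM cameras, eye tracking, microphones, ...)
# is a separate time-stamped stream inside the same VRS container.
for stream_id in provider.get_all_streams():
    print(stream_id, provider.get_label_from_stream_id(stream_id))

# Fetch the first RGB frame by index.
rgb_stream = provider.get_stream_id_from_label("camera-rgb")
image, record = provider.get_image_data_by_index(rgb_stream, 0)
print(image.to_numpy_array().shape, record.capture_timestamp_ns)
```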
D

Videos

Introducing Aria Gen 2 - Next-generation research glasses advancing machine perception, contextual AI, and robotics.
Case Study: Carnegie Mellon University - Using Aria to develop NavCog, an audio wayfinding app for people who are visually impaired.
Project Aria Deep Dive - Design, hardware architecture, and sensor capabilities of the AR research glasses.
Helping People With Memory Loss - Veterans with traumatic brain injuries (TBIs) using Aria Gen 2 for task reminders, conversation recall, and hands-free navigation.
Case Study: Envision - Spatial audio navigation and AI research for people who are blind or have low vision.
E

Resources

F

Detailed case study materials are subject to NDA

If you're interested, ask me for a quick walkthrough.

Other selected case studies.



Meta, Redmond

Research on the Future of Mixed Reality

Research that shaped Passthrough, Workrooms VR, and Travel Mode on Quest 2/3.

Practo, Bangalore

Connected healthcare system for India

Redesigning India's largest healthcare app, unifying 5 product teams into one connected system serving 4M users.

Microsoft HoloLens, Seattle

Future of filmmaking

Creating mixed reality tools that help filmmakers, VFX artists, and production artists externalize creative ideas.