Coda

ROLE

Interaction Designer

Developer

Presentation

TIMELINE

November 2024 - January 2025

CONTEXT

Project advised by Kelin Zhang

Course: Digital Products and AI

Coda is an intelligent tabletop surface that uses computer vision and natural language processing to support hands-on tasks. It provides real-time, adaptive feedback and streamlines workflows, making ideation, learning, and problem-solving more intuitive.

With recent advances in AI's vision and conversational capabilities, I wanted to build an interface embedded in a physical space and observe how it converses and reacts to people. Instead of confining AI to digital interfaces, I explored how it could respond to objects, sounds, and human movement.

Coda combines object recognition, textual understanding, movement, audio, and speech to create an intuitive, multimodal interface that enhances how we interact with our environment. By recognizing objects and interpreting their context, Coda enables hands-free assistance, real-time guidance, and interactive feedback.
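The fusion of modalities described above can be sketched in a few lines. This is a minimal illustration, not Coda's actual implementation: the event types, thresholds, and `fuse` function are all hypothetical stand-ins for how vision and speech signals might be merged into a single context that a language model could turn into guidance.

```python
from dataclasses import dataclass

# Hypothetical event types a multimodal surface might emit;
# names and fields are illustrative, not from the project.
@dataclass
class ObjectEvent:
    label: str        # object recognized on the table
    confidence: float # detector confidence, 0.0 - 1.0

@dataclass
class SpeechEvent:
    transcript: str   # text from speech recognition

def fuse(events):
    """Merge vision and speech events into one context string
    that a language model could turn into task guidance."""
    objects = [e.label for e in events
               if isinstance(e, ObjectEvent) and e.confidence > 0.5]
    speech = " ".join(e.transcript for e in events
                      if isinstance(e, SpeechEvent))
    return (f"Objects on table: {', '.join(objects) or 'none'}. "
            f"User said: {speech or '(silence)'}")

print(fuse([ObjectEvent("screwdriver", 0.92),
            SpeechEvent("what's the next step?")]))
# → Objects on table: screwdriver. User said: what's the next step?
```

In a real system the events would stream from a camera-based detector and a speech recognizer; keeping them as typed events makes it easy to add further modalities, such as gesture, without changing the fusion step.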

In short, Coda is an AI-powered smart surface that perceives its surroundings through vision and sound, delivering intelligent, real-time task assistance. It imagines AI embedded in hardware, integrating computing into daily interactions rather than keeping it behind a screen.


Overview