Playground
This is a collection of work samples and experiments that reflect how I approach design: quick, scrappy, and hands-on. Each one explores different tools and media, from physical computing to interactive storytelling.
BreakoutMe
Context
This project came from a small observation — the red, green, and blue subpixel layout of LCD displays looks a lot like the bricks in Atari's classic Breakout game. That spark led to a playful twist on the original: a game where the blocks are made from your own pixelated webcam image.
Solution
BreakoutMe is an interactive game that turns the player's live image into RGB-colored blocks. Using a webcam, the system captures the player's photo and breaks each pixel down into red, green, and blue rectangles, mirroring the subpixel layout of the screen itself. These become the bricks the player has to clear, putting a literal piece of the player into the gameplay.
Players control the paddle with a physical knob and can change the resolution of the pixelated image by moving their hand closer to or farther from the webcam. Once the game is completed, the original (unpixelated) image is revealed: a small reward that ties the whole experience together.
Prototyping Process
I built the breakout mechanic using p5.js, then modified it so each block represents a pixel from the live webcam feed.
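Here's a minimal sketch of that idea, assuming a square sampling grid and a three-stripe subpixel split per sample; the constants (like `blockSize`) are illustrative, not the project's actual values:

```javascript
// p5.js: sample the webcam at a coarse grid, then split each sample into
// red, green, and blue stripes, echoing an LCD's subpixel layout.
let cam;
const blockSize = 20; // one sampled pixel becomes one 3-stripe brick group

function setup() {
  createCanvas(640, 480);
  cam = createCapture(VIDEO);
  cam.size(width / blockSize, height / blockSize); // coarse capture = pixelation
  cam.hide();
  noStroke();
}

function draw() {
  background(0);
  cam.loadPixels();
  for (let y = 0; y < cam.height; y++) {
    for (let x = 0; x < cam.width; x++) {
      const i = 4 * (y * cam.width + x); // RGBA index into the pixel array
      const w = blockSize / 3;
      fill(cam.pixels[i], 0, 0);     rect(x * blockSize,         y * blockSize, w, blockSize);
      fill(0, cam.pixels[i + 1], 0); rect(x * blockSize + w,     y * blockSize, w, blockSize);
      fill(0, 0, cam.pixels[i + 2]); rect(x * blockSize + 2 * w, y * blockSize, w, blockSize);
    }
  }
}
```

In the actual game, each of those stripes would be a destructible brick rather than a rectangle redrawn every frame.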
To capture analog input, I connected a potentiometer to a Circuit Playground Express (CPX), which sent readings over a serial connection. Since a p5.js sketch running in the browser can't read from serial ports directly, I wrote a simple Node.js server to read the serial input and pass it to p5.js via WebSocket.
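The bridge can stay very small. Here's a sketch of that kind of server, assuming the `serialport` and `ws` packages; the port path, baud rate, and WebSocket port are placeholders:

```javascript
// Node.js: read potentiometer values from the CPX over serial and
// rebroadcast each reading to the p5.js sketch over a WebSocket.
// Install with: npm install serialport ws
const { SerialPort } = require('serialport');
const { ReadlineParser } = require('@serialport/parser-readline');
const { WebSocketServer, WebSocket } = require('ws');

const port = new SerialPort({ path: '/dev/tty.usbmodem14101', baudRate: 115200 });
const parser = port.pipe(new ReadlineParser({ delimiter: '\n' }));
const wss = new WebSocketServer({ port: 8080 });

parser.on('data', (line) => {
  const value = parseInt(line.trim(), 10); // e.g. a raw analog reading
  if (Number.isNaN(value)) return;
  for (const client of wss.clients) {
    if (client.readyState === WebSocket.OPEN) client.send(String(value));
  }
});
```

On the p5.js side, a plain `new WebSocket('ws://localhost:8080')` listener can then map each incoming reading to the paddle's x position.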
For the hand tracking, I used an ml5.js hand-tracking model to estimate the player's hand distance and scale the block size in real time, letting players interact through motion alone.
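A rough sketch of how that can work, assuming ml5.js's current handPose API (the project may have used an earlier ml5 version) and using apparent hand size as a stand-in for distance; the mapping constants are guesses:

```javascript
// p5.js + ml5.js: a hand closer to the camera looks bigger, so the apparent
// wrist-to-fingertip distance is a usable proxy for physical distance.
let handPose, video;
let blockSize = 20; // pixelation level driven by the hand (name is mine)

function preload() {
  handPose = ml5.handPose();
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  handPose.detectStart(video, gotHands);
}

function gotHands(results) {
  if (results.length === 0) return;
  const k = results[0].keypoints;
  const apparent = dist(k[0].x, k[0].y, k[12].x, k[12].y); // wrist to middle fingertip
  // Closer hand -> larger apparent size -> coarser (bigger) blocks.
  blockSize = constrain(map(apparent, 50, 250, 8, 48), 8, 48);
}

function draw() {
  background(0);
  image(video, 0, 0);
  fill(255);
  text(`blockSize: ${round(blockSize)}`, 10, 20);
}
```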
Eye Contact Camera
Context
After COVID-19, virtual interviews became the default. But something always felt off. I found it awkward that I couldn't look participants in the eye: my webcam sat above the screen, and eye contact was always a little misaligned. I wanted to build a device that made remote interviews feel more natural, helping both researchers and participants regain a bit of the social presence that got lost in the shift to online.
Solution
This device makes remote interviews feel more natural by restoring eye contact. When the researcher looks into the device, they can speak to the participant while appearing to look directly at them.
A camera behind the device captures the researcher's image and sends it to Zoom. At the same time, the researcher sees a reflected image of the participant, allowing face-to-face eye contact, even through a screen.
The setup was shown to other HCI researchers and received positive feedback. In early testing, several researchers noted how natural it felt to "look down at a script, then look up at the participant," just like in an in-person interview.
Prototyping Process
The core of the device is a one-way mirror: glass that transmits part of the light and reflects the rest, so the dimmer side can see through while the brighter side sees a reflection. A camera behind the mirror captures the researcher's image, while a secondary monitor below the mirror displays the participant's video feed, which reflects back up to the researcher.
I used an HDMI processor to mirror-flip the video feed so the reflection reads correctly, and connected the camera output to Zoom through a video capture device. Getting the angles right was key: I calculated the spatial relationship between the screen, mirror, and camera so that the participant's face would land on the researcher's line of sight.
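At heart that calculation is plain mirror geometry: a point on the lower monitor has a virtual image reflected across the mirror plane, and the camera's line of sight should pass through that image. A self-contained check of the idea (my sketch, not the project's code), with vectors as [x, y, z] in centimeters:

```javascript
// Reflect a point p across the plane through `origin` with unit normal `n`:
//   p' = p - 2 * ((p - origin) . n) * n
const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const sub = (a, b) => [a[0] - b[0], a[1] - b[1], a[2] - b[2]];
const scale = (a, s) => [a[0] * s, a[1] * s, a[2] * s];

function reflectAcrossPlane(p, origin, n) {
  const d = dot(sub(p, origin), n);
  return sub(p, scale(n, 2 * d));
}

// Example: mirror tilted 45 degrees, monitor point 20 cm below its center.
const mirrorNormal = [0, Math.SQRT1_2, Math.SQRT1_2]; // unit length
console.log(reflectAcrossPlane([0, -20, 0], [0, 0, 0], mirrorNormal));
// -> [0, 0, 20]: the virtual image sits 20 cm behind the mirror,
//    which is where the camera's line of sight should aim.
```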
Lunatic
Context
Lunatic tells the story of a moon enthusiast who falls in love with the moon and sets out to explore it. The goal was to build a storytelling device that felt surreal: something that blurred the boundary between physical space and imagination.
The story and visual style were led by Sichen Liu, our lead artist. I handled the technical design and prototyped the physical device.
Solution
We built a small storytelling machine that blended fiction with reality using light, sound, and interactive buttons. As the audience explores Lunatic’s hazy memories, the experience shifts between poetic and informational, like paging through a dream you "sort of" remember.
Prototyping Process
The machine featured a mini figure, a TFT display, and a control box. We used a prism to refract the video image, creating the blended-reality effect the artist envisioned.
I chose the ESP32 for its compact size and stronger media support compared to a standard Arduino board. Several buttons were wired to the board; each press triggered a state change that updated the video playback, sound effects, and lighting in real time, roughly as sketched below. The main story content was stored locally as video files.
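The firmware itself ran as C++ on the ESP32, but the state logic is easy to sketch in a few lines of hardware-free JavaScript; the state names, media files, and light patterns below are illustrative, not the actual story content:

```javascript
// Each button press advances or branches the story; the new state picks a
// video file, a sound cue, and a light pattern together.
const states = {
  idle:    { video: 'intro.mp4', sound: 'hum.wav',   light: 'dim',   next: { A: 'memory1' } },
  memory1: { video: 'moon1.mp4', sound: 'waves.wav', light: 'pulse', next: { A: 'memory2', B: 'idle' } },
  memory2: { video: 'moon2.mp4', sound: 'wind.wav',  light: 'warm',  next: { A: 'idle' } },
};

let current = 'idle';

function pressButton(button) {
  const next = states[current].next[button];
  if (!next) return; // ignore buttons that don't apply in this state
  current = next;
  const { video, sound, light } = states[current];
  console.log(`play ${video}, cue ${sound}, lights -> ${light}`);
}

pressButton('A'); // idle -> memory1
pressButton('A'); // memory1 -> memory2
```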