This project uses facial recognition technology to monitor the user's eye movements through their computer's camera. Each time the user blinks, the system generates a new daily headline from the New York Times and starts a timer that measures how long the user spends reading it. That reading time shapes the next interaction: the longer the reading time, the more intense and noisy the sound generated on the user's next blink. The use of consumer-grade equipment to track involuntary body movements raises questions about vulnerability, privacy, hyper-stimulation, and surveillance in our heavily media-influenced world.

In a time when optimization, efficiency, and headlines dominate the collective consciousness, questions quickly arise: Does hyper-optimization come with diminishing, if not negative, returns? How should we deliver the news, and what are the ethical concerns with the 24-hour news cycle? What are the implications of living in a society that constantly produces new headlines, each seemingly more dramatic than the last? Are there alternatives to click-bait journalism?
Technology: JavaScript, Machine Learning (MediaPipe), Tone.js, CSS, HTML
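The blink-to-sound feedback loop described above can be sketched roughly as follows. This is a minimal illustration of the interaction logic only; the function names, timing window, and volume range are assumptions, not the project's actual code, and the MediaPipe blink detection and Tone.js playback are indicated in comments rather than invoked.

```javascript
// Map how long the user spent reading the previous headline to a noise
// intensity between 0 and 1: longer reading time → louder, harsher sound.
// The 30-second ceiling is an assumed value for illustration.
function readingTimeToIntensity(readingMs, maxMs = 30000) {
  return Math.min(readingMs / maxMs, 1);
}

let headlineShownAt = null;

// Hypothetical handler, called on each blink detected (e.g. from a
// MediaPipe face-landmark callback in the real piece).
function onBlink(now = Date.now()) {
  let intensity = 0;
  if (headlineShownAt !== null) {
    intensity = readingTimeToIntensity(now - headlineShownAt);
    // In the piece, a Tone.js noise source would be scaled here, e.g.:
    // noise.volume.value = -40 + intensity * 40; noise.start();
  }
  // A new headline is fetched and the reading timer restarts.
  headlineShownAt = now;
  return intensity;
}
```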
Variations on Noise is a video essay that explores the notion of noise and its relationship to the human body. The piece explores simple typographic forms while the human body drives custom software that scores the work. Variations on Noise is heavily influenced by texts such as Micah Silver's Figures of Air, Damon Krukowski's Ways of Hearing, and Jacques Attali's Noise: The Political Economy of Music.

→ Variations on Noise
A central source for the video essay was archival footage, selected from the Prelinger Archives and the UCLA Film & Television Archive. The archival research included film studies of sound, scientific videos on audio and recording techniques, and informational films on the military and noise pollution. The archival footage helped inform both the text and the aesthetic forms.
Technology: TouchDesigner, After Effects, Ableton Live
Turbulence is a collection of data from Hurricane Harvey and the tragedies that surrounded the region of Houston, Texas. The project analyzed data from the hurricane, including weather patterns such as wind speed, wind direction, and rainfall, collected from the National Oceanic and Atmospheric Administration and the National Hurricane Center. From this data, we created software that translates the information into MIDI notation to trigger both analog and digital synthesizers, creating an electronic symphony of sound. The sound was then used as a feedback mechanism and processed through projection-mapped visualization software. The project premiered at The Broad Art Center in Santa Monica, California, and received the John Baldessari Family Foundation Award.
Technology: Processing, Ableton Live, Max/MSP
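The data-to-MIDI translation described above can be sketched as a simple range mapping. This is written in JavaScript for illustration only (the installation itself used Processing and Max/MSP), and the field names, input ranges, and channel/pitch assignments are assumptions, not the project's actual mapping.

```javascript
// Linearly map a value from one range to another, clamped to the output range.
function scale(value, inMin, inMax, outMin, outMax) {
  const t = Math.min(Math.max((value - inMin) / (inMax - inMin), 0), 1);
  return outMin + t * (outMax - outMin);
}

// Turn one weather observation into a MIDI note event (assumed mapping):
// wind speed chooses pitch, rainfall drives velocity, wind direction
// spreads events across the 16 MIDI channels.
function observationToMidi(obs) {
  return {
    note: Math.round(scale(obs.windSpeedKt, 0, 150, 36, 96)),   // ~C2–C7
    velocity: Math.round(scale(obs.rainfallMm, 0, 100, 20, 127)),
    channel: Math.floor(scale(obs.windDirDeg, 0, 360, 0, 15.999)),
  };
}
```

A stream of such events can then be sent to any synthesizer that accepts MIDI input.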
The Talent House is a music school located in Cypress, Texas. After meeting with Trey Willis, the owner and founder, I developed a minimal identity, along with interactive web components that reflect the experimentation and freedom the school encourages. The website features multiple areas where users can play live instruments, as well as a pop-out piano. Website made in collaboration with Gabriel Drozdov.

→ The Talent House
Along with digital assets, I created physical objects for the full identity of the school. This included clothing, business cards, letterheads, and manufactured signage for the physical location.  
On Noise is a video essay that explores the aesthetics of noise and its relationship to mass production. The project uses custom software to drive typography from incoming synthesizer data. Through this technique, the project uses noise and rhythm to tell a narrative. The video essay was featured in the Sol Koffler Gallery as part of the RISD Graphic Design MFA Biennial 2023.

→ On Noise
Technology: TouchDesigner, After Effects, Ableton Live
Notes on the Mechanical Eye is a step sequencer and generative typographic composition tool made for the browser using JavaScript and Tone.js. The project draws on Dziga Vertov's words on the mechanical eye: the camera's introduction of a new way of seeing, and how subjectivity changed with the arrival of the moving image. The tool is tuned to a pentatonic scale, producing generative music compositions as well as typographic forms.

→ Notes on the Mechanical Eye
This project was made at the Rhode Island School of Design under the guidance of Christopher and Kathleen Sleboda.
Recordings from website outputs
Technology: JavaScript, Tone.js, CSS, HTML
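The pentatonic tuning described above can be sketched as a small lookup: each sequencer step is snapped to a scale degree, so any pattern the tool generates stays consonant. This is an illustrative sketch, assuming a C major pentatonic scale; the project's actual scale, octave range, and playback code in Tone.js may differ.

```javascript
// C major pentatonic: C, D, E, G, A, as semitone offsets within an octave.
const PENTATONIC = [0, 2, 4, 7, 9];

const NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
                    "F#", "G", "G#", "A", "A#", "B"];

// Snap a sequencer step index to a note name that Tone.js can play,
// wrapping up an octave every five steps, e.g.:
//   new Tone.Synth().toDestination().triggerAttackRelease(stepToNote(3), "8n");
function stepToNote(step, baseOctave = 3) {
  const degree = PENTATONIC[step % PENTATONIC.length];
  const octave = baseOctave + Math.floor(step / PENTATONIC.length);
  return NOTE_NAMES[degree] + octave;
}
```

Because every step resolves to a pentatonic degree, random or generative step patterns produce music without dissonant intervals, which is what makes this a common tuning choice for generative browser instruments.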