iain nash

iain’s interest in interactive media started when he worked as a researcher with the mobile and environmental media lab on intelligent meeting rooms and lighting design. he worked with behnaz farahi as a technical consultant on living, breathing walls and various other pieces.

iain has used various embedded development platforms such as arduino boards, esp32 chips, the raspberry pi, and addressable led chipsets of all kinds to create interactive, modular IoT solutions for homes and installations

at usc, iain combined his interest in the film photography process with the digital screen and backlight, creating photograms using the internals of an LCD panel and laptop keyboard backlights

in recent years, iain has worked closely with artists to build highly immersive works, experimenting in fields ranging from data collection and signal processing to responsive lighting and biofeedback. since 2018 he has produced visuals and lighting for over six events with phake.fun.

a sampling of technologies iain uses to make these experiences: MIDI, DMX, TouchDesigner, WebGL, OpenFrameworks, ESP32, Arduino/Wiring, WebSockets, OpenNI / Kinect, Lidar, Bluetooth, and Processing

interactive

2014

dome

in collaboration with special events committee
dmx lighting, sourcing, production

lighting design and production for a geodesic dome side stage for springfest 2015

see install photo
2015

cube cubed

in collaboration with alex zhang
dmx, python, audio analysis, fabrication, led power and control

built a series of three cubes sized 16', 5', and 2' respectively, illuminated with computer-controlled LED and incandescent lights that responded interactively to the nearby concert
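the core of an audio-reactive lighting rig like this is a loop that measures the loudness of each incoming audio frame and maps it onto a dimmer channel. a minimal python sketch of that mapping (the rms thresholds and the 0-255 dmx scaling here are illustrative assumptions, not the actual show values):

```python
import math

def rms(frame):
    """root-mean-square amplitude of a list of samples in [-1.0, 1.0]."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def rms_to_dmx(level, floor=0.01, ceiling=0.5):
    """map an rms level onto a 0-255 dmx dimmer value, clamped at both ends.
    floor/ceiling are hypothetical calibration points."""
    norm = (level - floor) / (ceiling - floor)
    return max(0, min(255, int(norm * 255)))

# silence stays dark, a loud frame drives the channel to full
quiet = [0.0] * 512
loud = [0.5 if i % 2 == 0 else -0.5 for i in range(512)]
print(rms_to_dmx(rms(quiet)), rms_to_dmx(rms(loud)))  # 0 255
```

in practice the resulting byte would be written into the dmx universe for each cube's dimmer channel on every audio frame.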

view more...
2015

digital dust: the afterlife of forgotten images

web apis, webgl, three.js, street view api

digital dust is a webgl experience for both mobile and desktop that immerses the user in a 360-degree view of a distorted street, overlaid with a virtual layer of digital metadata from that location gathered from the instagram, flickr, and foursquare APIs

view project repo on github
2017

sleeping brain

in collaboration with hackathon team
opengl, touchdesigner, osc, glsl, python

built a visualization of the brain during sleep with opengl shaders and touchdesigner

view installation overview
2018

messages

in collaboration with luke cheng, yun liao
json app scraping, jupyter notebooks, xml template file generation

worked with artists to scrape and typeset thousands of pages of facebook messages
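the typesetting pipeline for a project like this boils down to flattening a message export into lines and cutting those lines into fixed-size pages that a template can consume. a hedged python sketch (the `sender_name`/`content` keys follow facebook's json export layout; the page size is an arbitrary assumption):

```python
import json

def messages_to_pages(raw_json, per_page=40):
    """flatten a facebook-style json export into fixed-size pages of lines."""
    data = json.loads(raw_json)
    lines = [f"{m['sender_name']}: {m.get('content', '')}"
             for m in data.get("messages", [])]
    # slice the flat line list into per_page chunks, one chunk per printed page
    return [lines[i:i + per_page] for i in range(0, len(lines), per_page)]

raw = json.dumps({"messages": [
    {"sender_name": "luke", "content": "hey"},
    {"sender_name": "yun", "content": "hi!"},
]})
pages = messages_to_pages(raw, per_page=1)
print(len(pages))  # 2
```

each page chunk would then be poured into an xml template file for layout.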

view installation overview
2018

antepyrosis & contrascientia

in collaboration with elektra jiang
arduino, audio synthesis, esp32, addressable led control, ultrasonic and motion feedback sensing

microcontroller system design and installation for live graphical animations created by elektra, shown on oscilloscopes as images and waves for caldera
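oscilloscope graphics of this kind rely on driving the scope in x-y mode: one audio channel deflects the beam horizontally, the other vertically, so a synthesized stereo waveform traces a picture. a small python sketch of the idea (sample count and amplitude are arbitrary assumptions):

```python
import math

def circle_frames(n=1000, radius=1.0):
    """stereo sample pairs that trace a circle on a scope in x-y mode:
    the left channel drives the x axis, the right channel the y axis."""
    return [(radius * math.cos(2 * math.pi * i / n),
             radius * math.sin(2 * math.pi * i / n)) for i in range(n)]

frames = circle_frames()
# every sample pair sits on the unit circle
print(all(abs(x * x + y * y - 1.0) < 1e-9 for x, y in frames))  # True
```

played out of a stereo dac fast enough, the beam retraces the circle continuously and it appears as a solid shape.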

see code on github
2018

pulse

in collaboration with lenny zhu
midi, dmx, ir heart rate, arduino, synthesis

pulse is an immersive installation that mediates in-person connection and intimacy through a guided biofeedback experience. immersive lighting and a synthesized rendering of a partner’s heartbeat in a comfortable, small space create a personal meditation environment.
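turning an ir heart-rate sensor into something a synthesizer can follow means converting the timestamps of detected pulse peaks into a tempo. a minimal python sketch of that step (peak detection itself is assumed to have already happened upstream):

```python
def bpm_from_intervals(beat_times):
    """estimate beats-per-minute from timestamps (seconds) of detected
    ir pulse peaks by averaging the gaps between consecutive beats."""
    gaps = [b - a for a, b in zip(beat_times, beat_times[1:])]
    return 60.0 / (sum(gaps) / len(gaps))

# beats one second apart -> 60 bpm
print(bpm_from_intervals([0.0, 1.0, 2.0, 3.0]))  # 60.0
```

the resulting bpm could then drive the midi clock for the synthesized heartbeat and the dmx pulse of the lighting.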

view more...
2019

audio topologies

in collaboration with dmitri cherniak
websockets, serial, pwm, installation design, structural rigging

an installation for nycXdesign at colony: 120 computer-controlled fans, interconnected in a ceiling grid, responded to a custom soundtrack made for the exhibition. speed-controllable computer fans and a custom wireless driver built from a web app and microcontrollers ran the installation.
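a driver for a grid like this ultimately needs to serialize one speed per fan into a compact frame the microcontrollers can unpack into pwm duty cycles. a hedged python sketch of that packing step (the 10x12 grid shape and one-byte-per-fan encoding are illustrative assumptions, not the actual wire format):

```python
def pack_fan_frame(speeds, rows=10, cols=12):
    """pack a rows x cols grid of fan speeds (0.0-1.0) into one byte per
    fan, row-major, ready to send as a binary websocket frame."""
    assert len(speeds) == rows * cols
    return bytes(max(0, min(255, int(s * 255))) for s in speeds)

frame = pack_fan_frame([0.5] * 120)
print(len(frame), frame[0])  # 120 127
```

on the receiving side each microcontroller would slice out its fans' bytes and write them straight to its pwm outputs.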

more info
2019

#dateme off broadway

in collaboration with sam hines
websockets, expo, react native

built a mobile application with a team for #dateme off broadway, working as a consultant to the show alongside a designer and backend engineer. the application lets participants create live dating profiles featured throughout the show, with a swiping interface and scripted bots that message participants with notifications during the performance; profiles made in the app become part of the show

view more...
2019

spatial audio archaeologies performance

in collaboration with chengcheng zhao
lidar, web audio, performance

we created a piece that virtually triggered audio in an empty space, telling the space’s past histories through sound alone. we mapped the same spaces to different sounds from our own lives and premiered a performance moving through the space, triggering our own sounds until a symphony of all of them played together as the piece came to a close. participants were invited to explore the soundscape themselves afterwards.
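the triggering logic behind a piece like this is a hit test: the lidar tracks a performer's position, and whenever that point falls inside a mapped region the region's sound plays. a minimal python sketch, assuming circular zones (the zone names and radii here are made up for illustration):

```python
import math

def zones_hit(point, zones):
    """return the names of circular audio zones a tracked (x, y) lidar
    point currently falls inside. zones maps name -> (cx, cy, radius)."""
    px, py = point
    return [name for name, (zx, zy, r) in zones.items()
            if math.hypot(px - zx, py - zy) <= r]

zones = {"doorway": (0.0, 0.0, 1.0), "stage": (5.0, 5.0, 2.0)}
print(zones_hit((0.5, 0.5), zones))  # ['doorway']
```

each returned zone name would start (or keep alive) the corresponding web audio source for that part of the space.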

view more...
2019

data shift: sensations

4-channel audio interface, 3.5mm headphone connectors, receipt printer, LED matrix displays, LCD display, bass actuator and amplifier

this piece is a reflection on how data exists and is transferred in systems and devices: experience binary data through tactile, audio, and visual interactions. load the interface on your device, disconnect from all networks, connect to our network through sound, and compose a message to experience it in different ways. watch your message make a physical change and see the space your virtual interactions take up in the world.
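sending a message "through sound" usually means keying each bit of the data to an audible frequency. a toy python sketch of that encoding step (the two tone frequencies and msb-first ordering are illustrative assumptions, not the piece's actual scheme):

```python
def byte_to_tones(value, f0=1000.0, f1=2000.0):
    """map one byte, msb first, to one tone per bit: f0 for a 0 bit,
    f1 for a 1 bit -- a toy frequency-shift-keying scheme."""
    return [f1 if (value >> bit) & 1 else f0 for bit in range(7, -1, -1)]

print(byte_to_tones(0b10100001))
# [2000.0, 1000.0, 2000.0, 1000.0, 1000.0, 1000.0, 1000.0, 2000.0]
```

the speaker plays each tone for a fixed slot, and the listening device runs the mapping in reverse to recover the bytes.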

view more...
2020

handwash custom hardware

atmega, pcb design, circuit design, firmware

people have a hard time estimating 20 seconds. this device is a simple animated starfish with lights and a tune that runs for 20 seconds when you wave in front of it within 5-6 inches, timing your hand washing. it is battery powered and mounts next to your sink on the mirror or the wall with simple removable adhesive. the pcb and circuit are custom built to be power efficient and safe.
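the firmware behind a device like this is a small state machine: a proximity trigger starts a fixed 20-second run, retriggers during the run are ignored, and the animation stops itself when time is up. a python model of that loop logic (the method names and tick structure are illustrative, not the actual atmega firmware):

```python
class HandwashTimer:
    """toy model of the firmware loop: a proximity trigger starts a
    20-second animation; re-triggers during a run are ignored."""
    DURATION = 20.0

    def __init__(self):
        self.started_at = None  # None means the animation is idle

    def update(self, now, hand_detected):
        """call on every loop tick; returns True while the animation runs."""
        if self.started_at is not None and now - self.started_at >= self.DURATION:
            self.started_at = None       # animation finished
        if self.started_at is None and hand_detected:
            self.started_at = now        # start a fresh 20 s run
        return self.started_at is not None

t = HandwashTimer()
print(t.update(0.0, True), t.update(10.0, False), t.update(21.0, False))
# True True False
```

keeping the timer non-blocking like this lets the same loop also step the led animation and play the tune while counting down.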

view more...