iain nash

iain's interest in interactive media started when he worked as a researcher with the mobile and environmental media lab on intelligent meeting rooms and lighting design. he worked with behnaz farahi on living, breathing walls and on various other pieces as a technical consultant.

iain has used embedded development platforms such as arduinos, esp32 chips, and raspberry pis, along with addressable led chipsets of all kinds, to create interactive, modular IoT solutions for homes and installations

at usc, iain combined his interest in the film photography process with digital screens and backlights, creating photograms using the internals of an LCD panel and laptop keyboard backlights

in recent years, iain has worked closely with artists to build highly immersive works, experimenting in fields ranging from data collection and signal processing to responsive lighting and biofeedback. in 2018 he produced visuals for over six events with phake.fun.

a sampling of technologies iain uses to make these experiences: MIDI, DMX, TouchDesigner, WebGL, OpenFrameworks, ESP32, Arduino/Wiring, WebSockets, OpenNI / Kinect, Lidar, Bluetooth, and Processing

interactive

2014

dome

in collaboration with special events committee
dmx lighting, sourcing, production

lighting design and production for a geodesic dome side stage at springfest 2015

see install photo
2015

cube cubed

in collaboration with alex zhang
dmx, python, audio analysis, fabrication, led power and control

built a series of three cubes, 16', 5', and 2' respectively, illuminated by computer-controlled LED and incandescent lights reacting to the concert nearby
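
a minimal python sketch of the kind of audio-to-light loop behind the cubes: microphone loudness becomes dimmer levels. send_dmx() and the channel numbers are hypothetical stand-ins for the actual DMX interface and patch.

```python
import numpy as np
import sounddevice as sd

UNIVERSE = bytearray(512)      # one DMX universe, channels 1-512
CUBE_CHANNELS = [1, 2, 3]      # hypothetical dimmer channels for the 16', 5', and 2' cubes

def send_dmx(universe: bytearray) -> None:
    # hypothetical stand-in: push the 512-byte frame to the actual DMX interface here
    print("frame:", list(universe[:3]))

def audio_callback(indata, frames, time, status):
    # RMS loudness of the current audio block, scaled to a 0-255 dimmer value
    rms = float(np.sqrt(np.mean(indata ** 2)))
    level = int(min(rms * 4.0, 1.0) * 255)
    for offset, channel in enumerate(CUBE_CHANNELS):
        # stagger the cubes slightly so they don't all pulse identically
        UNIVERSE[channel - 1] = max(0, level - offset * 30)
    send_dmx(UNIVERSE)

# listen to the room and update the lights for one minute
with sd.InputStream(channels=1, samplerate=44100, blocksize=2048,
                    callback=audio_callback):
    sd.sleep(60_000)
```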

see install photo
2015

digital dust: the afterlife of forgotten images

web apis, webgl, three.js, street view api

digital dust is a webgl experience for both mobile and desktop that immerses the user in a 360-degree view of a distorted street, overlaid with a virtual layer of digital metadata about that location gathered from a combination of the instagram, flickr, and foursquare APIs
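
a rough sketch of the metadata layer behind the piece: geotagged items from several services merged into one json file for the webgl front end to superimpose. the fetch_* functions are hypothetical placeholders, not the real instagram/flickr/foursquare calls.

```python
import json

def fetch_instagram(lat, lon):
    return []   # hypothetical: query the instagram API for media near (lat, lon)

def fetch_flickr(lat, lon):
    return []   # hypothetical: query the flickr API for photos near (lat, lon)

def fetch_foursquare(lat, lon):
    return []   # hypothetical: query the foursquare API for venues near (lat, lon)

def build_layer(lat, lon):
    # normalize every item to {source, text, lat, lon} so the webgl client can
    # position it in the 360-degree scene without caring where it came from
    layer = []
    sources = [("instagram", fetch_instagram),
               ("flickr", fetch_flickr),
               ("foursquare", fetch_foursquare)]
    for name, fetch in sources:
        for item in fetch(lat, lon):
            layer.append({"source": name,
                          "text": item.get("text", ""),
                          "lat": item.get("lat", lat),
                          "lon": item.get("lon", lon)})
    return layer

if __name__ == "__main__":
    # coordinates are an arbitrary example location
    with open("metadata_layer.json", "w") as f:
        json.dump(build_layer(34.0205, -118.2856), f, indent=2)
```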

view project repo on github
2017

sleeping brain

in collaboration with a hackathon team
opengl, touchdesigner, osc, glsl, python

built a visualization of the brain during sleep with opengl shaders and touchdesigner
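
a small sketch of how values can be streamed into touchdesigner over OSC using the python-osc package; the /sleep/* addresses, port, and synthetic band powers are illustrative assumptions, not the actual data pipeline.

```python
import math
import time
from pythonosc.udp_client import SimpleUDPClient

# touchdesigner listens with an OSC In CHOP/DAT; the port number is an assumption
client = SimpleUDPClient("127.0.0.1", 9000)

t = 0.0
while t < 10.0:
    # synthetic stand-in band powers; the installation used the actual sleep data
    alpha = 0.5 + 0.5 * math.sin(t * 0.8)
    delta = 0.5 + 0.5 * math.sin(t * 0.2 + 1.0)
    client.send_message("/sleep/alpha", alpha)   # mapped onto shader uniforms in TD
    client.send_message("/sleep/delta", delta)
    time.sleep(1 / 30)                           # ~30 updates per second
    t += 1 / 30
```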

view installation overview
2018

messages

in collaboration with luke cheng, yun liao
json app scraping, jupyter notebooks, xml template file generation

worked with artists to scrape and typeset thousands of pages of facebook messages
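
a sketch of the scrape-and-typeset pipeline, assuming facebook's json export layout (a "messages" list with sender_name / timestamp_ms / content); the page/msg xml schema and page size are made-up stand-ins for the real template format.

```python
import json
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

MESSAGES_PER_PAGE = 40   # hypothetical page size for the printed layout

def load_messages(path):
    with open(path, encoding="utf-8") as f:
        return json.load(f).get("messages", [])

def write_pages(messages, prefix="page"):
    # chunk the conversation into fixed-size pages and write one XML file per page
    for i in range(0, len(messages), MESSAGES_PER_PAGE):
        number = i // MESSAGES_PER_PAGE + 1
        page = ET.Element("page", number=str(number))
        for m in messages[i:i + MESSAGES_PER_PAGE]:
            ts = datetime.fromtimestamp(m["timestamp_ms"] / 1000, tz=timezone.utc)
            msg = ET.SubElement(page, "msg",
                                sender=m.get("sender_name", "unknown"),
                                time=ts.isoformat())
            msg.text = m.get("content", "")
        ET.ElementTree(page).write(f"{prefix}_{number:04d}.xml",
                                   encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    write_pages(load_messages("message_1.json"))
```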

view installation overview
2018

antepyrosis & contrascientia

in collaboration with elektra jiang
arduino, audio synthesis, esp32, addressable led control, ultrasonic and motion feedback sensing

microcontroller system design and installation for live graphical animations created by elektra, shown on oscilloscopes as images and waves, for caldera
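
oscilloscope drawing relies on XY mode: one channel drives the horizontal axis and the other the vertical, so a synthesized stereo signal traces shapes on screen. the installation did this live on microcontrollers and synth hardware; this numpy sketch just writes a lissajous test figure to a WAV file.

```python
import wave
import numpy as np

SAMPLE_RATE = 48000
SECONDS = 5

t = np.arange(SECONDS * SAMPLE_RATE) / SAMPLE_RATE
x = np.sin(2 * np.pi * 220 * t)                 # left channel -> horizontal deflection
y = np.sin(2 * np.pi * 330 * t + np.pi / 2)     # right channel -> vertical deflection

# interleave as 16-bit stereo PCM and write a WAV the scope can be fed with
pcm = (np.stack([x, y], axis=1) * 32767).astype(np.int16)
with wave.open("lissajous.wav", "wb") as w:
    w.setnchannels(2)
    w.setsampwidth(2)
    w.setframerate(SAMPLE_RATE)
    w.writeframes(pcm.tobytes())
```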

see code on github
2018

pulse

in collaboration with lenny zhu
midi, dmx, ir heart rate, arduino, synthesis

built the technical system for an immersive biofeedback experience
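
a minimal sketch of the biofeedback mapping: a heart-rate reading paces and scales MIDI notes (the installation also drove DMX). read_bpm() is a hypothetical stand-in for the arduino and IR pulse sensor, and mido's default output port is assumed.

```python
import random
import time
import mido

def read_bpm():
    # hypothetical stand-in for the arduino + IR pulse sensor (typically read over serial)
    return 60 + random.random() * 40

out = mido.open_output()   # assumes a default MIDI output port is available

for _ in range(64):        # one note per detected heartbeat
    bpm = read_bpm()
    velocity = int(min(max((bpm - 50) / 80, 0.0), 1.0) * 127)   # faster pulse -> louder
    out.send(mido.Message('note_on', note=60, velocity=velocity))
    time.sleep(60.0 / bpm)
    out.send(mido.Message('note_off', note=60))
```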

view project repo on github