The movement of the coordinates making up the line segments is based on parametric equations. Designed and developed in Processing and turned into a GIF using ImageMagick.
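As a rough illustration of the idea, the endpoints of each segment can be driven by parametric equations of time. The sketch below assumes Lissajous-style equations with hypothetical radius and frequency values; the actual equations in the Processing sketch may differ.

```python
import math

def endpoint(t, radius=100.0, freq_x=3.0, freq_y=2.0):
    """Hypothetical Lissajous-style parametric equation for one
    endpoint of a line segment at animation time t."""
    x = radius * math.sin(freq_x * t)
    y = radius * math.cos(freq_y * t)
    return x, y

# Sample the endpoint's path over a few animation frames.
path = [endpoint(frame * 0.1) for frame in range(5)]
```

Rendering a segment between two such endpoints, frame after frame, produces the flowing motion seen in the GIF.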
Though music production has been the core of my work, my passion resides at the intersection of technology and media. 3D graphics, physical controllers, and programming have been essential parts of this development. This page showcases some of my initial explorations in this realm.
You may find more of my programming projects on my GitHub account.
Adaptive Art Technologies is an initiative that aims to develop accessible devices that bring novel approaches and opportunities to creative expression, particularly for people with disabilities. This project is being developed in collaboration with programmer and data scientist Robert Fischer, and more recently, entrepreneur and developer Álvaro Ramirez joined our team.
We are currently completing our first prototype, an audio synthesizer that can be operated through alternate controllers. The first of these controllers is the SensorTile by STMicroelectronics, a sensor board whose inertial measurement unit (IMU) combines an accelerometer, a gyroscope, and a magnetometer, and which also provides Bluetooth Low Energy (BLE) connectivity, among other features. The second controller is based on computer vision via OpenCV and Google's MediaPipe library. As such, we have given this prototype the name STCV-Synth, and the code we are writing can be found on GitHub.
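Whichever controller is used, the core task is mapping a raw sensor reading (an IMU tilt angle, a tracked hand coordinate) into a synthesizer parameter range. A minimal sketch of that mapping, with hypothetical ranges that are not taken from the STCV-Synth code:

```python
def map_range(value, in_min, in_max, out_min, out_max):
    """Clamp a sensor reading and map it linearly into a synth
    parameter range (hypothetical mapping, for illustration only)."""
    value = max(in_min, min(in_max, value))
    return out_min + (value - in_min) * (out_max - out_min) / (in_max - in_min)

# e.g. a tilt angle in degrees driving an oscillator's pitch in Hz
pitch = map_range(-45.0, -90.0, 90.0, 110.0, 880.0)
```

The same helper works for a MediaPipe landmark's normalized 0 to 1 coordinate: feed it in as `value` with `in_min=0.0, in_max=1.0`.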
The following videos demonstrate this first prototype in action, as well as a rundown of the available settings and controls.
Another controller, which will be developed in more detail in the near future, relies on EEG data captured using a commercial EEG headset. I previously collaborated with neuroscientist and data scientist Benyamin Meschede-Krasa to create a synthesizer controlled using a Muse EEG Headset. Together, Benyamin and I wrote custom Python software to map neural bands to specific frequency ranges that are then assigned to oscillators for audio generation. The code can be found on GitHub. Media demonstrations of our experiments will be added to this section in the coming weeks.
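The band-to-frequency mapping described above can be sketched as follows. This is a hypothetical linear mapping for illustration, not the actual logic in our repository:

```python
def band_power_to_freq(power, f_min, f_max):
    """Map a normalized EEG band power (0..1) linearly into the
    frequency range assigned to one oscillator (hypothetical rule)."""
    power = max(0.0, min(1.0, power))
    return f_min + power * (f_max - f_min)

# e.g. alpha-band power driving an oscillator assigned the 220-440 Hz range
freq = band_power_to_freq(0.5, 220.0, 440.0)
```

Each EEG band (delta, theta, alpha, beta) would get its own oscillator and its own `f_min`/`f_max` pair, so the mix of bands shapes the resulting timbre.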
These two implementations will eventually come together into a unified interface with a more comprehensive set of controllers for creative interactivity. Once the audio portion is refined and better established, I plan to implement a video synthesizer driven by these controllers using OpenGL.
Series of illustrations exploring multidirectional displacement. The scripts are designed to have four ellipses spread in different directions based on permutations of the given displacement values. The script stops and saves an image once all four ellipses converge in the center.
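The stopping logic reduces to stepping each ellipse along its own displacement vector and testing for convergence. The sketch below assumes sign combinations of one displacement pair as the four directions; the actual permutations in the scripts may differ.

```python
# Hypothetical version of the ellipse logic: four ellipses, each moving
# along one sign combination of a single displacement pair.
DX, DY = 3.0, 5.0
DIRECTIONS = [(DX, DY), (-DX, DY), (DX, -DY), (-DX, -DY)]

def step(positions):
    """Advance each ellipse by its own displacement vector."""
    return [(x + dx, y + dy) for (x, y), (dx, dy) in zip(positions, DIRECTIONS)]

def all_converged(positions, tol=1e-9):
    """True once every ellipse has reached the center (0, 0)."""
    return all(abs(x) <= tol and abs(y) <= tol for x, y in positions)

# Start each ellipse two steps from the center, opposite its direction:
positions = [(-2 * dx, -2 * dy) for dx, dy in DIRECTIONS]
while not all_converged(positions):
    positions = step(positions)  # the real sketch redraws, then saves on exit
```

In Processing this loop would live in `draw()`, with `saveFrame()` called when the convergence test first passes.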
Additional variations trace basic geometric shapes, continuing these explorations of geometric outlines and programmatic generative approaches.
Developed with Processing.
MusiCodex is an ongoing instructional project. My vision is for it to eventually become a resource that musicians and students can use to better understand the inner workings and technologies of music production. The project implements various instructional design approaches to offer hands-on experiences, along with literature to ensure a practical and theoretical understanding.
The practical examples were designed in Max, by Cycling '74. I chose Max as the development platform because it lets any user interact with the patches without purchasing a license, while licensed users can also edit and repurpose them.
The modules in markdown format and the Max patches may be found in the MusiCodex GitHub repo.
The audio compression module explains how dynamics processors work and gives an overview of loudness as it relates to analog technologies. The Max compressor demonstrates signal interactions using a peak meter, an RMS meter, and a spectrogram.
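The core of any downward compressor is its static gain curve: below the threshold the signal passes unchanged, and above it the output rises only 1 dB for every `ratio` dB of input. A minimal sketch of that curve (attack, release, and knee are omitted; the Max patch is more complete):

```python
def compressor_gain_db(input_db, threshold_db=-20.0, ratio=4.0):
    """Static gain of a downward compressor, in dB (negative = attenuation).
    Above the threshold, output rises 1 dB per `ratio` dB of input."""
    if input_db <= threshold_db:
        return 0.0  # below threshold: unity gain
    over = input_db - threshold_db
    return (over / ratio) - over

# A -8 dB peak through a 4:1 compressor with a -20 dB threshold
# is 12 dB over, so it receives 12/4 - 12 = -9 dB of gain reduction.
gain = compressor_gain_db(-8.0)
```

Adding makeup gain and time constants (attack/release smoothing of `over`) turns this static curve into a full dynamics processor.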
Coded in C++ using OpenGL and SDL2, the 3D Sine Wave Visualizer lets users interact with 3D sine waves through an event handler.
3D Sine Line Render - Side to Side
3D Sine Fill Render - Side to Side
3D Sine Points Render - Center Out
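The surfaces in the renders above reduce to evaluating an animated sine function over a grid, either swept along one axis ("side to side") or by distance from the origin ("center out"). A minimal Python model of that height function, with hypothetical amplitude and wavelength values (the C++ visualizer's actual equations may differ):

```python
import math

def wave_height(x, z, t, amplitude=1.0, wavelength=4.0, radial=False):
    """Height of an animated sine wave over the XZ grid at time t.
    radial=False sweeps side to side along x; radial=True ripples center out."""
    d = math.hypot(x, z) if radial else x
    return amplitude * math.sin(2.0 * math.pi * d / wavelength - t)

# One row of the side-to-side mesh at t = 0.
row = [wave_height(x, 0.0, 0.0) for x in range(4)]
```

Evaluating this over the whole grid each frame, then drawing the points as lines, filled quads, or raw points, gives the three render modes shown.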
This performance system incorporates analog audio synthesizers (a Moog Subsequent 37 and a Eurorack synth), digital graphics (Processing 3), and Arduino boards and controllers. Cycling '74 Max 8 serves as the routing brain, and interfacing with the Eurorack synth is achieved through the Monome Crow module, which converts data sent by Max 8 over USB into control voltage (CV) signals. The Arduino controllers modify various aspects of the musical and visual performance, including tempo, waveshapes, octave, position of the rendered ellipse, and colors, by sending serial data over separate ports to Max, Processing, and the Moog Subsequent 37. GitHub Repository
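The serial side of this routing comes down to packing each control change into a compact frame and writing it to the right port. The sketch below assumes a hypothetical two-byte frame (parameter id, value); the actual protocol in the repository may differ.

```python
def encode_control(param_id, value):
    """Pack a (parameter, value) pair into two bytes for serial transport
    (hypothetical frame layout, for illustration only)."""
    if not (0 <= param_id <= 255 and 0 <= value <= 255):
        raise ValueError("both fields must fit in one byte")
    return bytes([param_id, value])

# e.g. a hypothetical parameter 3 (tempo) set to 120 BPM
frame = encode_control(3, 120)
# A host program would then write `frame` to a port, e.g. with pyserial:
# serial.Serial("/dev/ttyACM0", 9600).write(frame)
```

Max, Processing, and the synthesizer would each listen on their own port and decode the same frame layout.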
CPU renders using ray tracing, based on Peter Shirley's book Ray Tracing in One Weekend.
Reference: Ray Tracing in One Weekend – Peter Shirley
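At the heart of all the renders below is the ray-sphere intersection test from the book: substituting the ray O + tD into the sphere equation gives a quadratic in t, and the nearest positive root is the hit point. A compact Python version of that test (the book, and my renderer, use C++):

```python
import math

def hit_sphere(center, radius, ray_origin, ray_dir):
    """Solve |O + tD - C|^2 = r^2 for the nearest root, or None on a miss."""
    oc = tuple(o - c for o, c in zip(ray_origin, center))
    a = sum(d * d for d in ray_dir)
    half_b = sum(o * d for o, d in zip(oc, ray_dir))
    c = sum(o * o for o in oc) - radius * radius
    discriminant = half_b * half_b - a * c
    if discriminant < 0:
        return None  # ray misses the sphere
    return (-half_b - math.sqrt(discriminant)) / a

# A ray from the origin looking down -z toward a unit sphere at z = -3
# first touches the surface at t = 2.
t = hit_sphere((0, 0, -3), 1.0, (0, 0, 0), (0, 0, -1))
```

Everything else in the gallery, diffuse, metal, and dielectric materials, anti-aliasing, defocus blur, builds on this test by scattering new rays from the returned hit point.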
Render of spheres demonstrating diffuse, metallic, and dielectric materials, as well as defocus blur.
Sphere with anti-aliasing.
Sphere with diffuse material.
Sphere with diffuse material and all-angle uniform scatter.
Spheres with shiny metallic material.
Spheres with fuzzy metallic material.
Sphere with refracting material.
Hollow glass sphere.
Distant view of spheres with multiple materials.
Camera with defocus blur.
Simple demonstration of color interpolation using four vertices of different colors.
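The blend across the quad can be modeled as bilinear interpolation of the four corner colors: interpolate along the top and bottom edges first, then between those results. A small sketch of the math (the GPU actually interpolates per triangle, but the result over a quad with these corners is close):

```python
def lerp(a, b, t):
    """Linearly interpolate two RGB tuples."""
    return tuple(ca + (cb - ca) * t for ca, cb in zip(a, b))

def bilerp(c00, c10, c01, c11, u, v):
    """Bilinearly blend four corner colors at (u, v) in the unit square."""
    top = lerp(c00, c10, u)
    bottom = lerp(c01, c11, u)
    return lerp(top, bottom, v)

# The center of a quad with red, green, blue, and white corners:
center = bilerp((255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 255), 0.5, 0.5)
```

In OpenGL this happens automatically: assign one color per vertex and the rasterizer interpolates it across each fragment.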