Design and Development

Parametric Equations GIF

The movement of the coordinates that make up the line segments is driven by parametric equations. Designed and developed in Processing, and converted into a GIF using ImageMagick.
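As a rough illustration of the technique (not the actual Processing sketch; the Lissajous-style equations and constants below are assumptions), parametric equations can drive the endpoints of each animated segment:

```python
import math

def point_on_curve(t, a=3.0, b=2.0, radius=100.0):
    """Hypothetical parametric equations: a Lissajous-style curve.

    Each coordinate follows (x(t), y(t)) as t advances every frame.
    """
    x = radius * math.sin(a * t)
    y = radius * math.sin(b * t)
    return x, y

def segment(t, dt=0.05):
    """A line segment connecting two nearby points on the curve."""
    return point_on_curve(t), point_on_curve(t + dt)

# Animating means advancing t each frame and redrawing the segments.
frame_points = [point_on_curve(i * 0.1) for i in range(10)]
```

In Processing, the equivalent would live in `draw()`, with `saveFrame()` exporting the frames that ImageMagick stitches into the GIF.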

Though music production has been the core of my work, my passion resides at the intersection of technology and media. 3D graphics, physical controllers, and programming have been an essential part of my creative process. This page showcases some selected projects I have developed in this field.

You may find more of these projects on my GitHub.

Projects


Adaptive Art Technologies


Adaptive Art Technologies is an initiative that aims to develop accessible devices that bring novel approaches and opportunities to creative expression, particularly to people with disabilities.

iOS App – GitHub repo

Currently, the main initiative of Adaptive Art Technologies is the development of an iOS application entitled AAT Synth. The design is based on a modular approach that keeps the media engines separate from the input controllers, facilitating multiple approaches to creative expression. The main idea is to craft intuitive and accessible mechanisms for interacting with the system, opening the door for everyone to create a meaningful musical experience. The project implements an audio synthesis engine, a motion controller that relies on the built-in sensors of the iOS device, and a computer vision controller that relies on feature detection through the device's camera. Future efforts aim to expand the project with a video synthesis engine.
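A minimal sketch of that separation, with hypothetical names (the actual AAT Synth is an iOS app written against Apple frameworks; this only illustrates the modular idea of controllers publishing normalized values that any engine can consume):

```python
class Controller:
    """Input side of the modular design: anything that can produce a 0..1 value."""
    def read(self):
        raise NotImplementedError

class MotionController(Controller):
    """Stand-in for the device's motion sensors: maps tilt in degrees to 0..1."""
    def __init__(self, tilt_degrees):
        self.tilt = tilt_degrees

    def read(self):
        return max(0.0, min(1.0, (self.tilt + 90.0) / 180.0))

class SynthEngine:
    """Media side: consumes normalized values without knowing their source."""
    def __init__(self):
        self.cutoff = 0.0

    def apply(self, value):
        self.cutoff = value

def bind(controller, engine):
    """The glue layer: any controller can drive any engine."""
    engine.apply(controller.read())
```

Because the engine only sees normalized values, a camera-based controller could be swapped in without touching the synthesis code.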


The icon of the application was designed in collaboration with Elif Kavuşturan.

AAT Synth Overview

This video is the final presentation of my capstone project. It includes a description of the project as well as a demo of the synth in action.
Dec, 2023

The project is available as a TestFlight implementation that is being used for quality assurance and customer validation. If you're interested in trying out this AAT prototype, you may reach me at david@sound-architect.com.

Python Prototypes

A couple of instrument prototypes have been developed over the years, focusing on alternative controllers for driving an audio synthesizer engine. One of these instruments, the STCV-Synth, controls the synth engine via a wearable device and computer vision algorithms. The second, Neuromusic, controls the synth engine using an EEG headset.

STCV-Synth – GitHub repo

The STCV-Synth consists of an audio synthesizer that can be controlled using the SensorTile by STMicroelectronics, a wearable device that contains various sensors, including an inertial measurement unit (IMU) equipped with an accelerometer, a gyroscope, and a magnetometer. The SensorTile can transmit sensor data via Bluetooth Low Energy (BLE), making it an efficient and powerful wireless device.

The second controller is based on computer vision via OpenCV and Google's MediaPipe library. Features are detected from a camera data stream, and their on-screen coordinates are tested for collisions against GUI elements. This collision detection approach allows physical gestures to serve as a remote control for synthesizer settings.
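The collision test itself reduces to a point-in-rectangle check. A minimal sketch (the widget names and layout are made up; the real project tracks MediaPipe landmarks):

```python
def point_in_rect(px, py, rect):
    """Axis-aligned bounding-box test: is the detected feature inside the widget?"""
    x, y, w, h = rect
    return x <= px <= x + w and y <= py <= y + h

# Hypothetical GUI layout: widget name -> (x, y, width, height) in pixels.
widgets = {
    "volume_slider": (50, 50, 40, 200),
    "preset_button": (300, 50, 80, 80),
}

def hit_widgets(feature_x, feature_y):
    """Return the widgets the tracked feature (e.g. a fingertip) collides with."""
    return [name for name, rect in widgets.items()
            if point_in_rect(feature_x, feature_y, rect)]
```

Each camera frame yields fresh feature coordinates, so holding a fingertip over a widget can continuously update the corresponding synth parameter.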

The following videos demonstrate some of these prototypes in action, as well as a rundown of the available settings and controls.

STCV-Synth – Rundown

This video demonstrates the various functions of the STCV-Synth, including the parameters mapped to the SensorTile and the various GUI controllers operated via computer vision.
Dec, 2021
STCV-Synth – Demo

This demonstration presents the STCV-Synth in a performance setting, where I play the guitar and my dear friend and colleague Álvaro Morales plays our first prototype.
Dec, 2021
Neuromusic – GitHub repo

Neuromusic utilizes a consumer-grade Muse EEG headset to capture neural data. Using a combination of FFTs and AI, this data is processed into variable data streams, each corresponding to a specific neural band in an independent frequency range. These bands are assigned to oscillator and processor parameters for synthesizer control and music generation.

This project is being developed with neuroscientist and data scientist Benyamin Meschede-Krasa. Media demonstrations of our experiments will be added to this section in the coming weeks.

Future Plans

The modular approach in the design process aims to create a unified interface to connect a comprehensive set of alternative controllers to drive synthesizer engines. This process will begin by solidifying robust implementations of the wearable device and computer vision controllers, as well as improving the quality and resolution of the EEG controller. In terms of the generators, the audio engine will continue to be refined and optimized. Additionally, a video synthesizer will be developed using graphics frameworks, starting with OpenGL and slowly transitioning to Metal and Vulkan.

Other Contributors


DisOrgan


DisOrgan Installation - Main View

Developed in 2023, DisOrgan is an interactive installation that celebrates the impact of disability in shaping the landscape of innovation. Many everyday objects have their origins in disability: designers and inventors constantly seek solutions to support and enhance the lives of people they love, and communities of people with disabilities are at the forefront of innovative solutions that make the world more inclusive.

In collaboration with a team of talented artists, activists, and entrepreneurs, we featured this installation as part of the Sound Scene Festival, presented at the Smithsonian Hirshhorn Museum and Sculpture Garden. In particular, I contributed to this project as an installation designer and composer.


The DisOrgan installation consisted of two interactive musical instruments, the Cart and the Pipes. The code for these instruments can be found on GitHub.

The Cart

The Cart is a self-contained musical instrument that relies on six infrared proximity sensors to control a Max patch. The patch deploys the composition, which is based on the random playback of samples of everyday objects that have their roots in disability.
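In spirit, the sensor-to-playback mapping resembles this sketch (the sample names and threshold are placeholders; the actual logic lives in the Max patch):

```python
import random

# Hypothetical sample bank of everyday-object recordings.
SAMPLE_BANK = [f"sample_{i:02d}.wav" for i in range(8)]

def poll_sensors(sensor_values, threshold=0.5):
    """Trigger a randomly chosen sample for each sensor reading past the threshold.

    Each of the six values is a normalized proximity reading; a visitor's
    hand crossing a sensor launches one random sample from the bank.
    """
    return [random.choice(SAMPLE_BANK) for v in sensor_values if v > threshold]
```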

DisOrgan Installation - The Cart

The Pipes

The Pipes are a collection of simple cylindrical elements that play a single musical note. Each cylinder has a single ultrasonic proximity sensor that controls the loudness of its corresponding note. The cylinders are intended to be played by multiple people, revealing a hidden chord and implying that we can only hear the music collectively. The sound generation is achieved using a Daisy Pod, manufactured by Electro-Smith.
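The distance-to-loudness mapping on each pipe can be pictured like this (the range values and the linear curve are assumptions; the actual mapping runs on the Daisy Pod firmware):

```python
def distance_to_amplitude(distance_cm, near=5.0, far=80.0):
    """Map an ultrasonic distance reading to note loudness, clamped to [0, 1].

    A hand close to the pipe plays its note at full volume; beyond `far`
    the note is silent. Hypothetical range in centimeters.
    """
    if distance_cm <= near:
        return 1.0
    if distance_cm >= far:
        return 0.0
    return 1.0 - (distance_cm - near) / (far - near)

# One amplitude per pipe: the chord emerges only when several pipes are covered.
chord = [distance_to_amplitude(d) for d in (4.0, 30.0, 90.0)]
```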

The pipes were developed in collaboration with artist and technologist Alexander Wu.

DisOrgan Installation - The Pipes

After the completion of the 2023 Sound Scene Festival, I continued to contribute to the festival as a member of the 2024 Creative Advisory Committee.


Displaced Colors


A series of illustrations exploring multidirectional displacement. The scripts spread four ellipses in different directions based on permutations of the given displacement values, then stop and save an image once all four ellipses converge in the center.
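A rough model of the mechanism (the canvas size, the wrap-around motion, and reading the "permutations" as the four signed combinations of the displacement pair are all assumptions about the original sketches):

```python
WIDTH = HEIGHT = 400
CENTER = (WIDTH // 2, HEIGHT // 2)

def directions(dx, dy):
    """One reading of the 'permutations': the four signed combinations of (dx, dy)."""
    return [(dx, dy), (-dx, dy), (dx, -dy), (-dx, -dy)]

def step(positions, vectors):
    """Advance each ellipse, wrapping around the canvas edges."""
    return [((x + vx) % WIDTH, (y + vy) % HEIGHT)
            for (x, y), (vx, vy) in zip(positions, vectors)]

def converged(positions):
    """The sketch stops (and saves the frame) once every ellipse is back at center."""
    return all((x, y) == CENTER for x, y in positions)

# For X = 3 and Y = 2, count the frames until all four ellipses realign.
positions = [CENTER] * 4
vectors = directions(3, 2)
frames = 0
while True:
    positions = step(positions, vectors)
    frames += 1
    if converged(positions):
        break
```

With coprime displacement and canvas values, the ellipses can take hundreds of frames to realign, which is what gives each parameter pair its distinct texture.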

Additional variations outline basic geometric shapes, exploring geometric outlines through programmatic generative approaches.

Developed with Processing.

Line Equation
Line Equation - 1000 Lines Sketch 1
Line Equation - 500 Lines Sketch 2
Line Equation - 250 Lines Sketch 3
Line Equation - 125 Lines Sketch 4
Line Equation - 62 Lines Sketch 5
Line Equation - 28 Lines Sketch 6
X = 3 and Y = 2
x3 y2 Sketch 1
x3 y2 Sketch 2
x3 y2 Sketch 3
X = 5 and Y = 4
x5 y4 Sketch 1
x5 y4 Sketch 2
x5 y4 Sketch 3
X = 7 and Y = 3
x7 y3 Sketch 1
x7 y3 Sketch 2
x7 y3 Sketch 3
X = 3 and Y = 1
x3 y1 Sketch 1
x3 y1 Sketch 2
x3 y1 Sketch 3
X = 7 and Y = 4
x7 y4 Sketch 1
x7 y4 Sketch 2
x7 y4 Sketch 3
Square Pattern
Square Pattern Sketch 1
Square Pattern Sketch 2
Square Pattern Sketch 3
Grid
Grid Sketch 1
Grid Sketch 2
Grid Sketch 3

MusiCodex


MusiCodex is an ongoing instructional project. My vision is for it to eventually become a resource that musicians and students can use to better understand the inner workings and technologies of music production. The project implements various instructional design approaches to offer hands-on experiences, as well as literature to ensure both a practical and a theoretical understanding.

The practical examples were designed using Max, by Cycling '74. I chose Max as the development platform because it allows any user to interact with the patches without needing to purchase a license. Additionally, licensed users can edit and repurpose the patches.

The modules in markdown format and the Max patches may be found on the MusiCodex GitHub repo.

Compression
MusiCodex Compressor

The audio compression module explains how dynamics processors work and offers an overview of loudness as it relates to analog technologies. The Max compressor demonstrates signal interactions using a peak meter, an RMS meter, and a spectrogram.
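As a back-of-the-envelope illustration of downward compression (the threshold and ratio here are arbitrary examples, not the module's settings):

```python
def compressor_gain_db(level_db, threshold_db=-18.0, ratio=4.0):
    """Gain reduction applied by a downward compressor.

    Below the threshold the signal passes unchanged; above it, the output
    rises only 1 dB for every `ratio` dB of input. Returns the (negative)
    gain in dB to apply to the signal.
    """
    if level_db <= threshold_db:
        return 0.0
    compressed_level = threshold_db + (level_db - threshold_db) / ratio
    return compressed_level - level_db
```

For example, a -6 dB peak with a -18 dB threshold and 4:1 ratio is 12 dB over threshold, so it comes out at -15 dB, i.e. 9 dB of gain reduction.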


3D Sine Wave Visualizer



Coded in C++ using OpenGL and SDL2, the 3D Sine Wave Visualizer offers an opportunity to interact with 3D sine waves via an event handler, which allows you to:

  • Control the wave's amplitude, wavenumber, and wave period.
  • Move the camera in space.
  • Toggle the OpenGL render mode between points and triangle strip.
  • Apply post-processing effects using a frame buffer.
  • Enable wireframe mode for the geometry and for the frame buffer.
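The surface itself can be described by a simple equation. A sketch of one plausible form (the exact formulas in the C++ code are assumptions; a radial phase gives the "center out" motion, while a phase linear in one axis gives the "side to side" variants):

```python
import math

def wave_height(x, y, t, amplitude=1.0, wavenumber=0.5, period=2.0):
    """Height of the 3D sine surface at grid point (x, y) and time t.

    Radial form: the phase depends on distance from the origin, so crests
    ripple outward from the center as t advances.
    """
    r = math.hypot(x, y)
    return amplitude * math.sin(wavenumber * r - 2 * math.pi * t / period)

# Sample the surface over a small grid for one animation frame; the C++
# version would upload these heights as vertex data each frame.
grid = [[wave_height(x, y, t=0.0) for x in range(-5, 6)] for y in range(-5, 6)]
```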

GitHub Repository
Dec, 2020
Programmer  |  Designer
3D Sine - 1

3D Sine Line Render - Side to Side

3D Sine - 2

3D Sine Fill Render - Side to Side

3D Sine - 3

3D Sine Points Render - Center Out


Relationships between Images, Sound, and Physical Controllers



Incorporation of analog audio synthesizers (Moog Subsequent37 and Eurorack synth), digital graphics (Processing 3), and Arduino boards and controllers. Cycling '74 Max 8 serves as the routing brain, and interfacing with the Eurorack synth is achieved through the Monome Crow module, which converts data sent by Max 8 over USB into Control Voltage (CV) signals. The Arduino controllers modify various aspects of the musical and visual performance, including tempo, waveshape and octave, the position of the rendered ellipse, and colors. They do this by sending serial data over different ports to various destinations, including Max, Processing, and the Moog Subsequent37 synthesizer.
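The routing can be pictured roughly like this (the message format and route table are illustrative assumptions, not the actual Max patch or serial protocol):

```python
# Hypothetical route table: which destination handles each controller message.
ROUTES = {
    "tempo": "max",            # clocking handled inside Max 8
    "waveshape": "moog",       # forwarded to the Subsequent37
    "octave": "moog",
    "ellipse_x": "processing", # drawing parameters for the Processing sketch
    "color": "processing",
}

def route(message):
    """Dispatch a parsed (control_name, value) serial message to its destination.

    Unrecognized controls fall back to Max, the central routing brain.
    """
    name, value = message
    return ROUTES.get(name, "max"), value
```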

GitHub Repository
May, 2020
Composer  |  Sound Designer  |  Programmer  |  Designer


Back to top