
VR Laser Synth

Expressive VR Performance Instrument

Overview

Modern sound synthesis offers infinite sonic possibilities, yet digital instruments often lack the physical expressiveness of their acoustic counterparts. The Laser Synth is a Virtual Reality (VR) instrument developed as my Bachelor's capstone project and presented at the ICSC 2024 Csound Conference.

Unlike VR installations that rely on generative randomness, the Laser Synth is designed for intentional, traditional performance. Through spatial tracking, it creates an interactive interface that bridges human movement with the expressive possibilities of digital audio in a live performance context.

Instrument Control & Design

The instrument's architecture revolves around three objects: a stationary central crystal and two hand-held orbs. The lasers connecting these objects visualize the geometric relationships that drive the synthesis engine.

Pitch & Filter:

The distance between the crystal and the right orb controls pitch (quantized to 12-TET), while the distance between the two orbs modulates the filter cutoff frequency.
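As a minimal sketch of how such distance mappings might work (the function names, distance ranges, and note ranges here are illustrative assumptions, not the project's actual code), pitch can be derived linearly from the crystal-to-orb distance, while cutoff is often mapped exponentially so that equal hand movements feel perceptually even:

```python
def distance_to_midi(distance_m, min_d=0.1, max_d=1.0, low_note=48, high_note=72):
    """Map crystal-to-right-orb distance (metres) onto a continuous MIDI note range."""
    t = min(max((distance_m - min_d) / (max_d - min_d), 0.0), 1.0)
    return low_note + t * (high_note - low_note)  # continuous; quantized downstream

def distance_to_cutoff(distance_m, min_d=0.05, max_d=0.8,
                       low_hz=200.0, high_hz=8000.0):
    """Map orb-to-orb distance onto filter cutoff, exponentially for an even feel."""
    t = min(max((distance_m - min_d) / (max_d - min_d), 0.0), 1.0)
    return low_hz * (high_hz / low_hz) ** t

print(round(distance_to_midi(0.55), 2))  # midpoint of range -> 60.0 (middle C)
print(round(distance_to_cutoff(0.05)))   # orbs at closest -> 200 Hz
```

Clamping both mappings keeps the instrument stable when a hand drifts outside the playable range.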

XY-Pad Rotation:

The left orb's rotation functions as a 3D XY-pad, allowing simultaneous manipulation of parameters such as vibrato, reverb, and distortion.
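One plausible way to realize an orientation-based XY-pad (the angle span and parameter assignments below are assumptions for illustration, not the instrument's documented mapping) is to normalize two rotation axes of the orb into 0..1 control values:

```python
def axis_to_unit(angle_deg, span_deg=45.0):
    """Map a wrist angle in [-span_deg, +span_deg] degrees to a 0..1 control value."""
    return min(max((angle_deg + span_deg) / (2.0 * span_deg), 0.0), 1.0)

# Two rotation axes of the left orb become the XY-pad pair
# (hypothetical parameter assignments):
x = axis_to_unit(10.0)    # e.g. orb roll  -> vibrato depth
y = axis_to_unit(-20.0)   # e.g. orb pitch -> reverb mix
print(round(x, 3), round(y, 3))
```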

Laser Synth UI Elements
Crystal and two held orbs, with the interface displaying real-time tuner and XY-pad mapping.

Technical Implementation

The Laser Synth uses the CsoundUnity package to run Csound within Unity. This decouples the two concerns: Unity handles spatial logic and visual feedback, while Csound handles precise sound synthesis.
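Unity-to-Csound communication of this kind typically flows through named control channels (CsoundUnity exposes a channel-setting API on the C# side). The Python sketch below only illustrates the data-flow pattern, with hypothetical channel names; it is not the actual CsoundUnity API:

```python
class ControlBus:
    """Minimal stand-in for named control channels between game logic and audio."""
    def __init__(self):
        self._channels = {}

    def set(self, name, value):
        self._channels[name] = float(value)

    def get(self, name, default=0.0):
        return self._channels.get(name, default)

# Per frame: spatial logic writes, the synthesis engine reads.
bus = ControlBus()
bus.set("pitch", 60.0)     # derived from crystal-to-orb distance
bus.set("cutoff", 1200.0)  # derived from orb-to-orb distance
print(bus.get("cutoff"))   # -> 1200.0
```

The one-directional write/read split mirrors why the decoupling works: the audio engine never needs to know about VR geometry, only about a handful of named control values.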

The Csound code was prototyped in Cabbage, and tactile VR interactions were implemented using the Auto Hand VR physics system.

Laser Synth Architecture
Tool Architecture and Data Flow
Engine: Unity / VR
Audio Engine: Csound
Key Tools: Cabbage / Auto Hand

Key Outcomes

01

Natural Interaction

Demonstrated that spatial distance controls in VR offer more intuitive, theremin-like interaction than traditional knobs and sliders.

02

Visual Reinforcement

Developed a HUD with a 3D chromatic tuner and dynamic laser color-coding, giving the performer informative feedback during live performance.

03

Academic Recognition

Successfully presented at the International Csound Conference (ICSC) 2024, highlighting the intersection of VR and professional-grade synthesis.

Challenges & Solutions

Problem: 3D Pitch Accuracy

Performing melodies in invisible 3D space is difficult without haptic feedback or 'frets' to guide the hand.

Solution:

Implemented a 12-tone equal temperament (12-TET) quantization engine within the synthesis pipeline, giving the performer a margin of error.
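A 12-TET quantizer of this kind can be sketched in a few lines (this is an illustrative reconstruction, not the project's code): snap the continuous distance-derived MIDI value to the nearest semitone, then convert to frequency, so any hand position within half a semitone of the target still lands on the right note:

```python
def quantize_12tet(midi_note):
    """Snap a continuous MIDI value to the nearest semitone (12-TET)."""
    return round(midi_note)

def midi_to_hz(note, a4_hz=440.0):
    """Standard 12-TET MIDI-to-frequency conversion (A4 = MIDI 69)."""
    return a4_hz * 2.0 ** ((note - 69) / 12.0)

# A hand position slightly sharp of middle C still snaps to C:
print(quantize_12tet(60.4), round(midi_to_hz(quantize_12tet(60.4)), 2))  # -> 60 261.63
```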

Problem: Sound Design Testing

Iterating on Csound code required constant removal of the VR headset, which was cumbersome during development.

Solution:

Introduced in-VR dropdowns and sliders to adjust 10 key synthesis parameters in real time without leaving the environment.

In-VR Sliders
Sliders to adjust 10 parameters within VR

Future Work