Virtual Reality Boids is a VR musical playground experience created in Unity and SuperCollider3. The player is able to interact in real-time with a world populated by "bird-oids", or boids, a type of artificial life program. As the boids flock and fly through the world, they collectively and continuously shape a lush soundscape of synthesizers.
More technically, the visual patterns generated by the boids translate to musical patterns through OSC routing and SuperCollider magic. The boids' positional data (and relative distance to the player) is captured in Unity, then routed in real-time through OSC to SuperCollider, where it alters synthesizer parameters.
The key themes for this project are rule-based composition, human-computer interaction, and computer networking.
Boids (short for "bird-oid") are an artificial life program that simulates the flocking behavior of birds. Boid flocking is an example of emergent behavior: complexity arises from the interaction between agents that follow a set of simple, clearly-defined rules. (The behavior of cells in John Conway's classic Game of Life is another great example of emergent behavior.)
Boids must follow three rules (per Wikipedia); a minimal C# sketch follows the list:
- separation: steer to avoid crowding local flockmates;
- alignment: steer towards the average heading of local flockmates;
- cohesion: steer to move towards the average position (center of mass) of local flockmates.
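As a rough sketch (not Sebastian Lague's actual implementation, discussed below), the three rules can be expressed as vector math over a boid's local neighbors; the class and field names here are illustrative:

```csharp
using UnityEngine;

// Illustrative sketch of the three boid rules; per-rule weights and
// neighbor detection are omitted for brevity.
public class Boid : MonoBehaviour
{
    public Vector3 velocity;

    // Combine separation, alignment, and cohesion into one steering vector.
    public Vector3 Steer(Boid[] neighbors)
    {
        Vector3 separation = Vector3.zero; // steer away from crowding
        Vector3 alignment  = Vector3.zero; // match average heading
        Vector3 cohesion   = Vector3.zero; // move toward the center of mass

        foreach (Boid other in neighbors)
        {
            separation += transform.position - other.transform.position;
            alignment  += other.velocity;
            cohesion   += other.transform.position;
        }

        if (neighbors.Length > 0)
        {
            alignment /= neighbors.Length; // average heading of flockmates
            cohesion   = cohesion / neighbors.Length - transform.position; // toward average position
        }

        return separation + alignment + cohesion;
    }
}
```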
More on Boids:
- The original paper published by Craig W. Reynolds, the inventor of boids
- A Stanford paper about boids
- http://www.vergenet.net/~conrad/boids/
We did not write the Unity & C# implementation for the boids themselves, instead using Sebastian Lague's repository, found here, for several reasons:
- Programming this behavior turns out to be highly complex and requires deep familiarity with Unity. It felt out of scope for the project, as well as beyond our Unity programming abilities (this was our first time developing anything with Unity, VR, or C#).
- We had other, more pressing technical priorities: 1) implementing virtual reality on a Meta Quest 2 while integrating two separate platforms (SuperCollider and Unity); 2) implementing objects and scripts in Unity to fetch real-time positional data from the boids, including x, y, z coordinates and distance from the player (sketched below); 3) routing this information to SuperCollider through OSC with minimal latency; and, most importantly, 4) writing the SuperCollider code to receive data from OSC and create real-time, rule-based music that sounded convincing.
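For the second priority, the per-boid data capture can be sketched as a small Unity component; the `BoidTracker` class, its `player` field, and the hand-off to the OSC sender are illustrative assumptions, not the project's actual script names:

```csharp
using UnityEngine;

// Illustrative sketch: capture each boid's position and its distance
// from the player every frame, ready to be handed to the OSC sender.
public class BoidTracker : MonoBehaviour
{
    public Transform player;   // assigned in the Inspector (assumption)

    public Vector3 lastPosition;
    public float lastDistanceToPlayer;

    void Update()
    {
        lastPosition = transform.position; // x, y, z coordinates
        lastDistanceToPlayer = Vector3.Distance(lastPosition, player.position);
    }
}
```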
VR requires a series of packages installed through Unity's package manager, as well as various Unity-specific implementation details (e.g. creating a VR plane for the player to walk on, a VR camera object, etc.). Our project additionally made use of uOSC, a lightweight, community-maintained OSC extension, to send data from Unity to SuperCollider over OSC.
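As a sketch of the Unity-to-SuperCollider bridge, a script attached to each boid can send its data through uOSC's `uOscClient` component (its `Send` method takes an OSC address plus values); the `/boid` address and script name are illustrative, and the client would be configured with SuperCollider's default language port, 57120:

```csharp
using UnityEngine;
using uOSC;

// Illustrative sketch: forward each boid's position and player distance
// over OSC via a uOscClient pointed at SuperCollider's address and port.
[RequireComponent(typeof(uOscClient))]
public class BoidOscSender : MonoBehaviour
{
    public Transform player;
    uOscClient client;

    void Start()
    {
        client = GetComponent<uOscClient>();
    }

    void Update()
    {
        Vector3 pos = transform.position;
        float distance = Vector3.Distance(pos, player.position);
        client.Send("/boid", pos.x, pos.y, pos.z, distance); // one message per frame
    }
}
```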
In SuperCollider, OSC is natively supported and only requires setting up listener functions. One major challenge was sending and receiving this data with low latency, since each boid is mapped to a synthesizer and emits update information 60 times a second. The fix that worked here was to 1) cast the boids' positional data (floats) into integers before sending it through OSC; 2) reduce the number of updates per second to the synths in SuperCollider; and 3) plan the composition so that it reacted smoothly to this rate of updates (each synth updates its parameters every half a second) and magnitude of inputs (integers from -20 to 20).
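On the Unity side, the first two parts of the fix might look like the following sketch (the class name, `/boid` address, and exact clamping are illustrative): round each coordinate to an integer, clamp it to the world range of -20 to 20, and send at 2 Hz rather than every frame:

```csharp
using UnityEngine;
using uOSC;

// Illustrative sketch of the latency fix: integer-cast, clamped
// positional data sent every half second instead of 60 times a second.
[RequireComponent(typeof(uOscClient))]
public class ThrottledBoidSender : MonoBehaviour
{
    const float SendInterval = 0.5f; // matches the synths' update rate
    float timer;
    uOscClient client;

    void Start()
    {
        client = GetComponent<uOscClient>();
    }

    void Update()
    {
        timer += Time.deltaTime;
        if (timer < SendInterval) return; // throttle to 2 updates per second
        timer = 0f;

        Vector3 p = transform.position;
        client.Send("/boid",
            Mathf.Clamp(Mathf.RoundToInt(p.x), -20, 20),
            Mathf.Clamp(Mathf.RoundToInt(p.y), -20, 20),
            Mathf.Clamp(Mathf.RoundToInt(p.z), -20, 20));
    }
}
```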
- Open the "Boids Playground" through the Unity Hub, and connect a Meta Quest 2.
- Open "MusicalBoids.scd" in SuperCollider3.
- Evaluate the SuperCollider code.
- Run the Unity environment.
- Enjoy the VR Experience!