Live coding is the act of turning a programming session into a performance. It can encompass improvisation, music, visuals, poetry, hardware, robots, dance, textiles and people. Pretty much anything with an input and an output can be controlled live by programming.
This is not just a performance by programmers for programmers. While this is often where it starts for a live coder, the type of audience and the accessibility of the performance lie in the performer’s imagination. Abstraction can get us pretty much anywhere.
Repl Electric
Repl Electric is a project I started in order to discover more about music composition and artificial-intelligence-based aids to creativity. This, with the inspiration of people like Meta-ex, led me to live programming music.
Here is a performance live coding music and graphics, inspired by a performance in London:
The Stars
Open Live Coding
All the tools and code used to create this performance are open for all to see on Github: https://github.com/repl-electric
Three programming languages were used to create this piece:
- Clojure (Sound)
- GLSL (Visuals)
- Emacs Lisp (Animations & Navigation)
Tools
Here are the tools used and a little detail around how they were used in performing “The Stars”:
Clojure: http://clojure.org
Clojure is a Lisp that runs on the JVM.
Clojure focuses on interactive, REPL (Read, Evaluate, Print, Loop) driven development, which makes it a good choice for interactively coding music. It also turns out functional programming is a good fit for operating on music as data.
Emacs Live: https://github.com/overtone/emacs-live
Emacs Live is an Emacs release with packages and defaults that are live-coding centric, something I use both for my work and for my live coding.
To execute our code, we launch a REPL instance in our project (never launch it inside Emacs: if Emacs crashes, the REPL and the music die with it) and connect to it from Emacs using CIDER: https://github.com/clojure-emacs/cider.
A simple trick to combine Emacs code and visualizations is to launch an OpenGL window in full screen (see Shadertone) and then put a full screen transparent terminal window running emacs over it.
The tumbling text effect seen at the end of the performance is an Emacs animation using Zone Mode, which supports writing your own text destructors: http://www.emacswiki.org/emacs/ZoneMode
Overtone: https://github.com/overtone/overtone
Overtone is a Clojure-based client to SuperCollider. SuperCollider is an environment for real-time audio synthesis and algorithmic composition.
Overtone provides us with:
- Timing (beat generation – example timing code).
- Building Synths (engineer sound).
- Running samples (both your own and from Freesound).
- Live Synth control (changing notes, durations, reverb, etc).
- Hardware interaction (through midi or OSC).
An example of a synth used in The Stars:
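A minimal sketch of the shape such a synth takes, assuming illustrative names and parameters (this is not the exact synth from the performance): a defsynth declares controllable parameters, builds a signal chain, and writes to an output bus.

```clojure
;; Hedged sketch of an Overtone synth; illustrative, not the original.
(use 'overtone.live)

(defsynth space-pad
  "A slow pad: a detuned saw pair through an envelope and a low-pass filter."
  [note 52 amp 0.6 gate 1 cutoff 900 out-bus 0]
  (let [freq (midicps note)
        snd  (mix [(saw freq) (saw (* freq 1.005))])
        env  (env-gen (adsr 0.8 0.4 0.7 2.0) gate :action FREE)
        snd  (lpf snd cutoff)]
    (out out-bus (pan2 (* amp env snd)))))

;; (def s (space-pad :note 40))   ;; start it
;; (ctl s :cutoff 1800)           ;; live-tweak the filter while it plays
;; (ctl s :gate 0)                ;; release the envelope, freeing the synth
```

Every parameter in the argument vector can be changed while the synth is running, which is what makes this style of synth definition live-codable.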
Timing
Timing is a complicated issue, but it is so important it’s worth touching on. With Overtone you have a choice of doing your timing in Java or in SuperCollider. I use SuperCollider, since I have found it to be much more reliable. Everything you need is here (copy and paste), thanks to the hard work of Sam Aaron.
The key concept to take away is that there are two timing signals: a beat counter, which is forever incrementing, and a beat trigger, which flips back and forth between 1 and 0.
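As a hedged sketch (the bus and synth names here are my own, modelled on Sam Aaron’s approach), the two signals can be built on control buses like this:

```clojure
;; Hedged sketch of the two timing signals, modelled on Sam Aaron's code.
(use 'overtone.live)

(defonce beat-bus  (control-bus)) ;; the trigger: flips 1/0 on every beat
(defonce count-bus (control-bus)) ;; the counter: forever incrementing

;; impulse emits a single-sample 1.0 at the given rate (beats per second)
(defsynth root-trg [rate 2]
  (out:kr beat-bus (impulse:kr rate)))

;; pulse-count counts those impulses, giving an ever-growing beat number
(defsynth root-cnt []
  (out:kr count-bus (pulse-count:kr (in:kr beat-bus))))

(defonce trg (root-trg))
(defonce cnt (root-cnt))
```

Because both signals live on buses inside SuperCollider, every synth that reads them stays locked to the same clock, regardless of what the JVM is doing.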
The counter is useful for indexing buffers, the trigger is useful in controlling the gate of an envelope (which turns a sound on or off).
In Clojure we can still get access to the beat: in our timing code we send a message using send-trig on every beat, and we can hook a Clojure function up as a callback on it:
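A minimal sketch of that wiring, with illustrative names: a synth fires send-trig on each beat, and a Clojure handler listens for the server’s "/tr" OSC messages.

```clojure
;; Hedged sketch: report the beat from SuperCollider back to Clojure.
(use 'overtone.live)

(defonce beat-bus  (control-bus))
(defonce count-bus (control-bus))

(defsynth root-trg [rate 2]
  (out:kr beat-bus (impulse:kr rate)))

(defsynth root-cnt []
  (out:kr count-bus (pulse-count:kr (in:kr beat-bus))))

;; Fire a /tr OSC message on every beat, carrying the current beat count
(defsynth beat-reporter [trig-id 42]
  (send-trig (in:kr beat-bus) trig-id (in:kr count-bus)))

;; Hook a Clojure callback to those messages
(on-event "/tr"
          (fn [{:keys [args]}]
            (let [[_node-id _trig-id beat] args]
              (println "beat:" beat)))
          ::beat-watcher)

;; (root-trg) (root-cnt) (beat-reporter)
```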
I use this extensively to time graphic transitions with the music.
Buffers
Most of my live coding performance was writing to buffers which are hooked into synths. Buffers are just fixed-size arrays, but they are stored in SuperCollider rather than in Clojure. Here is an example from The Stars where the MIDI notes are read from a buffer at a rate based on my beat timing signal (a 16th of the main beat here).
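A hedged sketch of the idea, with illustrative names and a simple sine voice rather than the original synth: the buffer holds MIDI notes, the synth steps through it one slot per beat trigger, and rewriting the buffer from the REPL changes the melody live.

```clojure
;; Hedged sketch: a melody buffer held in SuperCollider, read in time
;; with a beat signal. Names and the sine voice are illustrative.
(use 'overtone.live)

(defonce note-buf (buffer 16))                 ;; 16-slot buffer in SC
(buffer-write! note-buf [60 0 63 0 67 0 70 0
                         60 0 63 0 67 0 70 0]) ;; MIDI notes; 0 = rest

(defonce beat-bus (control-bus))
(defsynth beat-pulse [bpm 120]
  (out:kr beat-bus (impulse:kr (/ bpm 60))))

(defsynth buf-player [buf 0 out-bus 0]
  (let [trg  (in:kr beat-bus)
        ;; step through the buffer one slot per beat trigger, wrapping
        idx  (mod (pulse-count:kr trg) (buf-frames:kr buf))
        note (buf-rd:kr 1 buf idx)
        gate (> note 0)                        ;; rests keep the gate shut
        env  (env-gen (perc 0.01 0.4) :gate gate)
        snd  (sin-osc (midicps note))]
    (out out-bus (pan2 (* env snd)))))

;; (beat-pulse) (buf-player :buf note-buf)
;; Live coding is then rewriting the buffer while the synth runs:
;; (buffer-write! note-buf [60 63 67 70 72 70 67 63 60 0 0 0 0 0 0 0])
```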
GLSL + Shadertone: https://github.com/overtone/shadertone
Shaders generate imagery directly on your Graphics Processing Unit rather than going through your CPU. Through a language called GLSL (which is C-like) we can express very simple functions which are called for every single pixel, generating complex visuals. Here is a simple extract from The Stars that generates all the small background dots:
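A hedged GLSL sketch in that spirit (not the original shader): tile the screen into cells, drop a pseudo-random dot into some of them, and twinkle each one over time. The iResolution and iGlobalTime uniforms follow the Shadertoy conventions that Shadertone provides.

```glsl
// Hedged sketch: a twinkling field of small dots, per-pixel on the GPU.
uniform vec2  iResolution;  // window size, provided by Shadertone
uniform float iGlobalTime;  // seconds since start, provided by Shadertone

void main(void) {
  vec2 uv = gl_FragCoord.xy / iResolution.xy;

  // Tile the screen into a 40x40 grid of cells
  vec2 cell = floor(uv * 40.0);
  vec2 pos  = fract(uv * 40.0) - 0.5;

  // Cheap per-cell pseudo-random value in [0,1)
  float rnd = fract(sin(dot(cell, vec2(12.9898, 78.233))) * 43758.5453);

  // Only ~20% of cells get a dot; each twinkles at its own rate
  float twinkle = 0.5 + 0.5 * sin(iGlobalTime * (1.0 + rnd * 3.0));
  float star    = smoothstep(0.05, 0.0, length(pos)) * step(0.8, rnd);

  gl_FragColor = vec4(vec3(star * twinkle), 1.0);
}
```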
For more examples of what’s possible with shaders, check out Shader Toy.
Shadertone is the Clojure library that provides a convenient way of running shaders from Clojure and for feeding in data about our synths. It provides access in your Shader to:
- Overtone’s volume (iOvertoneVolume)
- The frequency spectrum & audio waveform data (passed as a 2D texture: :textures [:overtone-audio])
To synchronize the graphics with the music I created a special Overtone synth which does not generate any sound; instead it feeds information in real time to my shader.
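A hedged sketch of such a silent synth, using Overtone’s tap to export values and Shadertone’s :user-data to expose them as shader uniforms. All names and the shader path are illustrative, not the originals.

```clojure
;; Hedged sketch: a synth with no audio output that taps timing signals
;; so Shadertone can feed them to the shader as uniforms.
(use 'overtone.live)
(require '[shadertone.tone :as t])

(defonce beat-bus (control-bus))
(defsynth root-trg [rate 2]
  (out:kr beat-bus (impulse:kr rate)))

(defsynth data-probe []
  (let [beat (in:kr beat-bus)
        cnt  (pulse-count:kr beat)]
    ;; taps publish values back to Clojure ~60 times/sec; note: no (out ...)
    (tap "beat"       60 beat)
    (tap "beat-count" 60 cnt)))

(defonce trg   (root-trg))
(defonce probe (data-probe))

;; Shadertone accepts tap atoms as custom shader uniforms:
(t/start "resources/shaders/the-stars.glsl" ;; illustrative path
         :user-data {"iBeat"      (atom {:synth probe :tap "beat"})
                     "iBeatCount" (atom {:synth probe :tap "beat-count"})})
```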
Inside our shader code:
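A sketch of the receiving side, assuming tapped beat values are exposed as uniforms named iBeat and iBeatCount (illustrative names): the shader just declares the uniforms and uses them like any other value.

```glsl
// Hedged sketch: tapped synth values arrive as ordinary uniforms.
uniform float iBeat;      // 1.0 on the beat, falling back towards 0.0
uniform float iBeatCount; // forever-incrementing beat counter

void main(void) {
  // e.g. pulse the whole frame's brightness on every beat
  float pulse = 0.5 + 0.5 * iBeat;
  gl_FragColor = vec4(vec3(pulse), 1.0);
}
```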
The other main way of controlling a shader from Clojure is using atoms.
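A minimal sketch, with an illustrative uniform name and shader path: Shadertone’s :user-data accepts a plain Clojure atom, and resetting the atom updates the uniform while the shader runs.

```clojure
;; Hedged sketch: drive a shader uniform from a Clojure atom.
(require '[shadertone.tone :as t])

(defonce color-mix (atom 0.0))

(t/start "resources/shaders/the-stars.glsl" ;; illustrative path
         :user-data {"iColorMix" color-mix})

;; Live coding a visual transition is then just swapping the atom:
(reset! color-mix 0.8)
```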
Hardware: Monome: http://monome.org
Something you don’t see in the video is that I’m using an 8x16 Monome. For this performance its primary function was as a visual aid, showing beat/measure information.
The hardware is driven by Clojure communicating with the Monome through a serial port: https://github.com/josephwilk/monome-serial/tree/protocols
Live Coding
Live coding music and graphics combines skills in sound engineering, 3D graphics, geometry, physics, music theory, composition, improvisation & hardware, to name a few.
It is difficult, and requires a lot of work and practice.
But of all the code I’ve written over the years, this is one of the things I’m most proud of. And I’m only at the beginning of discovering what’s possible.