“Show us your screens”.
A defining slogan of the Algorave community dedicated to creative coding: a hacker/DIY culture of open-minded individuals who like to create music with code. Everything you hear and see is the outcome of algorithms. The collective joy of having fun together, openness, and inclusiveness are important aspects of the Algorave community. Enthusiasts use open-source live coding environments like TidalCycles, SuperCollider, Gibber, and Sonic Pi, to name a few.
Algorave, algorithmic computer-based (live) coding for dance music, is relatively new. The term itself started as a joke. In late 2011, British academics and musicians Alex McLean and Nick Collins were on their way to a rave where they were to play their computer-generated music. In the car on the way to Nottingham, the term Algorave was coined.
Having started mostly in galleries and at academic seminars, algoravers now appear in big festival lineups. Dedicated events where people dance to music generated from algorithms are no longer rare. Even though artists experimented with algorithmic music in the past, live-coding-based performance is relatively new.
“Computer languages allow us fine-tuned expression. First, we mold a unique sound environment in code; then, human movement transforms and guides the musical processes in real-time: programmer and process unite through sound.”
Alex McLean and Adrian Ward, Slub
To be clear, Algorave isn’t just live coding. It should simply be music generated using computer algorithms that the performers have created or modified themselves. It is not restricted to any genre or BPM, as it is a method. And it opens the possibility of creating radically different music, music we have never experienced before.
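The algorithms behind such music can be strikingly simple. As an illustration (the function below is a generic sketch written for this article, not code from any particular live coding environment), here is the Euclidean rhythm algorithm, popular in live coding for spreading a given number of hits as evenly as possible across a bar:

```python
def euclid(pulses: int, steps: int) -> list[int]:
    """Spread `pulses` onsets as evenly as possible over `steps` slots,
    using a simple accumulator ("bucket") method."""
    pattern = []
    bucket = 0
    for _ in range(steps):
        bucket += pulses
        if bucket >= steps:
            bucket -= steps
            pattern.append(1)  # onset: trigger a sound on this step
        else:
            pattern.append(0)  # rest
    return pattern

# euclid(3, 8) yields a rotation of the famous tresillo rhythm:
# [0, 0, 1, 0, 0, 1, 0, 1]
```

TidalCycles ships this idea built in as its euclid pattern function, and Sonic Pi offers it as spread; in SuperCollider it is typically written in a few lines, much like the above.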
It was in 2013 that an Algorave event took place in Sydney, and Renick Bell, a Tokyo-based musician and one of many practitioners of live algorithmic code, joined his first event.
In order to understand the context better, we need to dive a little deeper. Algorave didn’t take the dance floor by storm; it took decades before enthusiasts were able to perform live algorithmic music. Computer performance of music was born in 1957, when an IBM 704 played a 17-second composition. The pioneer of computer music programming is Max Mathews. From the late 1950s, he pushed the progress of computer-based music. His most notable work is “MUSIC”, the first computer program to be widely used for sound generation.
In 1961, using an updated version, “MUSIC IV”, Mathews produced “A Bicycle Built for Two”, a digitally synthesized version of Harry Dacre’s “Daisy Bell.” If you are a science fiction fan, you will certainly remember it: in 2001: A Space Odyssey (1968), the computer HAL 9000 gives a memorable rendition of this song.
MUSIC was an extraordinary achievement, laying the groundwork for upcoming programming languages. However, for live coding practice, the computing power was not there yet. It would take several decades for technology to catch up.
Fast forward to the 1980s: the era of consumer-friendly computers. In 1985, Ron Kuivila gave the first live coding performance at STEIM, presenting “Watersurface”. He made the piece in the Forth Music Language, a programming language he developed together with David Anderson.
And the final important component: the Internet. It connected DIY communities and enthusiasts, who could share papers and open-source programs, find like-minded friends, and exchange ideas. Interconnected data transformed the real world.
“I started doing algorithmic music when I was in university back in 1995. At that time it wasn’t real-time. You had to write code, compile it, and then you could listen to it. And if it wasn’t right, you had to go back to the code and redo the whole process again.”
Bell is a computer musician and programmer. He has an interdisciplinary Bachelor’s degree from Texas Tech University and a Master’s degree in music technology from Indiana University. He is a teacher and a graduate of the doctoral program at Tama Art University in Tokyo, Japan. His current research interests are live coding, improvisation, and algorithmic composition using open-source software. He is the author of Conductive, a library for live coding in the Haskell programming language. He is the initiator of Algorave events in Hong Kong and of the first such events in Japan, where he currently lives.
Starting in 1999, Bell gradually began to abandon Windows for Linux, leaving Windows permanently in 2005. As a computer musician, he was looking for a proper coding environment. It was around that time that he discovered the live coding practitioner Dave Griffiths, a member of Slub, an algorave group formed in 2000 by Adrian Ward and Alex McLean, joined by Griffiths in 2005 and Alexandra Cardenas in 2017.
“I used to play in hardcore punk bands doing really aggressive music, and it was live. When I got into computer music, I could do a lot of things in the studio or through processing sound on the computer, but I still wanted to do live music because I like the atmosphere of live performances.
Later I was making drum and bass and more abstract electronic music. Drum and bass is a somewhat rigid dance music genre. I made a lot of these tracks and sent them to record companies with the hopes of putting out records. All of my efforts failed in the end, though. At the same time, I was putting out my electronic music on my own net label, which I run with Jason Landers.”
Bell started studying SuperCollider and programmed some applications that produced generative music. However, the interface could only be operated with the mouse, and it did not satisfy him as a tool. The problem was the GUI [Graphical User Interface], so he began to develop a language with a better interface. During development, he stumbled upon an article called “Hacking Perl in Nightclubs” by Alex McLean. It was innovative, fresh, and very inspirational. Bell realized that with a terminal and a text editor he could make music almost as well as with a GUI.
“I understood that creating a musical interface involved writing much more code than playing directly with the code itself: this allowed me to concentrate better on music. I simply didn’t need the graphical part of the interface. For me, the physical gesture is more limited in expressivity than the manipulation of symbols representing abstractions. While humans have learned to use a complex vocabulary of gestures to produce art, the real-time manipulation of text-based symbols may increase the range of what is expressible.
So I started doing live coding. The development time was much shorter. Furthermore, I began to understand that the use of symbols, phrases, and abstract thought, instead of gestures, was an approach with a lot of potential.”
In 2019, Bell was selected by Aphex Twin for The Warehouse Project in Manchester, UK, where he headlined the Archive Room stage.
“When I play, I have running processes that perform tasks automatically, which I don’t control. I set them up, start them, and then they work without my direct supervision. I can go back and reset them, but otherwise they work independently. This is the reason my library is called “Conductive”: it is as if you were conducting an orchestra of processes inside your computer.”
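This idea of autonomous players can be sketched in a few lines of Python. The sketch is purely illustrative: the Player class and its names are invented for this article, and Conductive itself is a Haskell library with its own, different design. Each player runs on its own thread, stepping through a rhythm pattern at its own tempo until the performer stops or resets it.

```python
import itertools
import threading
import time

class Player:
    """An autonomous process: once started, it steps through its pattern
    on its own clock, with no direct supervision from the performer."""
    def __init__(self, name, pattern, bpm, play=print):
        self.name, self.pattern, self.bpm = name, pattern, bpm
        self.play = play  # callback that "triggers a sound"
        self._running = threading.Event()

    def start(self):
        self._running.set()
        threading.Thread(target=self._loop, daemon=True).start()

    def stop(self):
        self._running.clear()

    def _loop(self):
        beat = 60.0 / self.bpm
        for step in itertools.cycle(self.pattern):
            if not self._running.is_set():
                return
            if step:
                self.play(f"{self.name}: hit")
            time.sleep(beat)

# The performer "conducts": start a player, let it run unsupervised, stop it.
events = []
kick = Player("kick", [1, 0, 1, 0], bpm=600, play=events.append)  # fast tempo so the demo ends quickly
kick.start()
time.sleep(0.45)  # the player works entirely on its own during this time
kick.stop()
```

The performer's role shifts from triggering every sound to starting, stopping, and retuning a population of such players, which is the "conducting" metaphor behind the library's name.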
A live coding event is like a surgery session: you see all the guts. It teaches the artist to be aware of the process and promotes transparency. The audience is there to listen and watch what the artist is doing. The patterns that artists follow are endless, and the reasoning behind them is very individual.
“I do often follow a pattern of a 130 bpm section followed by a 160 bpm section. Both are broken by ambient breaks. The system, if left alone to run, will not put the breaks in the same places as I do. It also doesn’t have any awareness of the audience. So it won’t make decisions to restrain drum patterns to increase repetition and therefore maybe encourage more dancing. Nor will it make things more intense in response to shouts from the audience.
It will also sometimes make a series of fast changes that I can’t do myself. And maybe it will make those changes more repeatedly over a period than I will allow if I’m controlling the system alone.
I’ve made the design of the system, so anything it does is a result of what I’ve designed it to do. We can’t give it too much of an appearance of agency: to say it will do so without me just means that the system follows the rules I’ve given it. Rules that are in some ways more dynamic than what I do when triggering changes by hand.
However, the performance is still a result of my design and interaction, just sometimes being controlled at a greater distance in time.”
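The section logic Bell describes, alternating 130 and 160 bpm passages broken by ambient interludes, can be mimicked by a small rule system. The sketch below is hypothetical (set_structure and its rules are invented for illustration, not Bell's actual code): left to itself, the rule system places breaks wherever its random draw dictates, with no awareness of the crowd, which is exactly the gap the performer fills by hand.

```python
import random

def set_structure(n_sections, seed=0):
    """Generate a sequence of (section_type, bpm) pairs: grooves alternating
    between 130 and 160 bpm, with ambient breaks inserted by a random rule."""
    rng = random.Random(seed)
    sections = []
    bpm = 130
    for _ in range(n_sections):
        sections.append(("groove", bpm))
        # The autonomous rule may or may not insert a break here; a human
        # performer reading the crowd would decide deliberately instead.
        if rng.random() < 0.5:
            sections.append(("ambient break", bpm))
        bpm = 160 if bpm == 130 else 130  # alternate tempo for the next groove
    return sections
```

Running `set_structure(4)` yields four grooves at 130, 160, 130, 160 bpm with breaks scattered among them by the seeded random rule.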
A notable collaboration between New Zealand-born producer Fis (Oliver Peryman) and Renick Bell started in 2017 with a commissioned performance for Berlin Atonal. Since then, Bell and Fis have developed a performance and recording research project called Emergence. The 18-track album draws on separate recording sessions held since 2017. Released biodigitally in partnership with Eden Reforestation Projects, “Emergence Vol. 1” will fund the planting of a minimum of 100 trees per sale.
“When things go wrong, you get a very interesting effect with the audience. Everything goes silent, you see the performer panicking, their mouse is shooting around the code scrolling up and down like, ‘What did I do?’ Then everyone starts looking at the screen and trying to help. I’ve been in performances where people will scream out, ‘You’ve missed a semicolon!’ ”
NeboGeo (Dave Griffiths)
“Algoravers often like to build things, to craft things, but also to share those things with others so that others get the chance to experience the fun parts of those creations or improve them. They also enjoy breaking things!
I also really think algoravers want to dance, and to find new ways to dance, and to dance with others, even if they are sometimes too shy to do so under the glow of the projectors.
I hope that our communities become close enough and reassuring enough that people will lose their shyness. Even if they don’t want to dance, it is perfectly acceptable to just study the projected code and enjoy the sounds. That tolerant aspect of algoraves is one of their strengths.”
In his performance “Live coding + text” for the “Telling Stories” festival, Bell works with text messages. In parallel with the generative music, he explains his system and practice, and how the openness of live coding is a response to our increasingly algorithmically mediated society. In this work, text evolves from technical to contextual symbols.
“The text appears as part of the interface to my musical performance. I relinquish direct control to the agents in my system and gain more freedom to talk with the audience while performing.”
Try to have a diverse lineup, thinking about e.g. gender, ethnicity, class, age, belief/non-belief, and education. A diverse lineup creates a diverse audience which leads to a diverse community and is beneficial to all. Additionally, it is also nice to see a range of different technologies and approaches to keep things moving.
from Algorave guidelines
Many aspects of our lives are becoming algorithmically mediated. Our everyday choices are carefully analyzed and calculated by algorithms: food, shopping, film recommendations. We are the augmented citizens of the future, but how far do our artistic reflections reach beyond consumption rituals?
The Algorave live coding community offers a glimpse of a utopian society. The way it ignores conventional and mainstream approaches is essential to that form: composing in the moment and sharing the fun. It is a “failure OK” culture, where the audience cheers you on if something goes wrong. It is a form of digital punk. Remember the 1997 cyberpunk film “Nirvana” by Gabriele Salvatores?
[Openness against bias, horizontal relationships against corporate greed!]
For more info visit
Renick Bell: www.renickbell.net
Algorave community website: www.algorave.com
If you find this article interesting, consider checking out What Is Synthetic Media.