Introduction
In 1932 Antonin Artaud published his manifesto “The Theatre of Cruelty”. It was the first time the phrase “virtual reality” had ever been used, and it described Artaud’s vision of a new form of modern theatre. The manifesto contains a passage concerning musical instruments:
MUSICAL INSTRUMENTS: They will be treated as objects and as part of the set. Also, the need to act directly and profoundly upon the sensibility through the organs invites research, from the point of view of sound, into qualities and vibrations of absolutely new sounds, qualities which present-day musical instruments do not possess and which require the revival of ancient and forgotten instruments or the invention of new ones. Research is also required, apart from music, into instruments and appliances which, based upon special combinations or new alloys of metal, can attain a new range and compass, producing sounds or noises that are unbearably piercing.
It is unlikely that Artaud foresaw that advances in technology would allow the creation of a new space, the metaverse, where his concepts would become intrinsic features. His vision resonates today as this new form of art formulates its expressive language. In virtual reality, every part of the set can become a musical instrument. It can be an object or the space itself, as the spatial arrangement of sounds is also a method of music notation, one that a performer can move through and in that way perform. Virtual space can be shared by multiple users, so ultimately it is possible to join in collective performances.
Dawn of the New Instruments
The year 1992 marks the beginning of the history of Virtual Reality Musical Instruments (VRMIs). It started when Jaron Lanier – CEO of VPL, producer of early VR headsets, and one of the founding fathers of Silicon Valley – presented his performance “The Sound of One Hand” at the SIGGRAPH conference in Chicago. He improvised on instruments built entirely within virtual reality. Wearing an EyePhone headset and a DataGlove hand-tracking interface, he manipulated his CyberSax, Rhythm Gimbal, and CyberXylo, producing eerie synthetic sounds. The audience could watch his point-of-view projection on a large screen placed behind him on the stage. This historic event opened a discussion about the possibility of performing not on physical musical instruments but on instruments created entirely in virtual space.
After the downfall of VPL, VR entered a period of decline lasting almost twenty years, until 2013, when Oculus released its development kit headset and virtual reality began regaining interest from coders and artists around the world.
In the years between Lanier’s era and Oculus, there were some interesting experiments with 3D space and music based on 2D projections. One of them is Tarik Barri’s “Versum” from 2009, an audiovisual environment for composing and performing. It is operated and projected via a 2D screen, but it features audiovisual objects arranged in a 3D space in which the user (composer, performer, or spectator) can float freely. This approach uses space as a canvas for the composition, which can be explored along different paths, so that each run creates a meta-composition. There are no means of interaction other than exploration.
Barri’s high-level coding skills allowed him to create a sophisticated bespoke environment, but a simpler way to gain creative access to the metaverse was to use already existing tools – an approach rooted in the game modding scene. Since 2007, the Avatar Orchestra Metaverse (AOM) has been using the Second Life platform to perform collectively in the virtual world. AOM was founded by composers Hars Hefferman and Maximillian Nakamura and has over the years included such artists as Tina Pearson, Norman Lowrey, Pauline Oliveros, Jeremy Owen Turner, Bjorn Eriksson, Andreas Mueller, Miulew Takahe, Leif Inge, and others. Second Life allows users to create custom objects. Using that feature, AOM members created instruments such as the Aviaphone and the Onomatophone, which can be operated via HUDs – interfaces displayed on top of the Second Life application window.
Many of today’s metaverse environments, such as VRChat, AltspaceVR, the upcoming Facebook Horizon, and even Fortnite, feature such modularity, allowing users to create their own worlds. That relates to Lanier’s concept of phenotropic programming, whose basic idea is to code while the code is running – and in virtual space, this also means coding while immersed in the environment being coded. There are now many artistic groups that use existing virtual environments to perform collectively. This has one further advantage: active platform users are a potential audience, and some of these platforms have multi-million user bases around the world.
VRMIs Today
Lanier’s general concepts about VR are still valid. He pointed out that the most intriguing aspect of VR is its engagement of the human sensorimotor loop. As a musician himself, he stated that musical instruments are the greatest interfaces of all. Research on this topic is growing as the next generation of VR technology has emerged in recent years with more accurate sensors, yet the result remains far from, or simply very different from, actual reality. There are some interesting experiments in which VRMIs try to mimic well-known instruments such as the piano or percussion, but the lack of physical feedback makes them mere imitations of their originals. The best results in this area were achieved by Robert Hamilton. In 2018 he presented Coretet, a VRMI that translates the mechanics of bowed instruments into virtual reality. The characteristics of Coretet’s sound response provide aural feedback close to that of actual string instruments. It features a client-server architecture, so multiple performers can play together in a shared VR space. In the piece “Trois Machins de la Grâce Aimante”, Coretet is played by four musicians, as in a classical string quartet.
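A minimal sketch of that client-server pattern, assuming a simple UDP relay and an illustrative wire format (my reconstruction, not Coretet’s actual protocol): each performer sends gesture packets to a server, which rebroadcasts them so every client can render all the instruments in the shared space.

```python
import socket

SERVER_ADDR = ("0.0.0.0", 9000)

def run_relay():
    """Rebroadcast every incoming packet to all other known clients."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(SERVER_ADDR)
    clients = set()                          # every sender becomes a subscriber
    while True:
        packet, addr = sock.recvfrom(1024)   # e.g. b"bow 0.42 0.87 0.10"
        clients.add(addr)
        for client in clients:
            if client != addr:               # echo to everyone else
                sock.sendto(packet, client)

if __name__ == "__main__":
    run_relay()
```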
Hamilton’s previous VRMI work was Carillon, created with Chris Platz in 2015. Carillon is a networked immersive musical performance environment controlled with gesture and motion by two performers immersed in VR, accompanied by the Stanford Laptop Orchestra. In this work, the VRMI is a large structure controlled from a distance by the two performers, whose gestures are more “magical”, in contrast to the simulation of acoustic instruments in Coretet.
Between these two directions – the magical and the mimetic – emerged the idea for an instrument that my colleagues Jakub Wróblewski and Andrei Isakov and I created in 2019. That was Monad, an instrument with a quasi-organic form that responds visually to the actions of the performer. In the premiere piece for Monad, called Connexion, it was used to pan sounds spherically around the audience. The audience was surrounded by an eight-channel sound system, so it could be immersed in sound coming from every direction. The panning was directly correlated with the positions of the performer’s hands as he moved them around the Monad. Monad was also connected to granular synthesis, so moving the hands closer to or further from the sphere influenced LFOs and the length of the sound grains. The scenography of the performance was the same as the one used by Lanier in his premiere of “The Sound of One Hand”: a big screen behind the performer showing his point-of-view projection. The reception of Connexion was very positive, but in conversations after the concert some spectators said they would have preferred to watch the performance from inside VR, alongside the performer.
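The mappings behind Connexion can be sketched in a few lines of Python. The fragment below is an illustrative reconstruction, not the production code: the hand’s direction from the sphere’s centre drives amplitude gains over eight speakers, and its distance from the sphere’s surface drives grain length and LFO rate. The speaker layout, ranges, and curves are my assumptions.

```python
import numpy as np

# Hypothetical eight-speaker layout: unit vectors toward the corners of a cube.
SPEAKERS = np.array([[x, y, z] for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)],
                    dtype=float)
SPEAKERS /= np.linalg.norm(SPEAKERS, axis=1, keepdims=True)

def pan_gains(hand_pos, sphere_center):
    """Per-speaker amplitude gains from the hand's direction around the sphere."""
    direction = hand_pos - sphere_center
    direction = direction / max(np.linalg.norm(direction), 1e-6)
    gains = np.clip(SPEAKERS @ direction, 0.0, None)  # favor speakers the hand points at
    return gains / gains.sum()                        # keep total level constant

def grain_params(hand_pos, sphere_center, sphere_radius=0.15):
    """Map distance from the sphere's surface to grain length and LFO rate."""
    dist = max(np.linalg.norm(hand_pos - sphere_center) - sphere_radius, 0.0)
    grain_len = 0.02 + 0.5 * min(dist, 1.0)  # 20 ms at the surface, longer at arm's length
    lfo_rate = 8.0 / (1.0 + 4.0 * dist)      # faster modulation near the sphere
    return grain_len, lfo_rate
```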
These examples show possible directions for the next VRMIs. They can be environments filled with multiple responsive musical objects that produce sounds by themselves or as a result of interaction or external data streams. They can work over a network, allowing multiple performers and spectators to join in collaborative VR performances. They can be modular, allowing users to build them while immersed inside. One environment that seems to merge those features is PatchXR, a spatial equivalent of visual programming environments such as Max/MSP that allows users to build their own audiovisual instruments from code blocks. Patches – the visual programs – are built by connecting different types of blocks with pipes and wires, which represent the flow of events. That makes them look like futuristic mechanisms. The PatchXR team is led by Mélodie Mousset, Edo Fouilloux, and Christian Heinrichs, experienced artists and creative engineers from the VR industry.

In the summer of 2020 they organized the Patchathon, a game jam during which artists created compositions using PatchXR; the resulting works were presented during the A MAZE festival in Berlin. I had the pleasure of being one of the creators participating in this event. During a week of intense work with the PatchXR developers, I had the opportunity to delve into the capabilities of the system. Interestingly, each of the invited artists represented a different field of art – visual artists, scenographers, live coders, and composers – which resulted in very diverse works and showed how syncretic VR is. My composition was based on the concept of Monad. I used a sphere as the base concept for the controller, which was then used for a live performance. It was built from multiple slider objects, each in the form of a stick with a ring around it. These controllers were connected to different parameters of sound synthesis: by moving the controllers’ rings I could manipulate, for example, LFOs, wave shapes, sequencer tempo, pitch, and volume.
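That kind of slider-to-parameter mapping is easy to express in text form. The Python sketch below is only an analogy for what such a patch does; the parameter names, ranges, and curves are my illustrative assumptions, not PatchXR’s actual block set.

```python
# Hypothetical mapping from normalized ring positions (0.0-1.0) to synth
# parameters; perceptual quantities (pitch, tempo, LFO rate) scale exponentially.
MAPPINGS = {
    "lfo_rate_hz": lambda v: 0.1 * (100.0 ** v),      # 0.1 Hz .. 10 Hz
    "wave_shape":  lambda v: v,                       # morph 0..1 between waveforms
    "tempo_bpm":   lambda v: 60.0 * (2.0 ** (2 * v)), # 60 .. 240 BPM
    "pitch_hz":    lambda v: 55.0 * (2.0 ** (5 * v)), # 55 Hz .. 1760 Hz (5 octaves)
    "volume":      lambda v: v ** 2,                  # simple perceptual taper
}

def apply_rings(ring_positions):
    """Convert a dict of slider-ring positions into concrete synth parameters."""
    return {name: MAPPINGS[name](v) for name, v in ring_positions.items()}

print(apply_rings({"pitch_hz": 0.5, "volume": 0.8}))
```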
VRMIs in the Future
The technology of virtual reality has many limitations. One of them is limited tactile response. Yet the physical response of objects is one of the most important elements of playing a musical instrument. If I press a key or pluck a string, I can feel it; when I strike a snare drum, my drumstick recoils from it. If we use our bodies to manipulate objects, this tactile response is necessary to gain full control over them. I think this is why VPL used a glove as a controller – it was possible to equip it with a tactile feedback interface. Another way of achieving such feedback would be a brain-computer interface that generated these sensations directly; perhaps that would allow the design of the most amazing instruments. The question, however, is whether the transhumanist approach is even an option. I would not dare to answer this now.
Orchestras In The Metaverse
Artaud’s concepts about the organization of the performance space are surprisingly close to the way today’s virtual reality experiences are deployed.
THE STAGE–THE AUDITORIUM: We abolish the stage, and the auditorium and replace them by a single site, without partition or barrier of any kind, which will become the theater of the action. A direct communication will be re-established between the spectator and the spectacle, between the actor and the spectator, from the fact that the spectator, placed in the middle of the action, is engulfed and physically affected by it. This envelopment results, in part, from the very configuration of the room itself. Thus, abandoning the architecture of present-day theaters, we shall take some hangar or barn, which we shall have reconstructed according to processes which have culminated in the architecture of certain churches or holy places, and of certain temples in Tibet. In the interior of this construction special proportions of height and depth will prevail. The hall will be enclosed by four walls, without any kind of ornament, and the public will be seated in the middle of the room, on the ground floor, on mobile chairs which will allow them to follow the spectacle which will take place all around them. In effect, the absence of a stage in the usual sense of the word will provide for the deployment of the action in the four corners of the room.
Live music performances and festivals are among the branches of the cultural world most affected by the coronavirus. On the other hand, this has also catalyzed many experiments with using virtual worlds as a stage for music events; in 2020, sci-fi visions materialized in the face of pandemic lockdowns. In June 2020 Jean-Michel Jarre and VRrOOm organized a live concert on the VRChat platform. It was an interesting experience and a test of the current state of metaverse music concerts and of the VRChat platform itself. There was huge interest from the audience, so VRrOOm engineers had to run multiple instances of the concert room, as VRChat limits the number of participants to 40 users per room. The caveat was that Jean-Michel Jarre’s avatar was present only in the first room, which was reserved for official guests and the VRrOOm team. In the other rooms there was a flat-screen projection of the avatar, which was a bit disappointing.
Anyway, it was pure chaos, but in a good and uncanny sense. Strange avatars roamed around (someone used a model of a large motorbus as their avatar), and some were able to jump into restricted areas on the stage, blocking the projection of the artist’s avatar. The servers lagged heavily, overflowing with new participants, and I had to restart the application a couple of times to rejoin the event as the framerate dropped to a single frame per second. Apart from the technical difficulties, it felt a lot like a live event, being surrounded by other people in their digital skins. The biggest issue was the lack of the artist’s presence; and even if he had been present, he would still not have been using Virtual Reality Musical Instruments for his performance. The music was streamed from a standard concert setup into a virtual speaker positioned in the middle of the stage in the VR world. The event was watched by 2,500 people inside VR and 600,000 via the YouTube stream, which was a big success.
Earlier this year, rapper Travis Scott used the Fortnite game platform for a series of concerts called Astronomical. In contrast to Jean-Michel Jarre’s performance, where the avatar was live-tracked, this was a played-back animation inside the game engine, so the only truly live element was the presence of Fortnite users at the virtual venue. It reached a record number of over 12 million viewers.
Another big gaming platform, Minecraft, was used to host Block by Blockwest, a virtual music festival featuring music from groups such as Grandson, Idles, Pussy Riot, The Wrecks, and Sir Sly. The festival took place in June on a stage built within the Minecraft game and deployed on a server. The music, performed live by the bands, was streamed into the game. As the organizers say, the biggest success of this event was the community it gathered, which still communicates on the Discord server and awaits more events to join.
The 2020 Burning Man also took place in the multiverse, using the AltspaceVR platform. Black Rock City was recreated in the virtual world with its art installations, temples, and performance sites. Again, the music performances were captured as 2D video and streamed with sound into the virtual room, so there was no interaction with the audience or with any other virtual objects.
These examples show that organizing a live music concert in virtual space is currently a game of compromises. There is no dedicated platform for it, and using game environments or social virtual reality platforms comes with many challenges. It seems that the most successful in this area is still the veteran Avatar Orchestra Metaverse collective with their adoption of the aging Second Life. They use virtual space effectively, with a wise choice of aesthetics and a great feeling for this new medium. I think that a combination of VRMIs and sound streamed from acoustic or electronic instruments into VR will ultimately give sound artists and musicians a full range of expression. So far, no platform can handle this and still support a numerous avatar audience.
One step towards a realistic sound environment for live concerts is a model designed by the Zylia company. It allows sampling sound from large areas, such as a concert hall, in a way that permits moving around within the soundfield with six degrees of freedom. Using a large microphone array, which can be spaced evenly or unevenly, the sound is captured into an ambisonic format. Each microphone captures a single sphere around it, and in the system these spheres are placed at their respective positions. Walking within acoustically captured sound has not been done before. If we could stream live concerts in this format, it could change the way we experience music performances via digital devices. Immersion is all about being present, and such presence in the sonic environment is the most advanced form we can currently achieve.
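To make the idea concrete, here is a minimal sketch of soundfield navigation by interpolation – my illustrative reconstruction, not Zylia’s actual algorithm. It assumes several first-order ambisonic captures at known positions and blends them with inverse-distance weights as the listener moves between the spheres.

```python
import numpy as np

def blend_captures(listener_pos, capture_positions, capture_signals, power=2.0):
    """Inverse-distance-weighted blend of ambisonic captures at a listener position.

    capture_signals: list of B-format blocks, each of shape (4, n_samples)
    for first-order ambisonics (W, X, Y, Z channels).
    """
    positions = np.asarray(capture_positions, dtype=float)
    dists = np.linalg.norm(positions - np.asarray(listener_pos, dtype=float), axis=1)
    weights = 1.0 / np.maximum(dists, 1e-3) ** power  # nearby spheres dominate
    weights /= weights.sum()
    return sum(w * sig for w, sig in zip(weights, capture_signals))
```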
Final Thoughts
We are on the eve of a transformation from social networks to the metaverse, as extended reality gets ready to become the next computing platform. If the COVID-19 pandemic is the beginning of years of isolation, this transition will happen even faster. For a virtual reality artist, this scenario is perversely a dream come true, as the audience equipped with their own devices will grow steadily. There will be a need for new repertoire and for metaverse adaptations of the classics. Institutions will follow this trend, as in-person audiences will not be allowed in full numbers into concert halls, operas, and music clubs. Every respected music venue will have its metaverse counterpart. Bands will equip their studios with a virtual reality production pipeline and with VRMIs as well. The metaverse will rise in the music industry as streaming services did in former years. The analog, flesh-and-bone scene will still have its audience, but it will be much more expensive to go to a concert than just to log into it. Is this our inevitable future?
LINKS
Jaron Lanier: http://jaronlanier.com
Tarik Barri: http://tarikbarri.nl
Avatar Orchestra Metaverse: http://avatarorchestra.blogspot.com
Robert Hamilton: http://homepages.rpi.edu/~hamilr4
PatchXR: https://www.patchxr.com
Citations from Artaud come from: Antonin Artaud, “The Theatre and Its Double”, translated from the French by Mary Caroline Richards, Grove Press, New York, 1958.
About the author
Przemysław Danowski – interdisciplinary artist, performer, and composer based in Warsaw, Poland. Teaching and research assistant in the Sound Engineering Department at the Fryderyk Chopin University of Music. Guest lecturer at the Visual Narratives Laboratory of the Lodz Film School and at the Academy of Fine Arts in Warsaw. Spatial audio expert and designer of interactive audio. Member of the INEXSISTENS creative collective along with Jakub Wroblewski and Andrei Isakov. His latest works include immersive music documentaries, interactive 6DoF VR experiences, and immersive journalism.
For more details check:
Web | https://blog.przemekdanowski.com/
Polyphony | Wielogłos | http://fb.me/polyphonyVR
UMFC VR | virtual exhibition | http://fb.me/umfcvr
—–
In collaboration with Adam Mickiewicz Institute