ESP32 and accelerometers

This post focuses on setting up an ESP32 microcontroller with accelerometers for movement tracking. I will share the code and Max/MSP patch that I used to connect the device through WIFI. It will be a quick run-through of the material. I tend to research code examples online, then edit them for my purpose. My understanding develops as I figure out and refine the interactions that I want.  Hopefully this guide will provide you with the resources to get started in the same way.  In the future, I might break this down into smaller topics.

You will need a little knowledge of coding and circuits.  If microcontrollers such as Arduino are completely new to you, I would recommend exploring some easy projects first.  There are a number of tutorials online.  The Paul McWhorter ones found here were how I started; they quickly get you up to speed with the basics of creating simple circuits.  Learning from tutorials is also a great way to inspire new ideas for projects.

What is an ESP32?

One of the key interactions of Turning Mvt into Sound was the use of ESP32 circuit boards. They are microcontrollers with WIFI and Bluetooth built in. Similar devices are the Arduino and Raspberry Pi, but there are many others. I have used Arduino boards before, and they work the same way as the ESP32. The ESP32s are a little cheaper, which is why I chose them for this project. One of the things that surprised me was how stable they were on the WIFI.  We easily connected to the hire space WIFI at Chapter Arts Centre each day.  They were easy to use and not too bad to set up.

In the future, I want to connect them with Bluetooth. This is because I had a project where the WIFI was public and required a login and password through a registration page, and I couldn't get the boards to connect to it. I couldn't figure out how to code that, though I'm sure it is possible. Bluetooth communication would avoid WIFI connection issues.

Documentation by: Alastair Gray

The Components

I bought these versions of the ESP32. I have heard that earlier versions have quirks when trying to upload code to them. This hasn’t been my experience.

The lithium batteries can be bought here. They plug into the device to charge.  They are a little bit bulky but they worked well and we didn’t have issues with running out of charge.  I’m sure there are more compact ones out there if that is a priority for you.   

These are the MPU 6050 chips.  They detect the rotation of the sensor.  This was an interesting change from tracking position, especially when it is attached to an arm.  You think about the movement of the arm, but when you are considering the input from the MPU, it is the rotation that you are really thinking about. 

Using an Arduino or microcontroller means that you will need some ability to solder components or use a breadboard.  I had experience from GCSE technology at school, which was about 20 years before I began this project, so you don't need to be an expert.  My results may make a professional recoil in disgust, but they worked and have not fallen apart.

The soldering that you have to do is a bit fiddly, but the connections are simple.  I wanted to use my chips just as motion controllers, so I first soldered the headers that come with the MPU 6050 to the board, then soldered wires onto the headers and straight onto my ESP32 boards.  I have a feeling this is something else that professionals may not like, but it worked for me.

The pins that you connect are:

MPU 6050 ——> ESP32

VCC ——> VCC

GND ——> GND

SCL ——> SCL (IO22)

SDA ——> SDA (IO21)

Programming the ESP32

Arduino code is written in C++. It may seem a bit daunting at first, but there are lots of tutorials and support online to guide you.  I have now created a number of projects with Arduino and wouldn't say that I know C++ well, but I am confident working alongside resources to make the projects that I set out to do.

The first thing that you need to do before getting the code to work is install libraries. These add functionality to the ESP32 code. This part is slightly hazy in my memory as I write this, because I tried out a lot of different libraries and ways of working before getting the complete project running. Because of this, I am including some external tutorials on library installation to help guide you.

The ESP32 board package that I have got working is esp32 by Espressif, or the one by DFRobot. This can be installed from the Boards Manager (Tools —> Board —> Boards Manager), after which you can select FireBeetle 2 ESP32-E from the board selection. Then select your port. If you get an 'exit status 2' error, you may need to change the upload speed to 115200.

The Adafruit MPU6050 library means you can read data from the MPU6050. More detailed instructions can be found here.
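Before adding any networking, it is worth checking that the sensor is wired up and talking. The following is a minimal test sketch based on the standard Adafruit MPU6050 examples rather than my project code; it simply prints the accelerometer readings to the Serial Monitor.

    // Minimal MPU6050 test, adapted from the standard Adafruit_MPU6050 examples.
    // Upload it, open the Serial Monitor at 115200 baud and rotate the sensor.
    #include <Adafruit_MPU6050.h>
    #include <Adafruit_Sensor.h>
    #include <Wire.h>

    Adafruit_MPU6050 mpu;

    void setup() {
      Serial.begin(115200);
      if (!mpu.begin()) {                      // uses the default I2C pins (SDA/SCL)
        Serial.println("MPU6050 not found - check the wiring");
        while (true) { delay(10); }
      }
      mpu.setAccelerometerRange(MPU6050_RANGE_8_G);
      mpu.setGyroRange(MPU6050_RANGE_500_DEG);
    }

    void loop() {
      sensors_event_t accel, gyro, temp;
      mpu.getEvent(&accel, &gyro, &temp);      // read accelerometer, gyro and temperature
      Serial.print(accel.acceleration.x); Serial.print(" ");
      Serial.print(accel.acceleration.y); Serial.print(" ");
      Serial.println(accel.acceleration.z);
      delay(50);
    }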

The other libraries should be included with the Arduino/ESP32 installation; these control the WIFI connection and the OSC messages, which are what we will use to send data to Max/MSP.

After installing the libraries, copy and paste the code into your Arduino IDE.  The code that I used can be found here:

Changes that you need to make (the sketch after this list shows where these typically sit in the code):

·      The IP address to send to.  If you are on a Mac, the quickest way to find this is to hold alt on your keyboard and click the WIFI icon at the top right of your screen.

·      The login details of your WIFI: the SSID (the network name) and your password.

·      The local port is also important to note.  If you are using multiple ESP32s at once, give each one a different port number and make a note of them.
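Purely to show where those values live, here is a stripped-down sketch in the style of the linked code. It assumes the ESP32 WiFi library and the CNMAT OSC library; the SSID, password, IP address, ports and the /accel address are placeholders for illustration, so replace them with your own details and with whatever the full sketch actually uses.

    #include <WiFi.h>
    #include <WiFiUdp.h>
    #include <OSCMessage.h>                        // CNMAT OSC library

    const char* ssid     = "YourNetworkName";      // your WIFI SSID (network name)
    const char* password = "YourPassword";         // your WIFI password
    IPAddress outIp(192, 168, 1, 20);              // your computer's IP (alt-click the WIFI icon on a Mac)
    const unsigned int outPort   = 9000;           // port Max listens on (udpreceive)
    const unsigned int localPort = 8000;           // this board's port - give each ESP32 its own number

    WiFiUDP Udp;

    void setup() {
      Serial.begin(115200);
      WiFi.begin(ssid, password);
      while (WiFi.status() != WL_CONNECTED) { delay(500); Serial.print("."); }
      Serial.println();
      Serial.print("Connected. ESP32 IP address: ");
      Serial.println(WiFi.localIP());              // note this address - it goes into the Max patch
      Udp.begin(localPort);
    }

    void loop() {
      OSCMessage msg("/accel");                    // placeholder OSC address
      msg.add(0.0f); msg.add(0.0f); msg.add(0.0f); // in the full sketch these would be the MPU6050 readings
      Udp.beginPacket(outIp, outPort);
      msg.send(Udp);
      Udp.endPacket();
      msg.empty();
      delay(20);
    }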

Those should be all the changes you need.  Upload the code to your chip.  While the ESP32 is still connected, open up the Serial Monitor. It should hopefully say that you are connected correctly. It will also give you the IP address of your chip (this is different from your computer's IP address). You will need this IP address in Max/MSP.

The Max/MSP Patch

In Capturing Movement in Sound, I originally explored a separate gyroscope sensor rather than the MPU6050.  I switched to the MPU6050, which has both an accelerometer and a gyroscope, as I thought it would be best to have both functionalities.  It turned out that the accelerometer was the most useful data for my use, so that is what I used, but this patch includes both so you can explore for your own purposes.

I have attached a Max/MSP patch that visualises the data, here, as well as showing how to perform a simple smoothing of the data. The smoothing is very rough and ready, and it is signal based, so you will need to turn on the DSP to make it work.  In raw form, the data jumps about all over the place, so this just smooths it out to make it a little more usable.
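For what it's worth, the smoothing amounts to a running average that creeps towards each new value. This is not the patch's exact implementation, just the idea sketched in Arduino-style C++ for illustration.

    // One-pole (exponential) smoothing - the same basic idea the patch applies to each axis.
    float smoothed = 0.0f;
    const float alpha = 0.1f;    // smaller = smoother, but slower to follow the movement

    float smooth(float raw) {
      smoothed += alpha * (raw - smoothed);  // move a fraction of the way towards the new reading
      return smoothed;
    }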

When you open the patch, open the subpatch MPU6050_1. First, change the udpsend's IP address to the ESP32's IP, and the port number to the port that it is listening on. Then change the udpreceive to your laptop's IP address and the port to the one that the ESP32 is sending data to.

With those changes, toggle the stream with a 1. If audio is on, the ESP32 data should be showing in the main patch. Use the sends to send the data to whatever it is that you want to control.

Hopefully this post gives an idea of how to set up the ESP32. It is more involved than I remember it being, but this hopefully gives you some idea of how to do it. If you are lost or there is something unclear, send me a message or leave a comment and I will look to amend the post.

The GameTrak Controller

As I mentioned in the previous post for Capturing Movement in Sound, the GameTrak Controller is a device that I return to often. It uses USB, works consistently and provides an expressive range of movement using 360º control of two joysticks with tethers. If you are going to start using alternative devices for music, it gives you many options and is simple to start with.  You have each joystick's x and y controls, but then you can pull out the tether to provide an additional z axis. It is a great device with theatricality built into it. Everyone we showed the device to had a range of ideas for how it could be used.

Documentation by: Alastair Gray

The device was first introduced to me by SwanLorc as part of Ty Cerdd's CoDi: Electronic. They showed us the patch that I still use today, linked below. The opportunity explored 6 different pieces written for SwanLorc, and it was my first introduction to the possibilities of the device. Jenn Kirby, who was part of SwanLorc at that time, is someone to check out if you want to see new, engaging ways to perform electronic music.

Connecting the GameTrak Controller to Max/MSP

If this post does not cover what you need to know, please leave a comment and I will try to answer it. I am presuming some knowledge of Max/MSP. The goal of this section is to help you connect the controller to Max, and then I leave you to do whatever you would like with it.

Connecting the device is generally simple, or at least it was for my GameTrak controllers, and I have owned 3 to date. For other versions of the device, you may need to do a little bit of tinkering. Friends of mine have used the guidance from this site and a tiny bit of soldering to get theirs to work. It all depends on what version of the GameTrak Controller you are using.

When the device is plugged in, this patch receives the data in Max/MSP using the “hi” (human interface) object. This is the object that you use for connecting a variety of controllers to Max/MSP. Use the print object to see what data your device sends out and route it to whatever you want to control. In the patch above, this is hopefully done for you. I tend to run that as a separate patch, or else place it in a subpatcher and use sends to route the data where I want it to go.  It is important that the GameTrak is plugged in before you open the patch. The data that you get is from 0 to 4095 on each axis, and it is up to you how you want to map it. The data is also quite steady; there are not many erratic jumps or things to smooth out compared to other devices. It is as simple as that.
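Since the raw values arrive in the 0 to 4095 range, the only real work is scaling them into whatever range your synthesis needs, which in Max is a scale object (or the equivalent). Purely as an illustration, the same mapping written out in C++ looks like this; the 100-1000 Hz example range is just an arbitrary choice.

    // Map one GameTrak axis (0-4095) onto an arbitrary output range,
    // the same job a [scale 0 4095 low high] object does in Max.
    float mapAxis(int raw, float low, float high) {
      return low + (raw / 4095.0f) * (high - low);
    }

    // e.g. mapAxis(2048, 100.0f, 1000.0f) returns roughly 550 - a half-pulled tether
    // mapped onto a 100-1000 Hz frequency range.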

Documentation by: Alastair Gray

 

Capturing Mvt into Sound: Exploring the Interactions: Zones

I chose the GameTrak controller as the first device to explore with Jodi. It has more presence than the accelerometers, since the tethers provide a tactile sense. Its dependability also meant that I knew it would be simple to set up, which meant I could use part of the day to set up the ESP32 on the WIFI, which was more of an unknown.

The first patch that I prepared for Capturing Movement in Sound was called Zones. I wrote some JavaScript code in Max to generate cubes randomly placed within the imagined 3D space of the x, y and z co-ordinates of the controller. When a performer's hand passes through a zone, it causes a tone to be played.  The tone stops with a slight delay when their hand leaves the zone.
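The zone test itself is simple: it is just a point-inside-a-cube check on the controller's x, y and z values. I haven't reproduced the original JavaScript here, but as a rough sketch of the idea in C++ (illustrative only, with made-up names and a normalised 0-1 space):

    // Illustrative sketch of the Zones idea - the real version was JavaScript inside Max.
    // Each zone is a randomly placed cube; a tone sounds while the hand is inside it.
    #include <cstdlib>

    struct Zone { float x, y, z, size; };          // cube centre and edge length

    Zone randomZone() {
      auto r = []() { return (float)rand() / RAND_MAX; };
      return { r(), r(), r(), 0.2f };              // random centre in a 0-1 space, cube 0.2 wide
    }

    bool handInZone(const Zone& zn, float hx, float hy, float hz) {
      float h = zn.size * 0.5f;
      return hx > zn.x - h && hx < zn.x + h &&
             hy > zn.y - h && hy < zn.y + h &&
             hz > zn.z - h && hz < zn.z + h;       // true -> trigger the zone's tone
    }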

We started to generate different ways of performing with this device. Jodi wrote text instructions such as “Find a Silence”, “Find a low note, then create a rising melody” etc. These were nice ways to introduce someone into this interaction.

Documentation by: Alastair Gray

I see this patch as being useful for creating installation works. It is easy to pick up and easy to figure out, but the number of different instructions that we wrote for it shows that there are a lot of ways to play with it. We also gave the audience a chance to play with it, and they quickly picked it up, showing that it would work with the public. I would change the sonic output but probably keep a pitch element to it, as it provides a nice way to play musically with the sounds.

Additionally, we explored control over a granular synthesiser with the GameTrak controller. It worked well and there was a solid interaction, but it didn't feel as strong as the triggering of the Zones patch. An instrument with a constant sound has quite a different feel to it.  Movement changed the timbre more than the pitches.  This raises a question: is there enough musical movement to keep the soundworld interesting? We felt that there wasn't in the iteration that we tried. The Zones patch had a nice sense of space and pauses built into the interaction. The random element meant that the physical positions of the pauses would be different each time, which I found enjoyable to experience.

This is a quick overview and an example of one of the interactions that we created over the course of the workshops. Other resources from people using the GameTrak Controller can be found here, here and here.

Capturing movement in Sound R&D

In this post I will share my experiences with my recent Arts Council of Wales funded R&D: Capturing Movement in Sound. This is an overview of the types of devices that I used.  Over the next week, I will describe technical aspects for people interested in trying out similar ideas.

Capturing Movement in Sound continued my explorations from Circling Above. The goal was to try out a new collaboration along with new methods for gestural interaction. My interest came from a musical perspective but also as a maker of digital music devices. This was the first time I had created a device for someone else to use. I was working alongside dancer/choreographer Jodi-Ann Nicholson.

We structured the project over three weeks. The first week was for setup and introducing the devices to Jodi. The devices, for me, are experiential. I can describe what they do and what sounds they produce, but the interaction has a quality that you have to explore yourself. It is one of the things that really excites me about this technology. The second week was refining the instruments and interactions, then the third was working towards a private showing. During the process, we had visitors to gain other perspectives.  The instruments we planned to use were the GameTrak Controller, Xbox Kinect and accelerometers using ESP32 microcontrollers.  The audio and visuals were programmed in Max/MSP.

Documentation by: Alastair Gray

Overview of Instruments

The planned devices were the Xbox Kinect/webcam, GameTrak controller and an ESP32/Arduino based solution. Before the workshops, I designed interactions to try out with each of the instruments. I also pre-programmed the ESP32 controllers and soldered the accelerometer circuits. The goal was to have minimal programming during the workshop weeks. This meant that we could focus on the movement aspects of the project and not technical details.

 GameTrak Controller 

The GameTrak Controller is a device that I have used in many performances. It is an old PS2 controller that provides 360 degree tracking. They can normally be found on Ebay for around £20. It consists of two joysticks with tethers that stretch out of each to provide 360º control. There is also a foot-switch for additional control. The device connects via USB to your computer and from there you can map the data to whatever you would like.

The main advantage of the GameTrak Controller is that it is dependable. You plug it in and it tends to work each time with minimal fuss. Because of this, it is my backup device for any project that I work on. The negative is that it is not the nicest looking. It also has the tethers, which can restrict the movement of the performer. On the other hand, you could incorporate the restriction into the concept of the piece.

Documentation by: Alastair Gray

ESP32 + Accelerometer Control

The ESP32 is a microcontroller similar to an Arduino. The one that I chose was the FireBeetle ESP32 by DFRobot. Before the project, it was the device that I was the most excited about due to its WIFI and Bluetooth capability. My plan was to create a circuit with accelerometers and track the orientation of Jodi's gestures. They worked well during the R&D, as the hire space at Chapter Arts had private WIFI.

Compared to the GameTrak Controller, they need more technical knowledge, especially in connecting them to WIFI or Bluetooth. First, you will need to make the circuit yourself, which requires the ability to solder. Then you need to program them in a version of C++ and upload the code to the board. They can then send the data to whatever creative programming platform you use.

Despite the learning curve, there are lots of tutorials online. Following them, you can quickly build up the base of knowledge needed for a specific project. The benefit of learning to use them is that it feels like you can do anything with these devices. I love the fact that I am not dependent on any particular device or interaction; I can create a completely new device or interaction from scratch.  This is a strong reason why I would recommend exploring them, potentially in a simpler project than this one first.

Documentation by: Alastair Gray

Xbox Kinect/Webcam

One thing that I had to jettison before the project started was the Xbox Kinect. Since I last used the device, the operating system on my computer had been updated. I researched ways to make it work with macOS Ventura 13, but it was too steep a hill to climb and there were other interactions to explore. It is something that I will come back to. I love the way that the tracking from an Xbox Kinect is wireless. There is a sense of magic to it.

Researching for this part of the project, I did discover openFrameworks, which is a toolkit for C++. I am interested in exploring this for future projects, as it could be possible to code my own skeleton tracking device. This would be a way to future-proof my own projects. It is a downside of using other people's externals: if they choose to discontinue them, your piece can become out of date. Creating as much of your own material as possible means that you potentially avoid this.

My research also revealed that there are interesting developments in using AI tools for tracking. The only problem was that most of them were behind a paywall that was much too high for this project's budget. I do feel skeleton tracking is a technology that will grow in influence in the future.

Conclusion

This has been an overview of the types of devices that I used and why. In the next couple of posts, I will go into more detail about the GameTrak Controller and ESP32.  They will be more technical and will hopefully give people who are interested in using this technology a head start.  Then there will be a final post where I talk about the interactions and working alongside Jodi.

Circling Above

In this post, I will explore why I chose to work with gestural capture devices. The influence they impose upon my composition workflow has had a profound effect on my music writing. My explorations with gestural devices began after my Masters. I was composing for contemporary classical ensembles with electronics. Up to that point, I had always scored pieces for musicians to play alongside fixed media parts. I loved working in the studio creating intricate chains of sound. I loved creating a score which was the exact instruction to play my music.  Composing an electro-acoustic piece felt like working with the perfect combination of the humanistic qualities of acoustic performance and the seemingly limitless soundworld of electronics.

It’s an old score. Please don’t judge me for the roughness.


Over time, my frustrations grew with the presentation of the electronic parts. I attended electronic gigs and concerts where performers stared into laptop screens, disengaged from the audience. It was disconnecting and dull. I wanted to change how live electronics are performed. Additionally, fixed media imposes a rigid structure on performers in electro-acoustic music. The electronic part provides no freedom for the performer to deviate from the score: the tempo is rigid, so if a performer uses rubato for emotional effect, they drift out of sync. I hoped that by creating a performance focus for the electronics, I could address this issue at the same time.

My solution to my frustration was to use physical gesture capture in performance. Circling Above, for flute and electronics, was the first of these pieces. I used a first-generation Xbox Kinect feeding data into Max/MSP. The Max/MSP patch arranges the samples into groups based on the sample length. The performer’s right hand selects which group to pick a sample from. The height of the right hand also controls the sample's playback speed. The higher the hand, the faster the playback. If the hand is at chest height, it slows down the playback.  A low hand position plays the sample backwards. The different types of playback generate a multitude of sounds. When the performer places their right hand to the top left of the device’s vision, the patch selects a short sample size and plays it at high speed. A short staccato texture is created. Moving a hand along the centre of the performer’s chest generates layered drones. The instrument controls the texture and shape of the electronics, but not the exact material.
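Purely to illustrate the shape of that mapping (the original was built in Max/MSP, and the names here are made up for the example), the playback-speed rule can be reduced to a couple of lines: the further above chest height the hand, the faster the forward playback; near chest height the rate approaches zero, so playback is very slow; below chest height the sign flips and the sample plays backwards.

    // Hypothetical reduction of the hand-height to playback-speed mapping described above.
    // height and chestHeight are in the same units (for example, normalised Kinect y values).
    float playbackRate(float height, float chestHeight, float sensitivity) {
      // positive above the chest = forwards, near zero at the chest = very slow,
      // negative below the chest = backwards
      return (height - chestHeight) * sensitivity;
    }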

One-to-one mapping of parameters in electronic instruments can get boring quickly. If you were to generate a sine wave whose pitch is controlled by the height of the hand, it can be easily figured out by the audience, and the performance loses a sense of mystery. Less transparent interactions create more mystique. The audience can sit wondering how the sounds are created; they know there is a connection, but can't figure it out. An extra element is added to the performance. Circling Above was the first piece where I loosened control over the exact sonic output of a composition. The electronics had random choice programmed into the instrument. Still thinking in my usual modes of composition, I scored the flute part traditionally. In a workshop with Rarescale, Carla Rees felt restricted by the score: the electronics were free, but not the performer. I redesigned the score by deconstructing the material into individual phrases.  The performer plays them in any order while listening and reacting to the electronics. The result is a much better dialogue between the two parts.

If you are interested in learning more about working with gestural devices, search through the NIME (New Interfaces for Musical Expression) archives here. There you will find a wealth of papers on many aspects of electronic instrument creation. If this is a field you want to study, their mailing list is a useful resource for keeping track of PhD studentships and research opportunities.

I am planning to write more blog posts that provide an insight into my explorations of these devices and how they influence my creative process. Designing an instrument is an interesting way to think about composition: the score begins with the instrument's creation, instead of at the page. My approach to subsequent compositions and my compositional process changed radically due to the experience of Circling Above. A whole new world of possibilities opened up. I discovered that it is not difficult to create gesture-tracking electronic instruments. With a bit of googling, you can be up and running in an afternoon. It is where I began composing with performer freedom in mind, and where I started considering a whole new language for exploring musical ideas.

Text Scores

I have performed many text-based scores over the past year.  Like the open form that Fluxus provides, which I talked about in the last post, text scores offer that same freedom of form and content but with a greater focus on musical output.  I am going to reflect on my past year of text scores, which will hopefully provide a basic overview of some of the more successful pieces that I have engaged with.

 

The advantage of the text score that most appeals to me personally is the fact that, as a composer, you are presented with a completely blank canvas.  There are no restrictions on instrumentation, length, or action from one realisation to the next.  There is a complete freedom to the possibilities of performance.  In The Medium is the Massage, Marshall McLuhan proposed that the tools that you use end up using you.  There is definitely an aspect of this with music notation.  The five lines immediately imply pitch and certain ways of approaching music.  The text score ignores these.  You are free to do anything.  Composers have used this freedom in interesting ways.  Looking at Fluxus scores, another text-based art medium, you can see this freedom taken to the extreme.

 

In 1968, Stockhausen composed Aus den sieben Tagen (From the Seven Days), a collection of fifteen compositions for a variety of different ensembles.  The two that I have previously performed are Right Durations and Set Sail for the Sun.  Both are incredibly easy to perform and create interesting sonic results.


RICHTIGE DAUERN

(Right Durations)

 

Play a sound

Play it for so long

until you feel

that you should stop

 

Again play a sound

Play it for so long

until you feel

that you should stop

 

and so on

 

Stop

when you feel

that you should stop

 

But whether you play or stop:

keep listening to the others

 

At best play

when people are listening

 

Do not rehearse


One of the striking features of this score is that you are instructed not to rehearse the piece.  There is a focus upon one moment in time.  Stockhausen called the pieces from this collection “intuitive music”: the idea that the intuition of the performers is used rather than the intellect.  I have definitely found while performing this piece that you lose yourself in the sound-world.  I have used my keys as my sound generator in one performance, and as the piece goes on, I find myself really exploring every sonic possibility that I can.  It is a really useful exercise for getting the absolute most out of any sounding object.  The instruction 'keep listening to the others' is what I really love about this piece.  The act of paying attention to what the performer hears is at the centre of the performance.

 

This brings me to Pauline Oliveros, whose text scores demonstrate her Deep Listening philosophy.  Similar to John Cage's idea that we should listen to all of the sounds around us, Pauline Oliveros promoted an active listening to everything around us.  I discovered her electronic pieces years ago, but it is her text scores with which I have engaged the most. The Riot Ensemble performed a selection of text scores at Cardiff University as part of a tour in 2017. It was an incredible mixture of pieces, all of which were created from text-based instruction.

 

The Oliveros piece that I have become most familiar with is The Tuning Meditation.


The Tuning Meditation (1971)

 

Begin by playing a pitch that you hear in your imagination.  After contributing your pitch, listen for another player's pitch and tune in unison to the pitch as exactly as possible.  Listen again and play a pitch that no one else is playing.  The duration of the pitches is determined by the duration of a comfortable breath or bow.  The dynamic level is soft throughout the piece.  Brass players use mutes.

 

Continue by alternating between the three options described below:

 

·      Playing a new pitch of your own that no one else is playing

·      Just listening

·      Tuning in unison to the pitch of another player

 

Introduce new pitches at will and tune to as many different players as are present.  Although the dynamic level is soft make your tones available to others.

 

Play warmly with variations in tone quality. 


In newCELF events, we have included the audience in our renditions, and it has always created a fantastic communal experience.  Check out the recording from Unaccompanied here.  The simplicity of the instructions means that it is easy to explain, doesn't require any prior musical experience, and is not intimidating to take part in.  It creates a communal singing atmosphere with a powerful sound-world.  We have found that even people who are adamant that they don't sing end up giving it a go after a few minutes of the piece.

 

The freedom of text scores is something that I love to explore.  It stretches the mind in terms of how to organise material within a piece and how performers react.  They can be used for any ensemble and create pieces that unfold differently every time they are performed.  I believe that every composer would gain a lot from performing and trying to write text scores at some point in their compositional explorations.

 

If you are interested in reading further, check out Word Events: Perspectives on Verbal Notation by James Saunders and John Lely.  It is a valuable collection of interesting materials and scores.  There is a strong examination of what makes up a text score, with analysis and examples of different types of scores.  It presents a useful collection of pieces that are not difficult to perform and are worthwhile to experience.

 

FLUXUS


I am in the beer garden of the Roath Park Pub wearing a suit that does not fit.  The trousers will not button, and as I reach down to pick items out of a baby bath, the sleeves rise high up my arms.  A pack of playing cards, a book, a box of matches, a toy train (including track), a pair of glasses, the most beautiful cracker and a cuddly toy are bobbing in the water.  The cuddly toy, face down, looks like the victim of a hit from a Mafioso kingpin.  I pull out the book, a copy of Moby Dick with pages torn out, and wring it like a sponge.  It is one of the more surreal visual images that has remained with me, but in the context of the evening, it was par for the course.

A concert has just finished featuring art pieces written by a myriad of different people who come together under the banner of Fluxus.  The event was organised by Cardiff-based composer/artist Dan Wyn Jones and myself, with the aid of performers Rosey Brown, Ethan Davies, Lauren Heckler and Ingrid Lagouanelle.  We are pleased that we managed to fill the back room of the pub (which had graciously allowed us to use the space for free) with people interested in the obscure and absurd performances that are characteristic of Fluxus artworks.

Fluxus is an international group of artists that originated in the early 1960s and had its main period of output from then until the late 70s.  Students from John Cage's experimental music course in New York began to organise concerts or happenings where they performed event scores.  These scores take the form of short texts that describe actions that a performer realises on stage, such as:  

Cheers

Conduct a large crowd of people to the house of a stranger.  Knock on the door.  When someone opens the door, the crowd applauds and cheers vigorously.  All depart silently.

Ken Friedman - 1965

Since its conception, a multitude of scores have been created, ranging from those that you would be insane to realise, such as:

Music for a Revolution

Scoop out one of your eyes five years from now and do the same with the other eye five years later. 

Takehisa Kosugi - date unknown

to simple meditative actions:  

Lighting Piece

Light a match and watch it till it goes out. 

Yoko Ono - 1962

When this is performed, the combination of stillness and tension as the flame gets closer to the performer's fingers is quite mesmerising.  It shines a light on a simple action that has a subtle beauty to it.  This is the heart of the intention of Fluxus: shining a spotlight on an aspect of the everyday that you would normally miss.  The origins can be found in the artistic technique of framing utilised by their teacher, John Cage, who was inspired by Marcel Duchamp.  Following the example of Marcel Duchamp's placement of a urinal in a gallery and framing it as art, John Cage framed the subtle noise of the concert hall in 4'33".  In turn, the Fluxus artists took everyday actions and framed them on stage as performances: the everyday and mundane became art.

George Maciunas is credited with naming the Fluxus movement.  Maciunas saw Fluxus as an 'anti-art' movement reacting to the elitism of the art world, as demonstrated in the Fluxus manifesto.  The aim was to create art for everybody, in which everybody could be included in the creation and performance.  The underlying principles are illustrated in our poster:

1.     Fluxus is an attitude.  It is not a movement or a style. 

2.     Fluxus is intermedia.  Fluxus creators like to see what happens when different media intersect.  They use found and everyday objects, sounds, images, and texts to create new combinations of objects, sounds, images, and texts.

3.     Fluxus works are simple.  The art is small, the texts are short and the performances are brief. 

4.     Fluxus is fun.  Humour has always been an important element in Fluxus. 

Fluxus proved to be incredibly influential in the art world.  The term 'conceptual art' was coined by Fluxus member Henry Flynt to describe his pieces; the prioritising of the idea over the aesthetic has caused many a heated argument since its conception.  Video art stemmed from work created by Nam June Paik, which was relevant at the time of television but has become increasingly so with the saturation of video in our everyday lives and the ease with which videos can be created.  Examining the influence of even just these two examples reveals an incredibly rich vein of artistic expression through the art history of the last half-century.

For our recent concert, using the freely available Fluxus workbook, we organised a programme of works that included six new scores, one from each of the performers.  With an eye to keeping the Fluxus tradition of the everyday, we decided to have no applause at the beginning of the concert and intentionally did not signal the end.  The concert began with Ethan Davies' performance of Lee Heflin's First Performance.

First Performance

Performer enters, bows, then exits.  This is executed once for every member of the audience. 

Lee Heflin - date unknown

The performance began as soon as the audience entered the performance space.  Once the audience had taken their seats, we collected Ethan and as an ensemble performed Shuffle by Alison Knowles.

Shuffle

The performer or performers shuffle into the performance area and away from it, above, behind, around or through the audience.  They perform as a group or solo: but quietly.

Alison Knowles - 1961

There were no formal entrances or applause.  The next piece on the programme served as a continuing thread throughout the concert:

Variation #1 on Proposition

Make a soup.

Alison Knowles - 1964

This piece occasionally interrupted events as a new stage in the soup-making process was needed, culminating in the final piece:

Supper (Arr. for 4 performers)

The curtain is raised.  A large table set with food, drink, flowers and candles is displayed on stage.  10 well-dressed performers carrying instruments enter, bow, and seat themselves behind the table.  They lay down their instruments.  2 waiters begin to serve food and wine.  Performers begin to eat, drink and talk.  After a few minutes, the audience can also be offered food and drink.

Ben Vautier - 1965

As this was the last piece, we ignored the audience and sat down to soup and wine, the final performance blending into the time outside the concert with no official end point.  The everyday actions elided into the audience's usual day.

One of the most surprising and positive outcomes was the role of audience participation; two words that usually strike dread into attendees of performance art, and a fear that I have personally felt at performances I have attended.  Over time, I have grown to enjoy the inclusion of audience participation, as that fear is a strong emotion that can create striking experiences.  By attending a live event you put yourself physically in the same space as a number of different people and the performers on the stage.  There is no longer the safety of a screen dividing you from the performers and their actions.  Surely a strong emotional response and the frisson of live rather than fixed performance are factors in why we attend such events.  In our concert, the unexpected outcome of audience participation provided something that made the evening more special.  The audience was included in this particular concert through interspersed performances of Ken Friedman's event score, Fluxus Instant Theater, throughout the evening:

Fluxus Instant Theater

Rescore Fluxus events for performance by the audience.  A conductor may conduct the audience-performers.

Ken Friedman - 1966

As Fluxus scores are generally practical to realise, we called audience members up to the stage, having taken their names and put them in a top hat as they entered, and then provided them with a card with a score upon it.  After a hesitant start, with the first person refusing to come to the stage, we soon found willing participants who, after the initial disappointment of being chosen to leave the safety of the group, enthusiastically performed each of the scores.  Interpretation is integral to the realisation of these pieces, so it was interesting to see how each person interpreted the scores on the fly.  The most striking result of audience interaction came from Daniel Wyn Jones' score:

Appreciating Literature

Remove a page from a great literary work.  Read out one full sentence.  Give page to an audience member.  Repeat until every member of the audience has a page.

To be performed by several performers simultaneously.

Daniel Wyn Jones - 2018

The idea of this piece centres on the sound of overlapping voices, as each audience member receives an artefact from the performance in the form of a page from a book.  An audible gasp was heard in reaction to the first rip of a page.  A line was read out and the page was handed to the first audience member.  One after another, audience members started to unexpectedly join the recital.  They cyclically read their page of Moby Dick while the performers continued to rip pages from the main copy, reciting a single line and providing another audience member with material.  The overlapping sounds of speech slowly grew and grew until the entire audience held a page in their hands.  A cacophonous roar of the words of Herman Melville filled the room, the lines clashing with each other.  The performers' job was done, but the piece continued.  The audience had taken control as we sat down and listened to a piece that was being performed by the audience to the performers -- a fantastic experience.  The piece was only starting to die down when setup for the next piece began.

There is an obvious connection between Fluxus and the Scratch Orchestra, which I intend to explore later in a post connected to the recently formed Cardiff New Music Collective.  Fluxus works do not necessarily focus upon sound, but they could be applied to compositions in interesting ways.  In our concert, we were surprised by the sonic result of Octet for Winds:

Octet for Winds (Arr. for Quartet)

Equal number of performers seat themselves opposite each other. A large pan of water is placed between the two groups and a toy sailboat is placed on the water. Performers blow their wind instruments at the sail of the boat pushing it to the opposing group. Both groups try to blow the boat away from themselves and toward the other group. If possible, all performers should play some popular tune while blowing on the sail. Piece ends when the boat reaches one end or the other of the pan.

George Brecht - 1964


 

Instead of the original octet of wind instruments, we arranged it for a quartet of flute, clarinet, trumpet and French horn.  The interesting aspect of this piece is that it creates a strange, organised free improvisation session, with structure and tension built into the performers' situation.  The cacophonous roar of the ensemble's playing was more interesting to listen to than I predicted, the chaotic nature of the improvisation contrasting with the slow bobbing motion of the paper boat's journey.  While playing in the piece, I became swept up in its competitive nature, even though the trumpet only minimally affected the movement of the boat.  I did notice that when we were in the lead, I started to make that final push in order to ensure victory -- a ridiculous idea but very much in the spirit of Fluxus.

From the programming of the concert, to the expected and unexpected outcomes of the performances, to the conversations that I have had with many people regarding these pieces, this event was an incredibly enriching and thought-provoking experience.  In a world where most performance is experienced through a screen, embracing the performative in a real space with real people doing unexpected and absurd things is a less usual and perhaps uncomfortable prospect, but also a rewarding one.  Long may the Fluxus spirit continue.


* Photographs courtesy of Alastair Grey

Quartweet

I have had the privilege of taking part in a workshop with the Signum Quartet as part of their Quartweet project.  Find the score here, or check out their Twitter or mine.