Arnolfini - est 1961


Live art, machine learning and artificial intelligence combine for a unique durational performance.

Kawasaki Plant explores the use of AI technologies in performance. The work emerged from an initial collaboration between Bean and sound engineer, modular synthesist & audio coder Cherif Hashizume, in which the artists designed & built a computer program that enabled Bean to live-edit pop songs with their body movements (a body).

For Kawasaki Plant, the artist will design & build new audio-visual programs, based on machine learning principles primarily developed for advanced artificial intelligence, that are responsive to their body, creating a soundscape & 360° projection environment. The work is focused on how the live body interacts with technology, and how the artist can use technology to become bigger than their physical body. Bodily variables of heat, sweat, speed & heartbeat will be used to create & control the audio-visuals, while multiple sensors will track and analyse the surroundings, so the program can react not only to the performer but also to the environment itself: temperature, humidity, brightness etc.

The artist is interested in power structures and the relationships between machine & creator in relation to historic & contemporary systems of patriarchy & capitalism. Kawasaki Plant was the site of the first robot to kill a human.

Suitable for all ages; parental guidance advised.

This is a durational work – audiences are free to come and go as they please.

Presented as part of Submerge, Bristol’s International Digital Arts Festival.