MENTAL DANCE

Collaborators:
Carol Brown, Choreography
Marta Garrido, Neuroscientist
Jordine Cornish & Luigi Vescio, Dancers
Austin Haynes, Countertenor


Project Concept

The project began in 2019 as an open-ended inquiry into how science can inform, and be integrated into, a creative process. Through interviews and visits to the neuropsychiatry lab, thematic threads were teased out for further creative exploration. These included current research into neurodiverse cognition in Autism Spectrum Disorder and schizophrenia, and the research technologies used in the lab, including EEG, MEG and fMRI.

In contrast to the diagnostic language surrounding neurodiversity, we looked at the human impact of living with mental illness, such as that experienced by dancers Vaslav Nijinsky and Lucia Joyce. Although we started in the lab, we wanted to end up inside the body with all its mechanosensory neurons and limbic emotions.


The neuroscientific research into human cognition served as a conceptual reference for an interactive dance performance using wearable sensors. We were interested in the choreo-musical and cognitive aspects of the relationship between movement and sound, where movement creates and/or changes the sound, and in how this resonated with neuroscientific concepts such as predictive coding and the Bayesian brain.

Repeated lockdowns in Melbourne meant that we could no longer collaborate in the same physical space, nor use wearable sensors such as accelerometers, gyroscopes or biophysical sensors, or other devices such as infrared cameras, as the dancers did not have access to the required computer programmes or specialist hardware. To enable continued development, we had to harness ubiquitous technology that everyone could access. All collaborators were familiar with video-conferencing apps such as Zoom, so using the Zoom video feed for movement tracking was identified as the easiest way to build the interactive system. The criterion was not accuracy, but accessibility and ease of use.


Using MediaPipe pose estimation technology to track dancers’ movements from webcam feeds, we enabled telematic rehearsals and performance of the work on Zoom, where dancers in their own home environments sculpted sound in real time. Constraints such as forced isolation, lack of access to technology and space to move were embraced to create a new kind of collaborative performance in which the screen becomes both the stage and the interface between movement and sound.
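The movement-to-sound mapping described above could be sketched as follows. MediaPipe Pose reports body landmarks as coordinates normalised to [0, 1] within the video frame (with y increasing downwards); the functions below are hypothetical illustrations of how such landmarks might drive sound parameters, not the mappings used in the actual performance:

```python
# Illustrative sketch: mapping normalised pose landmarks to sound parameters.
# MediaPipe Pose yields landmarks with x, y in [0, 1] relative to the frame,
# where y = 0 is the top of the image. The mappings below (wrist height ->
# pitch, arm spread -> volume) are invented examples, not the project's own.

def wrist_to_pitch(wrist_y, low_hz=110.0, high_hz=880.0):
    """Map normalised wrist height to a frequency: the higher the hand,
    the higher the pitch. Clamps input to the valid [0, 1] range."""
    height = 1.0 - min(max(wrist_y, 0.0), 1.0)  # invert: 0 = bottom, 1 = top
    return low_hz + height * (high_hz - low_hz)

def spread_to_volume(left_wrist_x, right_wrist_x):
    """Map the horizontal distance between the wrists to a volume level
    in [0, 1]: arms wide apart means louder sound."""
    return min(abs(right_wrist_x - left_wrist_x), 1.0)

# In a live setting these functions would be called once per video frame,
# e.g. on landmarks from mediapipe's Pose solution, and the resulting
# values sent to a synthesis engine.
```

In a real pipeline, each webcam frame would be passed through the pose estimator, the chosen landmarks extracted, and the resulting parameters forwarded to the audio engine, so that the dancer's movement continuously shapes the sound.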