Maria Kuptsova

INSIDE OUT

Inside Out is a music research project based on the synthesis of the performing art of academically trained musicians with the study of the biological impulses of the human body in the process of playing music. The project investigates how music interacts with musicians, affecting their emotions, nervous system, physiology and biochemistry. Are these biorhythms correlated with the musical composition in a synchronized sequence, or can we discover unexpected dissonances?

 

Various sensors are connected to the musicians during the performance, and data is also tracked from the musical instruments at the moment of play. Information about heart rate, frequency of mechanical movement, temperature and electrical activity of the brain is fed into a microcomputer, where a program converts the data into a visual component. In this form the information is transmitted to the projection screen presented to the viewer. During the performance, it can be visually observed how the musicians' physiology changes as they play their instruments, and how the overall musical composition affects them. The sensor data literally influences the colour of the projected images and the speed of their alteration. This is a unique phenomenon, because the image differs at every moment of play, based on the specific emotional and physiological condition of the performing musicians. Thus, the visualization of each performance has its own unique ornamentation. The viewers can observe how data from different musicians during a performance is decoded into a complex visual composition that reshapes their expectations.
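
As an illustration only, a mapping of this kind can be sketched in a few lines of Python; the software used in the performance is not reproduced with this text, so all ranges, names and weightings below are hypothetical:

```python
import colorsys

# Hypothetical operating ranges; the performance system's real
# calibration is not described in the text.
HR_RANGE = (50.0, 160.0)     # heart rate, beats per minute
TEMP_RANGE = (35.5, 38.0)    # skin temperature, degrees Celsius
EEG_RANGE = (0.0, 100.0)     # EEG amplitude, arbitrary units

def normalize(value, lo, hi):
    """Clamp a sensor reading into [0, 1] relative to its range."""
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)

def sensors_to_visuals(heart_rate, temperature, eeg_amplitude):
    """Map one frame of readings to an RGB colour and an animation speed."""
    hr = normalize(heart_rate, *HR_RANGE)
    temp = normalize(temperature, *TEMP_RANGE)
    eeg = normalize(eeg_amplitude, *EEG_RANGE)
    # Heart rate drives hue, temperature drives saturation,
    # brain activity drives brightness and the speed of alteration.
    rgb = colorsys.hsv_to_rgb(hr, 0.4 + 0.6 * temp, 0.3 + 0.7 * eeg)
    speed = 0.5 + 2.0 * eeg  # multiplier for how fast the image changes
    return rgb, speed

# A calm passage versus an agitated one:
print(sensors_to_visuals(62, 36.4, 15))
print(sensors_to_visuals(130, 37.2, 80))
```

In such a scheme a rising pulse shifts the hue, body temperature saturates the palette, and brain activity brightens the image and accelerates its alteration.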


The InsideOut project was initiated in 2020 and has had several iterations. In the first stage, the project participated in the international interdisciplinary art competition Goodmesh 2021, where it was among the top five works. The premiere took place in St. Petersburg in October 2021. The second iteration, which focused on the possibility of interaction between neural networks and brain-wave activity data, premiered in February 2022 at the Ground Solyanka gallery in Moscow and in March of the same year at the Ekaterininsky Assembly cultural center in St. Petersburg.

 
The second iteration of the project introduces three digital algorithms into the creation of the visual image: a pre-trained StyleGAN2-ADA generative adversarial network model2, an algorithm that synchronizes the sequence of visuals the GAN generates to the music, and an adaptive algorithm that reads real-time data from the neurosensors, classifies it, and translates it into a reactive system of patterns, colour codes and noise dissonances that change the pre-trained imagery in real time.
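
A minimal sketch of the third, adaptive component might look like the following, assuming a StyleGAN2-style latent vector of 512 dimensions; the state labels, thresholds and modulation values are invented for illustration, and the actual code (see footnote 2) is not reproduced here:

```python
import numpy as np

LATENT_DIM = 512  # dimensionality of a StyleGAN2 latent vector

def classify_state(eeg_window):
    """Crude illustrative classifier: label a window of EEG samples
    by its signal energy. Thresholds are invented for this sketch."""
    energy = float(np.mean(np.square(eeg_window)))
    if energy < 10.0:
        return "calm"
    if energy < 50.0:
        return "focused"
    return "agitated"

# Per-state modulation: how strongly noise perturbs the latent vector
# and how far the colour code shifts (hypothetical values).
STATE_PARAMS = {
    "calm":     {"noise_scale": 0.05, "colour_shift": 0.0},
    "focused":  {"noise_scale": 0.20, "colour_shift": 0.3},
    "agitated": {"noise_scale": 0.60, "colour_shift": 0.8},
}

def react_to_sensors(latent, eeg_window, rng):
    """Perturb the current GAN latent according to the classified
    neurosensor state, so the pre-trained image is disturbed more
    as the performer's neural activity rises."""
    state = classify_state(eeg_window)
    params = STATE_PARAMS[state]
    noise = rng.standard_normal(LATENT_DIM) * params["noise_scale"]
    return latent + noise, params["colour_shift"], state

rng = np.random.default_rng(0)
latent = rng.standard_normal(LATENT_DIM)
eeg_window = rng.normal(0.0, 8.0, size=256)  # simulated EEG samples
latent, colour_shift, state = react_to_sensors(latent, eeg_window, rng)
print(state, colour_shift)
```

The design intent is simply that the calmer the performer, the more stable the pre-trained image remains, while rising neural activity injects noise and colour dissonance into it.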


To create a visual narrative we use large datasets of artistic works of different periods and styles, correlating them with the music of the same epochs. Music of the early 20th century can be seen through the works of Expressionism, Rayonism and Pointillism. Musical compositions of the mid-20th century are expressed with the works of the Futurists, the Post-Impressionists and the Surrealists. The second half of the century reveals abstraction and surrealism, reflecting on the works of Mark Rothko and Salvador Dalí. The end of the century presents interpretations of Pop Art, Minimalism, Feminism and Conceptual Art, Neo-Expressionism and Arte Povera, revealing interpretations of Jean-Michel Basquiat, Roy Lichtenstein, Andy Warhol, and many others.
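
As a sketch of how such an epoch-to-style pairing might be organized, assuming a WikiArt-style catalogue of records with 'path' and 'style' fields (the helper and the catalogue format are hypothetical; the project's actual dataset pipeline is not published here):

```python
# Musical epochs paired with the art styles whose works would form the
# training subset for that part of the performance; names loosely follow
# WikiArt's taxonomy and the pairing described above.
EPOCH_STYLES = {
    "early-20th": ["Expressionism", "Rayonism", "Pointillism"],
    "mid-20th": ["Futurism", "Post-Impressionism", "Surrealism"],
    "late-20th": ["Abstract Expressionism", "Surrealism"],
    "end-of-century": ["Pop Art", "Minimalism", "Conceptual Art",
                       "Neo-Expressionism", "Arte Povera"],
}

def select_training_images(catalogue, epoch):
    """Filter catalogue records like {'path': ..., 'style': ...} down to
    the styles paired with the given musical epoch."""
    wanted = set(EPOCH_STYLES[epoch])
    return [record["path"] for record in catalogue if record["style"] in wanted]

catalogue = [
    {"path": "img/expressionist_work.jpg", "style": "Expressionism"},
    {"path": "img/pop_art_work.jpg", "style": "Pop Art"},
]
print(select_training_images(catalogue, "early-20th"))
```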

 

The music programme is based on the 20th century, including pieces by Kluzner, Ysaÿe, Pärt, Desyatnikov, Sollima, Bartók and Yevlakhov. Every work in the programme has internal emotional contrasts. Such contrasts are always difficult to perform, because the musician needs to switch quickly from one state to another. The piece played is unique each time, revealing the individuality and the community of both musicians and listeners. During a performance we can watch live how the emotional background of the musicians changes as they play their instruments. The emotions of each musician influence the visuals, creating an overall visual composition. The musical combinations create visual coherence.

 

1 Alakus, Talha; Gonen, Murat; Turkoglu, Ibrahim (2020), "Database for Emotion Recognition System Based on EEG Signals and Various Computer Games - GAMEEMO", Mendeley Data, V1, doi: 10.17632/b3pn4kwpmn.

2 Original code by Tatyana Zobnina, based on Lucid Sonic Dreams and StyleGAN by NVIDIA Labs.

https://www.wikiart.org/


InsideOut 1.0:

Performers: Natalia Uchitel (piano), Dmitry Stopichev (violin), Ilya Izmailov (cello).

Project team: Aizek, Marina Muzyka, Ilja Domnins, Egor Zvezdin, Maria Kuptsova

Producers: Artmix.me, with the support of ITMO University


InsideOut 2.0:
Performers: Natalia Uchitel, Ilya Izmaylov, Anita Ozheshkovskaya, Dmitry Stopichev
Project team: Morgan Korolkov, Tatiana Zobnina, Marina Muzyka, Ilya Domnins, Egor Zvezdin, Maria Kuptsova
Producer: ArtMix (Egor Zvezdin and Margo Bor)