
Yulia Vergazova, Nikolay Ulyanov
Svan/Nectar

Moscow, 21.12.2021—22.05.2022

Drawing on scientific research into how plants perceive audio and visual stimuli, the authors created a film intended primarily for the 'hearing' and 'vision' of plants.

Plants are known to respond to the sounds of pollinating insects by increasing the concentration of sugar in their nectar. The film reproduces the sound range of a single hovering honey bee, with a peak frequency of 200–500 Hz, the band to which plants are most responsive.
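
By way of illustration, a buzz in this band can be synthesized in a few lines of Python. The following is a minimal sketch that keeps the fundamental inside the 200–500 Hz range named above; the drift rate, overtones and flutter are assumptions, not the artists' actual sound design.

import numpy as np
from scipy.io import wavfile

SR = 44100
DURATION = 5.0
t = np.linspace(0, DURATION, int(SR * DURATION), endpoint=False)

# Let the fundamental wander across the 200-500 Hz band plants respond to
# (the band is from the text; the drift shape is an assumption).
f0 = 350 + 150 * np.sin(2 * np.pi * 0.3 * t)
phase = 2 * np.pi * np.cumsum(f0) / SR  # integrate frequency into phase

buzz = np.sin(phase)
buzz += 0.4 * np.sin(2 * phase) + 0.2 * np.sin(3 * phase)  # overtones
buzz *= 0.6 + 0.4 * np.sin(2 * np.pi * 7 * t)              # wing-beat flutter
buzz /= np.abs(buzz).max()

wavfile.write('bee_buzz.wav', SR, (buzz * 32767).astype(np.int16))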

Caucasian honey bees are prized by beekeepers all over the world, so as the soundtrack for the film the authors chose three microtonal Svan songs: Elia Lrde, Kriste Agsdga and Dale Kojas. The first is a hymn sung on non-lexical syllables and vowels. The second is an Easter hymn. The third is a ballad about the fatal golden-haired goddess Dal, who lives in the highlands of Georgia (mainly in Svaneti). The features of these songs (three voices, four verses, an algorithmically built grid of notes derived from the original chords) are reproduced and played on samples of buzzing and pollination sounds.
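
The press text does not publish the transcriptions, but the construction it describes, a note grid rendered on pitched buzz samples, could look roughly like the sketch below. The MIDI chord pitches and the sample's reference pitch are placeholders, not the actual Svan material; the input file is the buzz from the previous sketch.

import librosa
import numpy as np
import soundfile as sf

# Placeholder note grid: three-voice chords over four 'verses'
# (the real Svan chords are not given in the text).
chords = [[55, 59, 62], [53, 57, 60], [55, 60, 64], [52, 57, 62]]
REF_MIDI = 57  # assume the raw buzz sample sits near A3

buzz, sr = librosa.load('bee_buzz.wav', sr=None, mono=True)

verses = []
for chord in chords:
    mix = np.zeros_like(buzz)
    for midi in chord:
        # Retune the buzz sample to each voice of the chord
        mix += librosa.effects.pitch_shift(buzz, sr=sr, n_steps=midi - REF_MIDI)
    verses.append(mix / len(chord))

sf.write('svan_buzz.wav', np.concatenate(verses), sr)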

The video sequence, generated in accordance with the music, consists of output from the BigGAN neural network, the first neural network to generate images across a wide variety of domains, conditioned on 1,000 classes (the categories of the ImageNet dataset). The authors selected classes related to bees and the plants they pollinate, nine in total: a bee, chamomile, yellow lady's slipper, rosehip, rapeseed, orange, lemon, strawberry and cucumber.
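
Class-conditional generation of this kind can be reproduced with the open-source pytorch-pretrained-biggan package. The sketch below is an assumption about tooling rather than the authors' actual pipeline, and it maps the nine motifs onto their nearest ImageNet labels ('daisy' standing in for chamomile, 'rose hip' for rosehip).

import torch
from pytorch_pretrained_biggan import (BigGAN, one_hot_from_names,
                                       truncated_noise_sample,
                                       convert_to_images)

# The nine motifs, approximated by the closest ImageNet class names
names = ['bee', 'daisy', "yellow lady's slipper", 'rose hip', 'rapeseed',
         'orange', 'lemon', 'strawberry', 'cucumber']

model = BigGAN.from_pretrained('biggan-deep-256')
truncation = 0.4

class_vec = torch.from_numpy(one_hot_from_names(names, batch_size=len(names)))
noise_vec = torch.from_numpy(
    truncated_noise_sample(truncation=truncation, batch_size=len(names)))

with torch.no_grad():
    out = model(noise_vec, class_vec, truncation)

for name, img in zip(names, convert_to_images(out)):
    img.save(name.replace(' ', '_').replace("'", '') + '.png')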

To adapt the image to the 'vision' of plants, the artists used the principle of optical flow, a computer-vision method for estimating motion between video frames. The results resemble the colour spectrum of a photographic lamp, and the principle of such vision is a speculative version of a bee's sight: for example, only areas in motion are visible, resembling temperature maps. Based on video documentation of bees pollinating plants, the authors computed the optical flow, compiled a dataset of 330 images and trained the pix2pix neural network, which reinterpreted the video originally generated by BigGAN and brought it as close as possible to a form suitable for plant cinema.
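
A standard way to realise the optical-flow stage is OpenCV's dense Farneback algorithm. In the sketch below, motion direction becomes hue and speed becomes brightness, so only moving areas remain visible, and each source frame is written next to its flow rendering in the paired format pix2pix training expects; the input file name is hypothetical.

import os
import cv2
import numpy as np

os.makedirs('dataset', exist_ok=True)
cap = cv2.VideoCapture('bees_pollinating.mp4')  # hypothetical footage
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])

    hsv = np.zeros_like(frame)
    hsv[..., 0] = ang * 180 / np.pi / 2  # motion direction -> hue
    hsv[..., 1] = 255
    hsv[..., 2] = cv2.normalize(mag, None, 0, 255, cv2.NORM_MINMAX)  # speed -> value
    vis = cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)

    # pix2pix trains on paired images: source frame | flow rendering
    cv2.imwrite(f'dataset/pair_{idx:04d}.png', np.hstack([frame, vis]))

    prev_gray = gray
    idx += 1

cap.release()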