./ 2022

Systems’ Discourse

Fractals, mycelia, and the non-verbal languages of systems

Details

Tools
  • Reaper
  • VCV Rack
  • Arduino
  • Processing
Team
Adél Szegedi

Creative Coder

Katiya Ma

Creative Technologist

Leonardo Mussatto

Sound Designer & Creative Technologist

./ introduction

Systems’ Discourse is an experimental installation that explores how pattern and communication structure complex systems - from fungal mycelia to human networks. Using real-time data, generative visuals, and modular synthesis, the piece makes visitors an integral element of the installation, highlighting how even seemingly unrelated systems influence each other once they share a space. Systems are rarely - if ever - isolated. Communication is a central element of all systems - natural, man-made, and digital alike - and is often non-verbal.

The project was developed as part of the Convergent Media module at the University of Westminster and served as a practical testbed for negotiating legibility, mapping strategies and live audio-visual pipeline design.

In the final piece, four systems intertwine and influence each other to create a contemplative environment:

  • a living mushroom, surrounded by sensors, sits on a plinth at the centre of the room

  • a slowly evolving generative pattern is projected on the walls

  • a rich, morphing soundscape fills the room

  • visitors, free to explore the space and sit on the benches lining the walls

./ development

The Living System

Sensor Exploration

Our aim was to integrate a “living system” - besides the visitors - into the installation, and to focus on how it would communicate with and be influenced by the other systems at play. While our early experiments involved plants - easily accessible on site - we soon moved to mushrooms, as they often take a principal role in coordinating the growth of forests thanks to their ability to communicate.

Reading meaningful data from living organisms, however, is rather complex. Plants and mushrooms do exchange vital information, but they do so through the release of chemicals and delicate changes in electrical charge. Consumer-grade sensors are usually unable to pick up such fine changes. We therefore opted to track environmental factors instead, as they can greatly influence the growth and state of the organism and are often the result of complex systems’ interplay.

We experimented with the sensors available in our university “emergent media” lab and chose the subset that best matched the conceptual and practical constraints of the project. The main candidates and decisions were:

  • Light Sensors (Grove TSL2561 / TMG39931)

    rejected: we intended to exhibit in a darkened room to preserve projector contrast, so light sensors had limited significance

  • Air quality (Grove Multichannel Gas Sensor)

    rejected: tests showed the selected mushroom samples produced negligible changes detectable by this sensor

  • Air temperature & humidity (DHT11)

    adopted: these factors would easily change over the span of the exhibition due to the presence of both visitors and projectors

  • Soil moisture probe (Grove Soil Moisture Sensor)

    adopted: this sensor provided meaningful and usable readings

  • Impedance Sensors

    rejected: impedance is influenced by the structure and integrity of biological tissues, revealing, for example, nutrient deficiency, pathogen infection, and temperature stress. Simply connecting the Arduino board to the organism, however, is not enough to access meaningful data: this approach requires proper impedance spectroscopy, performed with specialized equipment and analysis

  • Ultrasonic distance sensors

    adopted: to track visitors’ interaction with the organism and the installation, we opted for simple ultrasonic distance sensors, as they could be easily integrated into our pipeline and provided enough data to determine visitors’ distance from the mushroom.
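As an illustration of the kind of reading involved, a typical ultrasonic sensor reports an echo time that can be converted to distance and then to a normalized proximity value. This is a generic sketch: the exact sensor model, range, and function names here are illustrative, not the ones used in the installation.

```javascript
// Convert an ultrasonic echo time (microseconds) to distance in cm.
// Sound travels roughly 0.0343 cm/us; the pulse covers the distance twice.
function echoToDistanceCm(echoMicros) {
  return (echoMicros * 0.0343) / 2;
}

// Map distance to a normalized "proximity" value: 1 when a visitor is at
// the plinth, 0 at or beyond a chosen maximum range (illustrative constant).
function proximity(distanceCm, maxRangeCm = 200) {
  const clamped = Math.min(Math.max(distanceCm, 0), maxRangeCm);
  return 1 - clamped / maxRangeCm;
}
```

A proximity value like this is easy to feed into both the visual and the sound systems as a single control signal.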

Closeup of the mushroom and the microprocessor board
Closeup of the mushroom and the connected sensors

./ development

The Sound System

Patch Exploration

If the living system focused on picking up the often unnoticed effects systems have on each other when they share a space, the sonic system instead revolved around self-similar patterns, a characteristic of most living systems driven by communication. Therefore, besides being generative and fractal-like, this system had to be able to converse with the other systems.

We decided to experiment with a virtual modular synthesizer built around modules inspired by fractals, chaotic functions, and physical modelling. To produce organic modulation and timbral richness, the resulting patch has three principal voices.

To increase the sense of presence and immersion, we experimented with sound spatialization. The five independent voices are streamed to Reaper, where they are encoded into third-order Ambisonics and positioned on the virtual 3D stage using IEM StereoEncoders. Finally, they are merged and decoded to the custom six-speaker layout through the IEM AllRADecoder. By simulating the final result through binaural rendering - via the IEM BinauralDecoder - we were able to work on the soundscape before getting access to the location and the available equipment, as well as to record multiple versions for documentation.
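The idea behind Ambisonic encoding can be sketched with the classic first-order (B-format, traditional convention) gains; the installation used third-order encoding via the IEM plug-ins, which generalizes this to sixteen channels, so this is a simplified illustration rather than what the plug-ins compute internally.

```javascript
// First-order Ambisonic (B-format) encoding gains for a mono source at
// azimuth `az` and elevation `el`, both in radians.
function encodeFirstOrder(az, el) {
  return {
    W: Math.SQRT1_2,                // omnidirectional component (1/sqrt(2))
    X: Math.cos(az) * Math.cos(el), // front-back figure-of-eight
    Y: Math.sin(az) * Math.cos(el), // left-right figure-of-eight
    Z: Math.sin(el),                // up-down figure-of-eight
  };
}
```

Multiplying a voice's samples by these four gains "places" it in the sound field; a decoder such as the AllRADecoder then derives per-speaker signals for the actual layout.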

VCV Rack patch
IEM Plug-ins in Reaper

./ development

The visual system

Sketch Exploration

The aim was once again to create a generative system able to influence, react to, and share information with - i.e. communicate with - the other systems at play in the installation. Once we decided to use the mushroom as the centrepiece, the visual exploration took a clear direction: environment-sensitive particle systems echoing hyphal growth and ecological responsiveness.

We decided to work with p5.js, the JavaScript flavour of Processing, due to its simplicity and the large amount of reference material available thanks to OpenProcessing. The resulting sketch implements a particle field advected by 3D Perlin noise, seeded from a source at the base of the canvas. Each frame, a new point is spawned and every point is translated by a direction vector computed from the noise value, leaving behind its trail. Particles therefore sprout from a single source and trace graceful, branching paths following noise gradients, analogous to mycelial branching and network formation, where local conditions encourage proliferation. Occasional larger dots - affected by sensor inputs - punctuate the field and trigger the sound system, simulating the exchange of information in response to external factors. The piece thus slowly but visibly shifts as people interact with the installation space and micro-environmental conditions change.
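The core spawn-and-advect step can be sketched as framework-free JavaScript. The noise function below is a deterministic stand-in for p5.js's `noise()`, and the step size and noise scale are illustrative constants, not the sketch's actual values.

```javascript
// Deterministic stand-in for Perlin noise, returning a value in [0, 1).
// The real sketch uses p5.js noise(); any smooth field works for the idea.
function noise2D(x, y) {
  const s = Math.sin(x * 12.9898 + y * 78.233) * 43758.5453;
  return s - Math.floor(s);
}

// Advance each particle along a direction derived from the noise field,
// so paths branch and meander like hyphal growth.
function step(particles, stepSize = 2, scale = 0.01) {
  for (const p of particles) {
    const angle = noise2D(p.x * scale, p.y * scale) * Math.PI * 2;
    p.x += Math.cos(angle) * stepSize;
    p.y += Math.sin(angle) * stepSize;
    p.trail.push({ x: p.x, y: p.y }); // the trail is what gets drawn
  }
}

// Spawn a new particle at the shared source, once per frame as in the sketch.
function spawn(particles, sourceX, sourceY) {
  particles.push({ x: sourceX, y: sourceY, trail: [] });
}
```

Calling `spawn` then `step` each frame, and rendering every trail, reproduces the sprouting, branching motion described above.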

Since we wanted visitors to step into the installation and become part of it, we decided to project the visuals all around the room. Enveloping them with the visual element proved an effective way to close the gap between the installation and its visitors. As we aimed to accommodate both playful and reflective interaction with the exhibition space, we decided to place the plinth holding the mushroom at the centre of the room and line the walls with simple benches. This gave us the opportunity to place projectors in a low, visible position - on the benches - deliberately revealing the technology supporting the visual system and creating yet another opportunity for visitors to affect the installation and for the multiple systems to intertwine. Similarly, we decided to expose sensors and wiring, and to project the virtual modular synthesizer, the Processing sketch, and the serial logs on the plinth to give visitors a peek into the systems’ inner workings and underlying complexity.

Mycelia-like pattern generated by Processing
Detail of the projected pattern

./ development

Pipeline

  1. Sensors → Arduino Uno → Serial over USB

    the Arduino board sent newline-terminated ASCII strings containing numeric values tagged with identifier characters

  2. Processing → loopMIDI → VCV Rack

    Processing received sensor data, applied custom range mappings, and then streamed information as MIDI

  3. VCV Rack → Reaper → Speakers

    VCV audio channels were routed into Reaper via ReaRoute for ambisonic encoding and panning. Reaper then decoded the ambisonic soundscape to the custom speaker array and output sound through a Focusrite Scarlett 18i20 (1st gen)

  4. Processing Sketch → OBS → MadMapper

    Since we composed the Processing sketch using p5.js, we lost the ability to directly stream the canvas as a texture to MadMapper through Spout. In turn, we gained the ability to easily access the sketch from another device on the network, allowing us to offload projection mapping to a second device by using an OBS browser source and the obs-spout2 plugin to stream the texture to MadMapper

  5. Processing Editor / VCV Rack → OBS → MadMapper

    To map the synth, the Processing editor, and the console onto the plinth in real time, we used OBS to grab the windows and share them over NDI - via obs-ndi, now DistroAV - to MadMapper on the second device
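The bridge logic of steps 1 and 2 can be sketched in plain JavaScript. The tag characters, sensor ranges, and function names below are illustrative stand-ins, not the exact protocol we used.

```javascript
// Parse a newline-terminated serial line such as "T23.5" into a tag and a
// numeric value. Tag characters (e.g. "T" temperature, "H" humidity,
// "M" soil moisture, "D" distance) are hypothetical examples.
function parseSerialLine(line) {
  const tag = line[0];
  const value = parseFloat(line.slice(1));
  return Number.isNaN(value) ? null : { tag, value };
}

// Map a sensor reading from its expected range onto the 0-127 MIDI CC range,
// clamping out-of-range readings, before sending it on to VCV Rack.
function toMidiCC(value, inMin, inMax) {
  const t = (value - inMin) / (inMax - inMin);
  const clamped = Math.min(Math.max(t, 0), 1);
  return Math.round(clamped * 127);
}
```

Each parsed tag can then be bound to its own CC number, so every sensor drives a distinct modulation input in the patch.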

Image Gallery

./ outcome

Feedback

During assessment week, the installation was viewed by roughly 30 visitors (peers and professors), and by a smaller number (around 20) in the subsequent days. Visitors engaged with the installation first by tentatively walking around the space and approaching the mushroom, then - once they discovered their actions could affect the installation - by exploring the mapping through repeated approaches and retreats, and finally by inviting companions to take other positions around the plinth. Others preferred to simply spend some time in the room, sitting on the benches lining the walls, suggesting the piece supported both solitary reflection and social interpretation.