Lumicles is a platform for visual and sonic experimentation that runs in the web browser. Through the synthesis of advanced rendering algorithms, spatial audio and virtual-reality web APIs, the project offers deep immersion in living mathematical environments.

The following sections detail the fundamental pillars that make up this experience: from the power of GPU processing to the philosophy of generative art behind the project.

Generative Art and Emergent Phenomena

The essence of Lumicles is its generative nature: through pre-established algorithms (fields), it makes aesthetic decisions that escape the direct control of the author.

This process gives rise to emergent phenomena: situations where simple low-level mathematical rules, such as attraction and repulsion, interact with each other to create organised high-level complexity.
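The attraction/repulsion interplay described above can be sketched as a minimal pairwise force rule. This is an illustrative CPU-side sketch only (the actual simulation runs on the GPU, and `repelRadius`, `attractRadius` and `strength` are hypothetical parameters, not values from Lumicles):

```typescript
// Minimal sketch of an attraction/repulsion rule between two particles.
// All parameter values are illustrative.
type Vec2 = { x: number; y: number };

function pairForce(
  p: Vec2,
  q: Vec2,
  repelRadius = 1,
  attractRadius = 5,
  strength = 0.1
): Vec2 {
  const dx = q.x - p.x;
  const dy = q.y - p.y;
  const d = Math.hypot(dx, dy);
  // No interaction at zero distance or beyond the outer radius.
  if (d === 0 || d > attractRadius) return { x: 0, y: 0 };
  // Repel when too close, attract when within the outer radius.
  const sign = d < repelRadius ? -1 : 1;
  const f = (sign * strength) / d;
  return { x: dx * f, y: dy * f };
}
```

Summing such forces over many neighbours, per particle per frame, is what lets organised structures emerge from purely local rules.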

The use of algorithms such as Perlin noise makes it possible to synthesise pseudo-random values, ensuring that the chaos of the particles always maintains an organic, fluid and fascinating appearance, and guaranteeing that each session is a unique and unrepeatable piece. The user can also force the simulation into an ordered state, from which chaotic processes are sometimes triggered.
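The idea behind this family of noise functions can be sketched in one dimension with value noise, a simplified cousin of Perlin noise that shares its key property: smooth interpolation between pseudo-random samples. The hash function here is an arbitrary illustrative choice, not the one Lumicles uses:

```typescript
// Pseudo-random value in [0, 1) derived from an integer lattice point.
// The constants are an arbitrary, commonly seen illustrative choice.
function hash(n: number): number {
  const s = Math.sin(n * 127.1) * 43758.5453;
  return s - Math.floor(s);
}

// Perlin's quintic fade curve: 6t^5 - 15t^4 + 10t^3.
// Zero first and second derivatives at t = 0 and t = 1 keep the result smooth.
function fade(t: number): number {
  return t * t * t * (t * (t * 6 - 15) + 10);
}

// 1D value noise: smoothly interpolate between the pseudo-random
// values at the two nearest integer lattice points.
function valueNoise1D(x: number): number {
  const i = Math.floor(x);
  const f = x - i;
  return hash(i) + (hash(i + 1) - hash(i)) * fade(f);
}
```

Because nearby inputs share lattice points and the fade curve is smooth, the output varies continuously: this is what keeps the motion organic rather than jittery.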

Generative Art

GPU Simulation and Rendering

To manage up to 4 million particles in real time, the software transfers the entire processing load to the GPU via WebGL.

The movement of each point of light is calculated in GLSL shaders, sustaining a frame rate of up to 60 frames per second. The rendering pipeline includes a three-stage process designed specifically to enhance the digital aesthetic.

First the scene is rendered at 32 bits per channel; a Gaussian blur is then applied; and finally localised luminance corrections fix underexposed and overexposed areas.
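The Gaussian blur stage depends on a normalised kernel. A sketch of computing the weights for a separable 1D pass follows; the radius and sigma are illustrative, not the values Lumicles uses:

```typescript
// Build normalised 1D Gaussian weights for a separable blur pass.
// A separable blur runs this kernel once horizontally and once vertically,
// which is much cheaper than a full 2D convolution.
function gaussianKernel(radius: number, sigma: number): number[] {
  const weights: number[] = [];
  let sum = 0;
  for (let i = -radius; i <= radius; i++) {
    const w = Math.exp(-(i * i) / (2 * sigma * sigma));
    weights.push(w);
    sum += w;
  }
  // Normalise so the blur does not change overall brightness.
  return weights.map((w) => w / sum);
}
```

In a shader these weights would typically be baked into uniforms and applied per-texel along one axis per pass.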

GPU Simulation

Virtual Reality Experience (WebXR)

The project uses the WebXR standard, which allows Lumicles to be accessible from any Virtual Reality headset connected to the PC, eliminating the barriers of native software installation.

The VR experience allows the user, via XR controllers, to navigate inside the simulations, change the colour palette and modify the simulation parameters through a VR menu. This menu replicates the functionality of the web UI, using a canvas element that is then converted into a GPU texture.

A special pipeline had to be implemented to render the stereo view in VR, since the Three.js library does not currently support applying post-processing filters in VR mode.

VR Experience

Sound System and 3D Audio

Sound in Lumicles is not a background track but a stereo composition that adjusts dynamically. The main pieces are MP3 sound files that can be triggered by an event or played in a loop.

Each sound element can be positioned in 3D space by applying spatialisation filters provided by the Web Audio API (PannerNode). These filters simulate the directionality, distance and relative position of the sound with respect to the listener.
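One part of that spatialisation, distance attenuation, can be shown directly: the Web Audio API's default "inverse" distance model computes gain as `refDistance / (refDistance + rolloffFactor * (max(distance, refDistance) - refDistance))`. A small sketch of that formula (the browser's PannerNode applies it internally; this stand-alone function is only for illustration):

```typescript
// Gain attenuation of the Web Audio "inverse" distance model,
// as used by PannerNode. refDistance and rolloffFactor both default to 1.
function inverseDistanceGain(
  distance: number,
  refDistance = 1,
  rolloffFactor = 1
): number {
  // Inside refDistance the gain is clamped to 1 (no attenuation).
  const d = Math.max(distance, refDistance);
  return refDistance / (refDistance + rolloffFactor * (d - refDistance));
}
```

So a source two units away plays at half gain, three units away at a third, and so on, which is what gives each sound a believable sense of distance.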

Finally, an orchestrator coordinates the volumes, playback rhythms and detuning of each track, generating a specific sound scene for each movement dynamic. The system also takes into account the user's position in the virtual space and the state of the particle simulation.
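One orchestration step of this kind can be sketched as a crossfade between two loops driven by a normalised parameter from the simulation. The "chaos" parameter and the track names are hypothetical, used only to illustrate the technique:

```typescript
// Sketch of an orchestrator step: crossfade two loops by an equal-power
// curve driven by a normalised "chaos" level (hypothetical parameter).
function crossfadeGains(chaos: number): { calm: number; chaotic: number } {
  const t = Math.min(1, Math.max(0, chaos)); // clamp to [0, 1]
  // Equal-power crossfade: gains trace a quarter circle, so the sum of
  // squared gains is constant and perceived loudness stays roughly even.
  return {
    calm: Math.cos((t * Math.PI) / 2),
    chaotic: Math.sin((t * Math.PI) / 2),
  };
}
```

The resulting gains would be fed to each track's GainNode every frame, letting the mix track the state of the simulation continuously rather than switching tracks abruptly.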

Sound System