The particles are subjected to force fields whose parameters can be modified in real time during the simulation. These fields are described in terms of mathematical and logical functions in a GLSL program. Various types of noise functions are used to introduce randomness and disorder.
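As a rough illustration of this idea, here is a minimal CPU sketch in TypeScript of a noise-driven force field with runtime-tunable parameters. The actual project evaluates fields like this per particle in GLSL; the hash function, field formula, and parameter names below are illustrative assumptions, not the project's code.

```typescript
type Vec3 = { x: number; y: number; z: number };

// Cheap deterministic hash -> pseudo-random value in [0, 1),
// in the spirit of the classic GLSL sin-based hash.
function hash3(x: number, y: number, z: number): number {
  const s = Math.sin(x * 127.1 + y * 311.7 + z * 74.7) * 43758.5453;
  return s - Math.floor(s);
}

// Field parameters that could be adjusted live during the simulation.
interface FieldParams {
  frequency: number; // spatial scale of the noise
  amplitude: number; // force strength
}

// Sample a pseudo-random force vector at a position in space.
function forceAt(p: Vec3, params: FieldParams): Vec3 {
  const f = params.frequency;
  return {
    x: (hash3(p.x * f, p.y * f, p.z * f) - 0.5) * params.amplitude,
    y: (hash3(p.y * f, p.z * f, p.x * f) - 0.5) * params.amplitude,
    z: (hash3(p.z * f, p.x * f, p.y * f) - 0.5) * params.amplitude,
  };
}
```

Because the field is a pure function of position and parameters, the same scene can be replayed deterministically, while changing `frequency` or `amplitude` reshapes the motion instantly.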
How it works
It uses the information stored in texture maps (in GPU memory) to represent the XYZ position of each particle, updating it with a pixel shader. This allows it to simulate up to 4 million particles in real time, very efficiently.
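The texture-based update can be sketched on the CPU as a ping-pong scheme: each texel stores one particle's position, and every frame the shader reads the previous texture and writes the next one. This is a minimal sketch under that assumption; the texture size, the swap mechanics, and the constant drift standing in for the GLSL force field are illustrative.

```typescript
const SIZE = 4; // a SIZE x SIZE texture holds SIZE*SIZE particles
// (a 2048 x 2048 texture would hold ~4 million, as in the project)

// One float per channel; x, y, z packed consecutively per texel.
let read = new Float32Array(SIZE * SIZE * 3);
let write = new Float32Array(SIZE * SIZE * 3);

// One simulation step: advance every particle. Here a constant drift
// stands in for evaluating the force field per particle.
function step(dt: number, drift: [number, number, number]): void {
  for (let i = 0; i < SIZE * SIZE; i++) {
    write[i * 3 + 0] = read[i * 3 + 0] + drift[0] * dt;
    write[i * 3 + 1] = read[i * 3 + 1] + drift[1] * dt;
    write[i * 3 + 2] = read[i * 3 + 2] + drift[2] * dt;
  }
  // Swap buffers: this frame's output is next frame's input,
  // exactly like ping-ponging two render targets on the GPU.
  [read, write] = [write, read];
}
```

On the GPU the same loop body runs as a fragment shader over the whole texture at once, which is what makes millions of particles cheap.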
The user can fly freely inside the cloud using different types of cameras. The simulation can be frozen at any time, and ultra-high-resolution screenshots can be downloaded with 16 bits of precision per RGB channel.
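Exporting 16 bits per channel means quantizing the renderer's floating-point color values to the full 16-bit integer range rather than the usual 8 bits. A small sketch of that step (the project's actual export path is not documented here; the file-writing part is omitted):

```typescript
// Quantize a floating-point color channel in [0, 1] to a 16-bit value,
// giving 65536 levels per channel instead of the usual 256.
function toUint16Channel(v: number): number {
  const c = Math.min(1, Math.max(0, v)); // clamp to the displayable range
  return Math.round(c * 65535);
}
```

The extra precision matters for screenshots of dim, dense particle regions, where 8-bit output would show visible banding.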
Emergence & Randomness
The final result exposes the concept of emergence, where a set of simple low-level rules creates high-level complexity. In this case, the motion of each particle is controlled by simple programs; nevertheless, as the simulation develops, patterns and complex structures emerge.
The outcome of the simulation depends on the initial conditions, on how the user modifies the parameters over time, and on other random factors, leading to a virtually infinite number of scenes.
The audio subsystem
The final soundtrack is generated in real time by combining multiple predefined MP3 audio files using the Web Audio API (through the Howler.js library). Some tracks are triggered on specific events; others play continuously.
Many types of audio filters are applied to the sound tracks, based on the user's position and orientation in 3D space and on the parameters of the simulation. Additionally, a spatial audio filter is applied to some tracks to simulate sound sources located in the virtual space.
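One way such position-dependent filtering can work is to derive each track's gain and low-pass cutoff from the distance between the listener and a virtual sound source. The sketch below assumes a simple inverse-distance falloff and a distance-driven cutoff; the project's exact mapping is not documented, so the formulas and constants are illustrative.

```typescript
interface SpatialParams {
  gain: number;     // 0..1 volume applied to the track
  cutoffHz: number; // low-pass cutoff: distant sources sound muffled
}

// Derive per-track filter parameters from the listener's position
// relative to a sound source placed in the virtual space.
function spatialParams(
  listener: [number, number, number],
  source: [number, number, number]
): SpatialParams {
  const dx = source[0] - listener[0];
  const dy = source[1] - listener[1];
  const dz = source[2] - listener[2];
  const dist = Math.sqrt(dx * dx + dy * dy + dz * dz);
  const gain = 1 / (1 + dist);         // inverse-distance attenuation
  const cutoffHz = 200 + 20000 * gain; // nearer sources sound brighter
  return { gain, cutoffHz };
}
```

These values would then feed a Web Audio `GainNode` and `BiquadFilterNode` per track, updated every frame as the camera moves.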
The sound experience is highly dependent on the user's path through the 3D cloud of particles.
Virtual Reality (WebXR)
The software can also be experienced in Virtual Reality through the WebXR API. Using an Oculus Rift headset, the user can dive into the particle clouds and control the 3D simulation in real time with an Xbox gamepad.
The experience has a beginner's mode where the user can shuffle through hundreds of presets that create unique scenes, just by pressing a button on the controller. The colors of the particles are set from a list of 100 predefined gradients that can be controlled separately.
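A preset shuffle of this kind can be sketched as a function that, on each button press, picks random simulation parameters and one gradient from the predefined list. The preset fields and gradient values below are illustrative stand-ins, not the project's actual data.

```typescript
interface Preset {
  frequency: number;          // hypothetical force-field parameter
  amplitude: number;          // hypothetical force-field parameter
  gradient: [string, string]; // start/end colors of the particle ramp
}

// Illustrative stand-in for the project's list of 100 gradients.
const gradients: [string, string][] = [
  ["#000033", "#66ccff"],
  ["#330000", "#ffcc66"],
  ["#003300", "#ccff66"],
];

// Called on each controller button press. Takes an injectable random
// source so a scene can be reproduced from a seed if desired.
function randomPreset(rand: () => number = Math.random): Preset {
  return {
    frequency: 0.1 + rand() * 2.0,
    amplitude: rand() * 5.0,
    gradient: gradients[Math.floor(rand() * gradients.length)],
  };
}
```

Keeping the gradient choice separate from the physics parameters is what lets colors be controlled independently of the scene's motion.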
In collaboration with the artist Esteban Gonzalez, who designed and composed the audios for the different simulation scenes, we presented Lumicles VR at the 2018 National Art & Technology Contest organized by the National Arts Fund of Argentina.
The project was selected along with 25 other projects and was exhibited at the Kirchner Cultural Center (CCK) in Buenos Aires from September to November 2019. Some images of the installation are shown in the following gallery.
Trailer for Art & Technology 2018
Trailer for Lumen Prize 2019
More videos can be found on this YouTube channel.
Federico Marino is a software engineer and visual artist based in Buenos Aires. He works as an independent software developer and also teaches Computer Graphics at the Faculty of Engineering of the University of Buenos Aires.
He specializes in the development of 3D applications and is interested in exploring new ways to combine art and technology. On his blog, www.fedemarino.com.ar, many of his projects explore the use of software to create generative systems, interactive 3D experiences, and more.