I’m against dashboards. So I made DATASONICA.
A new way to experience climate data, not through graphs or dashboards, but through listening.
On November 15th, I premiered DATASONICA, an audiovisual installation that turns climate data into sound and moving particles.
The piece was shown at UNIT London (some photos of the event here) during Narrating Planetary Crisis: Art, Data, and the Language of Sustainability, part of the London Art and Climate Week, and it was commissioned by Future Botanic.
This article is the story of how I built it — and why I believe works like this are a more meaningful way to experience data than yet another dashboard (RANT!).
Why I Don’t Believe in Dashboards
If you follow my work, you know that I don’t like dashboards. Just to be clear, I made a lot of dashboards in my past work life. Until I didn’t like my job anymore.
But why don’t I believe in dashboards? Because they reduce the experience into boxes, colours, and KPIs that tell you what is happening, but not how relevant it is for YOU. Dashboards make data passive. Out of context. Boring.
Some dashboards (also) fail because they remove the human dimension of data, and that’s precisely what DATASONICA tries to bring back.
👉 Watch DATASONICA on YouTube.
The Starting Point: A Building Full of Sensors
The data came from Building 59 at the Lawrence Berkeley National Laboratory in Berkeley, California. I had been tinkering with the idea of making a building speak while working on Everything Not Saved Will Be Lost, where I used the disappearance of the sky as a metric to measure the changes in our skyline due to gentrification (more buildings → less visible sky → less light). In that project, I also wanted to find a way to save buildings from being demolished.
When I was in Bangkok this summer, I visited the beautiful Bangkok Kunsthalle gallery. Before it became an art gallery, the site was the fire-damaged, abandoned Thai Wattana Panich Printing House, a massive brutalist complex in Bangkok’s Chinatown.
While wandering around, I noticed a gap between the two blocks of the building where you could see the sky. During storms, drops of rain fell through. To me, those drops were words the building was saying. This was a pivotal moment.
Fast forward to London. Apparently, by demonstrating an emotional connection between a building and the community, you can campaign against its demolition. So why not listen to a building’s micro-movements: its vibrations, its interactions with the surroundings, the people (and animals) that lived in it, and turn each of these data points into an instrument?
Step 1 — Understanding the Data by Listening, Not Plotting
The data for DATASONICA came in CSV files. Every few minutes, the building logged:
Air temperature
Relative humidity
Solar radiation
Dew point
And much more! Wi-Fi strength, CO₂ levels, noise levels, and so on. I had all I needed in a spreadsheet: 96 rows for a full day (one reading every 15 minutes) and four metrics to visualise.
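In Python, that first pass over the spreadsheet looks roughly like this. The column names and sample values below are invented for illustration; they are not the actual Building 59 export.

```python
import csv
import io

# Hypothetical sample of the building's log. The real file, its column
# names, and its units are assumptions, not the actual export.
SAMPLE = """timestamp,air_temp_c,rel_humidity_pct,solar_rad_wm2,dew_point_c
2025-06-01T00:00,18.2,61.0,0.0,10.5
2025-06-01T00:15,18.1,61.4,0.0,10.6
2025-06-01T00:30,18.0,61.9,0.0,10.6
"""

def load_metrics(text, fields):
    """Parse the CSV and return one list of floats per metric."""
    rows = list(csv.DictReader(io.StringIO(text)))
    return {f: [float(r[f]) for r in rows] for f in fields}

def normalize(values):
    """Scale a series to 0..1 so every metric shares one range."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero on a flat series
    return [(v - lo) / span for v in values]

metrics = load_metrics(SAMPLE, ["air_temp_c", "rel_humidity_pct"])
print(normalize(metrics["air_temp_c"]))  # temperature falls overnight: 1.0 → 0.0
```

Once every series lives on the same 0–1 range, each metric can drive its own instrument regardless of whether the raw numbers were degrees, percentages, or watts per square metre.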
Datasets always look boring before you listen to them. So, for DATASONICA, instead of graphing the values, I started by thinking about how each variable would behave if the building were a sentient body:
Temperature represents the awakening, when the building becomes conscious → it became the lungs of the structure;
Humidity represents the presence and physicality of the building → it became the muscles of the structure;
Dew point represents the point of equilibrium between states → it became the pores of the building;
Solar radiation means the building is finally alive, triggered by the energy it absorbs → it became the pulse of the building, the late crescendo.
I don’t know what to call this, but I approached the dataset as if I were having a conversation with the building, keeping that building in Bangkok in my mind the whole time.
Step 2 — Data sonification and sculpting the sound
I’m not a musician, so apologies if this sounds odd. I converted the data into MIDI sequences. In the final edit, with all the FX, it sounds like this:
Each dataset became its own track, playing a different instrument: a church organ, a piano, a violin, a bell. It sounded choppy at first, so I added a constant drone synth base to glue everything together. Long reverb to connect notes. Slow panning to create spatial motion. I usually don’t dress the sounds, but here I made an exception.
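I won’t claim this is the exact mapping used in the piece, but the data-to-MIDI step can be sketched in a few lines: quantize each normalized reading onto a musical scale and emit one note per row. The pentatonic scale, root note, and range below are my assumptions, chosen because adjacent values then never clash harmonically.

```python
# Semitone offsets of a major pentatonic scale within one octave.
# Scale choice, root, and octave span are illustrative assumptions.
PENTATONIC = [0, 2, 4, 7, 9]

def value_to_midi(x, root=48, octaves=2):
    """Map x in [0, 1] to a MIDI note number on a pentatonic scale."""
    steps = len(PENTATONIC) * octaves
    i = min(int(x * steps), steps - 1)           # quantize to a scale step
    octave, degree = divmod(i, len(PENTATONIC))
    return root + 12 * octave + PENTATONIC[degree]

# One track per metric, e.g. humidity driving the "muscles" instrument:
humidity = [0.0, 0.25, 0.5, 0.75, 1.0]
print([value_to_midi(v) for v in humidity])  # [48, 52, 60, 64, 69]
```

Feeding each note list to a separate MIDI track (one per instrument) reproduces the one-track-per-dataset structure described above.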
Step 3 — Building the Visual System
The visuals were created in TouchDesigner. I built a set of audio-reactive patches that responded to the sound, colour-coded the data streams (Cyan → air/breath; Orange → sunlight/energy; etc), and let the sound trigger a particle system mapped onto a raster image of visualised acoustic data. In DATASONICA the visuals are not an explanation. They’re an extension of the sound.
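The TouchDesigner patches themselves don’t translate into a snippet, but the colour-coding logic boils down to a blend between two stream colours driven by the data. The RGB endpoints below are illustrative, not the exact palette of the piece.

```python
# Two of the data-stream colours mentioned above; the exact RGB values
# are my assumptions for illustration.
CYAN = (0, 255, 255)     # air / breath
ORANGE = (255, 140, 0)   # sunlight / energy

def stream_colour(energy):
    """Blend cyan → orange as normalized solar energy rises (0..1)."""
    e = max(0.0, min(1.0, energy))  # clamp out-of-range readings
    return tuple(round(c + (o - c) * e) for c, o in zip(CYAN, ORANGE))

print(stream_colour(0.0))  # (0, 255, 255): pure cyan at night
print(stream_colour(1.0))  # (255, 140, 0): full orange at solar peak
```

In the installation the same idea runs per frame, with the blended colour tinting the particles that the audio triggers.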
You might not know this, but there’s an Easter Egg in all my works. In DATASONICA, too. If you spot it, you get a free copy of my upcoming book on Data Absence!
Step 4 — Seeing the Final Work in the Room
The final work for the event was a rendered video, but the system I built is meant to be played live. At UNIT, the installation finally came alive, on a big screen. It felt nice.
Not all data needs to be interpreted. Sometimes it can be just experienced. And I was glad people didn’t ask “what does this chart mean?” for once (yay!). They listened, which was the whole point.
DATASONICA is my alternative to dashboards: a sensory system where data informs perception rather than a KPI.

What’s Next: Live Data
This first version uses a single day of archived data.
The next phase will connect DATASONICA to a live environmental feed, turning it into a continuously updated, real-time sonification engine (or sound instrument). Ideally, a plug-and-play framework that works for different buildings.
I’m looking for a space/office/building/gallery/warehouse/garage (even abandoned or empty) to test more sensors and create an immersive experience. Are you moving out and want to create a memory of your old place? DM me!
Workshops & Collaboration
I’m opening DATASONICA to:
institutions
studios
universities
galleries
musicians
…interested in workshops on data sonification, environmental and climate datasets, and audio-reactive systems.
If you want to explore another way of working with data, beyond dashboards and infographics, this project is a good place to start.
👉 Get in touch if you want to collaborate or bring a workshop to your team → hello@tizianaalocci.com
BUT
If you still want to make a traditional dashboard, or a report full of bar charts, just get in touch with my studio NECESSITY.INK via email: hello@necessity.ink
Happy holidays, this newsletter will resume in Jan! :)
Till the next one!
Tiziana (Tiz)