My journey started with the first funk song ever recorded. On a hunch, I imported “Cold Sweat” into Magic Music Visuals, and the generated patterns resembled quilts from Gee’s Bend, Alabama, and Kuba textiles from Central Africa. This was when I realized that cultural production could be algorithmic… a unique kind of maker grammar.
Magic Music Visuals is music visualization software that I’ve used in the past to VJ live outdoor events. The sound-generated images were mapped and projected onto walls and buildings in Taos, New Mexico, and Boston, Massachusetts. DJs provided the music and I sat nearby with my laptop (running Magic).
This week, I was challenged to think of ways to develop projects using 2020 U.S. Census data (see above) on self-response rates in Massachusetts. I recalled a computer science class I taught where students who majored in music composed songs based on data; visual arts students translated 2D bar charts into 3D models using Tinkercad and 3D printed them.
I discovered TwoTone, a web-based app that converts data into sounds. TwoTone allows users without any music or technical background to create sounds out of their datasets to better understand them. For converting numerical values to audio, TwoTone uses a Musical Scale, whereby higher values correspond to higher pitches. On the far left is the “Data Source” or variable and the corresponding instrument is on the right. Variables include:
- DRRINT = Daily Self-Response Rate — Internet (Piano)
- DRRALL = Daily Self-Response Rate — Overall (Double Bass)
- CRRINT = Cumulative Self-Response Rate — Internet (Church Organ)
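To make the mapping concrete, here is a minimal sketch (my own illustration, not TwoTone’s actual code) of the core idea: normalize each data value, then quantize it to a step on a musical scale so that higher values produce higher pitches.

```python
# Illustrative sketch of pitch mapping: quantize numeric values
# onto a C major scale so higher values sound as higher notes.
SCALE = [0, 2, 4, 5, 7, 9, 11]  # C major degrees as semitone offsets
BASE_NOTE = 60                   # MIDI note 60 = middle C

def value_to_midi(value, lo, hi, octaves=2):
    """Map a value in [lo, hi] to a MIDI note on the scale."""
    steps = len(SCALE) * octaves
    t = (value - lo) / (hi - lo) if hi != lo else 0.0  # normalize to [0, 1]
    idx = min(int(t * steps), steps - 1)               # pick a scale step
    octave, degree = divmod(idx, len(SCALE))
    return BASE_NOTE + 12 * octave + SCALE[degree]

# Example: daily self-response rates (percent) become rising pitches
rates = [10.5, 25.0, 40.2, 55.8]
notes = [value_to_midi(r, 0.0, 60.0) for r in rates]
```

Each variable (DRRINT, DRRALL, CRRINT) would get its own track of notes like this, voiced by a different instrument.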
People can become data DJs by layering and mixing tracks (variables). They can add their voices by narrating between or during tracks and turn the result into a story. I immediately thought of a data-driven hip-hop cypher with a DJ and rapper providing the words and music, then using the mix-down (MP3 file) to create a three-dimensional representation of the production.
To create the 3D model, I imported the mix-down (MP3 file) into Magic and created a waveform effect. I applied a kaleidoscope effect to create a circular design. Then, I inverted the waveform and cleaned it up using Inkscape. I imported the SVG file into Tinkercad and created the model.
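The underlying idea of that pipeline — turning audio amplitude into a vector outline that Tinkercad can extrude — can be sketched in code. This is my own stand-in for the Magic/Inkscape steps, using synthetic samples and only Python’s standard library:

```python
# Sketch: render audio samples as an SVG polyline suitable for
# importing into a vector editor or a 3D tool like Tinkercad.
import math

def waveform_svg(samples, width=800, height=200):
    """Render samples (floats in [-1, 1]) as an SVG waveform."""
    mid = height / 2
    step = width / max(len(samples) - 1, 1)
    points = " ".join(
        f"{i * step:.1f},{mid - s * mid:.1f}" for i, s in enumerate(samples)
    )
    return (
        f'<svg xmlns="http://www.w3.org/2000/svg" '
        f'width="{width}" height="{height}">'
        f'<polyline points="{points}" fill="none" stroke="black"/>'
        f"</svg>"
    )

# Synthetic stand-in for a mix-down: a decaying sine wave
samples = [math.sin(i / 8) * math.exp(-i / 200) for i in range(400)]
svg = waveform_svg(samples)
# To inspect or edit: open("waveform.svg", "w").write(svg)
```

A real mix-down would first be decoded to raw samples (e.g., with an audio library), but the outline-writing step is the same.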
A next step might be to 3D print this model or go back into Magic and create a fully realized (complex) visualization. I’m thinking about what this might look like as an interactive object. For now, I will create how-to guides and link these stages to learning strands such as ISTE’s Computational Thinker standard.
In 2017, I learned that marine scientists in Little Cayman were inserting sensors into invasive lionfish to track their movements. I suggested taking the lionfish data and doing something similar to what I did with the Census data, so that Cayman students could see what was happening as the fish moved farther and farther out into the ocean. Unfortunately, we never got around to trying it, but perhaps we will in the near future.