Iron Viz: the Power of Iteration

For those who haven’t heard of it, Iron Viz is an annual data viz competition run by Tableau. It’s basically Iron Chef, but for data viz. Highly entertaining, and an amazing competition all around.

I’ve often thought of entering, but never followed through. This time, however, it was too much to pass up. Not only was the theme “music,” which is a true passion of mine, but I had just written up a music viz project on the very day the competition theme was announced! Kismet? Perhaps, but either way, I felt that I had to enter.

Now, I was pretty happy with what I came up with for part 1, but knew I could do better. Iron Viz entries are consistently awesome, and I didn’t want to look like I wasn’t taking it seriously.

“Ten edits”
Years ago, my dad (who’s a professor of industrial design) told me his theory of “ten edits.” He told me that if you really want to make something great, you should go through at least ten iterations. The first few edits are generally preliminary, just getting your ideas out there and having fun. When you get to edit 7 or 8, you’ll often think you’ve got something pretty hot, but don’t stop yet! If you go through the full ten iterations, you’ll come to something even better.

I’ve often thought about this, but this was the first time I explicitly did it all the way. I’ve always been a big believer in the power of iteration, but this was a great reminder of exactly how powerful it can be.

#Edit.1
This is already documented pretty thoroughly here (along with more background information on the data). The quick version is that the x-axis is time (from 15 to 25 seconds), the y-axis is frequency, and the color goes from quiet (blue) to loud (red):

[Image: Arrington de Dionyso – Mani Malaikat]
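
For anyone who wants to roll their own version of this, here’s a minimal sketch of how spectral data like this can be extracted in R. It uses the tuneR and seewave packages; the file name and parameters below are rough placeholders, not my exact script.

```r
# A minimal sketch of pulling spectral data out of an audio file in R.
# The file name and parameters are assumptions, not the original script.
library(tuneR)
library(seewave)

# Read just the 15-25 second window shown in the viz
wav <- readWave("mani_malaikat.wav", from = 15, to = 25, units = "seconds")

# Short-time spectrogram: plot = FALSE returns the data instead of drawing it
spec <- spectro(wav, f = wav@samp.rate, wl = 1024, plot = FALSE)

# Reshape into a long table (time, frequency, loudness), ready for Tableau
df <- expand.grid(freq_khz = spec$freq, time_s = spec$time + 15)
df$loudness_db <- as.vector(spec$amp)
head(df)
```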

#Edit.2
My next idea was to adapt a technique I often use to visualize time series:

[Image: heatmap with bars]

This is a great technique, but I won’t go into details here, as I don’t claim to have invented it, and it’s not what I want to talk about today (maybe in a future post…). I love this style of visualization because it lets you visualize many different aspects of the data at the same time, giving a powerful overall sense of it.

I decided to apply this concept to my music viz:

[Image: edit 2]

Working from left to right, top to bottom… Top left is our meta-data. Top right is loudness over time, with time as the x-axis and loudness shown both via the y-axis and color. Bottom left is loudness by frequency, with frequency on the y-axis and loudness shown both via the x-axis and color. Bottom right is the same as discussed above, in #edit.1.
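
If you want to play with that top-right panel outside Tableau, a rough ggplot2 sketch of the idea (loudness over time, encoded as both bar height and color) might look like this, assuming the long-format table from the sketch above:

```r
# A rough ggplot2 sketch of the "heatmap with bars" panel: loudness over time,
# encoded twice, as bar height and as color. Assumes the long-format df from
# the earlier sketch.
library(ggplot2)

loud_by_time <- aggregate(loudness_db ~ time_s, data = df, FUN = mean)

ggplot(loud_by_time, aes(x = time_s, y = loudness_db, fill = loudness_db)) +
  geom_col() +
  scale_fill_gradient(low = "blue", high = "red") +
  labs(x = "Time (s)", y = "Loudness (dB)", fill = "Loudness")
```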

I really like this view, and was honestly tempted to stop right there! But I knew it still wasn’t good enough, even if it was quite interesting and beautiful (IMO at least).

#Edit.3
At this point, I went out for coffee with my frequent collaborator Jorge, and we had a great chat. He suggested I try to make my viz more like the experience of listening to music. With this idea rattling around in my head, I eventually came to my next iteration, adding animation:

[Image: edit 3]

#Edit.4
At this point, I decided to get a bit more mathematical in my thinking. I started wondering if I could quantify characteristics of the various songs in any interesting ways. After much experimentation, I eventually came to a metric I called Smoothness, based on the standard deviation of loudness over time and across frequencies.

To be a bit more technical, I scaled the inverse of each song’s standard deviation so that the range was roughly 0.1 to 1.0.
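
In rough R terms, the calculation looks something like the sketch below (the column names and the exact rescaling are placeholders for the idea, not my actual code):

```r
# A sketch of the Smoothness metric as described above: the inverse of each
# song's standard deviation of loudness, rescaled to roughly 0.1-1.0.
# `songs` (a long table with song and loudness_db columns) is assumed.
library(dplyr)

smoothness <- songs %>%
  group_by(song) %>%
  summarise(sd_loud = sd(loudness_db)) %>%
  mutate(inv_sd     = 1 / sd_loud,
         # the smoothest song lands near 1.0, the spikiest near 0.1
         smoothness = 0.1 + 0.9 * (inv_sd - min(inv_sd)) / (max(inv_sd) - min(inv_sd)))
```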

Using this as a metric, I then dynamically sorted a simplified version of #edit.2:

[Image: edit 4]

I also added the option to sort by various other metrics, but this was the most interesting one.

#Edit.5
This was one of my big breakthroughs. The problem I kept running into was that there were too many variables to plot simultaneously. In #edit.4 I realized that plotting loudness by frequency was more interesting than plotting it by time. My next jump was to decide to plot each moment in time on the “loudness by frequency” plot, instead of choosing an arbitrary aggregation. I did this by changing it into a scatter plot, instead of a bar or area chart:

[Image: edit 5]
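
A quick ggplot2 sketch of the same idea, again assuming the long-format table from the first sketch:

```r
# The #edit.5 idea in ggplot2: no aggregation over time, just one point per
# (moment, frequency) pair, with loudness on the x-axis and as color.
library(ggplot2)

ggplot(df, aes(x = loudness_db, y = freq_khz, colour = loudness_db)) +
  geom_point(alpha = 0.3, size = 0.5) +
  scale_colour_gradient(low = "blue", high = "red") +
  labs(x = "Loudness (dB)", y = "Frequency (kHz)", colour = "Loudness")
```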

I was immediately struck by the beauty of this, as well as the added information given by the spread of values over time.

#Edit.6
At this point, I started having some trouble thinking of new ideas. I discussed the whole process with my dad after I was done, and he said this is normal, and in fact the whole point. We often get so attached to our initial idea that we become blind to other interesting ones. Part of the magic of the ten edits is that you have to repeatedly push yourself outside of the box.

I started thinking about other ways to compare these songs on a more equitable basis. I noticed that some songs had high signal across the whole range of frequencies, while others had a more focused frequency pattern. What if we looked at the running sum of loudness, for each moment in time, for each song?

[Image: edit 6]

Some of the songs are just plain louder than others, so I also broke this into deciles, showing the relative proportion of signal by frequency range, for each song.
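
In rough R terms, the running sum and decile banding could look something like this (column names and the dB shift are placeholders for the idea, not my actual code):

```r
# One possible reading of #edit.6: for each song and moment in time, take the
# running sum of loudness across frequencies, normalise it so every song tops
# out at 1, and band the frequencies into deciles.
library(dplyr)

running <- songs %>%                          # song, time_s, freq_khz, loudness_db
  group_by(song, time_s) %>%
  arrange(freq_khz, .by_group = TRUE) %>%
  mutate(run_sum  = cumsum(loudness_db - min(loudness_db)),  # shift so the sum only grows
         run_prop = run_sum / max(run_sum),                  # 0-1 share of the song's signal
         decile   = ntile(freq_khz, 10))                     # frequency-range decile bands
```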

#Edit.7
At this point, I took a moment to reflect, and looked back over what I’d done so far. I then came up with a new version, taking aspects from many of the previous edits:

[Image: edit 7]

Animation? Check. Scatter plots instead of bar charts? Check. Heatmap with bars concept? Check. Pretty cool stuff.

And I promptly hit a brick wall of no more new ideas…

And I realized that animation doesn’t work on Tableau Public…

Again, talking with my dad afterwards, this is normal and all part of the process. But that didn’t make it any more fun! I was definitely starting to get disillusioned…

#Edit.8
At this point, I was a few days away from heading on an 8-day vacation (yay!), but wanted to get a few more iterations done before I left. As I was fresh out of ideas for incremental improvement, I started getting more experimental.

I started to wonder if I could devise a way to identify rhythms and other time-cycle patterns in the songs. I came up with a way of using Table Calculations to do loudness comparisons via a lookback function. To save time, I parameterized it based on arbitrary (and sufficiently random) variables in the data. Enter the “k lookback”:

[Image: edit 8]

Basically, the color shows the difference from the moment in time “k back,” with k increasing for each square, moving left to right, top to bottom. A simple but elegant way to start to detect patterns over time.

For those who are curious, a quick word about how this works. I basically made a variable that returned a k value via the MOD function, and used that to control a lookback function. I also made column and row variables (a la this post) based on k:

[Image: “k” calculated field]

[Image: “dif k back” calculated field]

[Image: “col n row” calculated fields]
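
For anyone who’d rather read the logic as code than as Tableau screenshots, here’s the same idea sketched in R (the k range, grid width, and column names below are placeholders, not the real workbook):

```r
# The "k lookback" logic sketched in R instead of Tableau table calcs.
library(dplyr)

loudness <- song_a$loudness_db   # assumed: one loudness value per moment in time
k_max    <- 16                   # assumed number of small-multiple squares
n_cols   <- 4                    # assumed grid width

lookback <- expand.grid(t = seq_along(loudness), k = 1:k_max) %>%
  mutate(diff_k_back = ifelse(t - k >= 1,
                              loudness[t] - loudness[pmax(t - k, 1)],  # difference from k moments back
                              NA_real_),
         col = (k - 1) %% n_cols,    # square's column, left to right
         row = (k - 1) %/% n_cols)   # square's row, top to bottom
```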

#Edit.9
My vacation was fast approaching, and I was very much wondering whether I’d have anything worthwhile to turn in for the competition at all… Somewhat out of desperation, I turned to my old friend animation to increase the manic level of #edit.8:

[Image: edit 9]

I’m not sure that this really tells me anything, but it fit my mood at the time. Fun, manic, nonsensical! And with that, I left to travel the world for a few days with my wonderful wife.

#Edit.10
If you’re still reading, we made it! We’re finally here, at the elusive 10th edit. While travelling with my wife, we talked about this viz quite a few times. She promptly told me that I was being too esoteric, and needed a better story. This was certainly true, and had been in the back of my mind for a while. After talking about the data and the competition, and what we thought was interesting about it, we came up with a different frame for the data.

Part of the problem was that I was looking at songs no one knows, and hoping the data would be beautiful enough to carry the viz. After much discussion, we decided that it would be more interesting to compare songs from musical rivalries. While this was a bit of a departure, it would not have been possible without #edits.1-9.

I added in data on lyrics, and eventually came up with the following:

[Image: edit 10]

The spectral analysis part of the viz is much more stripped down than what I had been doing. I think it has an elegant simplicity that was made possible by my iterative distillation of the content.

I was quite happy with this end result, and getting there helped me learn about Tableau, R, and the creative process. You can see the full, interactive viz here. My final R code lives here.

Jeff
