
HTML5 Sound Visualization

Recently BLITZ was asked about modifying an audio-driven Flash site we completed a few years ago. Since then, the technology driving rich consumer experiences on the web has shifted from a plugin-based platform dominated by Adobe Flash to a reliance on the native capabilities of the browser — a shift made all the more significant by the rise of mobile browsing. The old project used Flash's ability to extract sound-spectrum data from a playing song and displayed an on-the-fly visualization of that data, similar to a visual EQ. This ability does not currently exist in most native JavaScript engines, though an API for it has been proposed, and iOS in particular offers no way to read this data from a sound file. That got us wondering whether we could get the same effect running in a non-Flash mobile browser. After spending a couple of hours thinking through the possibilities, we settled on a solution: preprocess the audio file, extract the sound-spectrum data ahead of time, and process that data in the browser.

The gist of the data-extraction process looks like this:

An AIR application loads the MP3 file and lets the song play. Every 100 milliseconds during playback, it grabs the sound spectrum and selects parts of it to keep. Originally we planned on using all 256 data points from the right channel, but a single 4-minute MP3 was spitting out an 11 MB JSON file. Obviously that was a tad excessive… After some tweaking we settled on grabbing 50 data points (every 5th from 0 to 250), which decreased the file size from 11 MB down to 2.5 MB. Trimming the values down to only 3 decimal places (thank you, Nick Vincent, for the idea) got it down to 800 KB, a size we were happy with. At each 100-millisecond marker we grab the data and inject it into an object, using the playback time in milliseconds (rounded to the nearest 100 ms) as the key. Once we have the object populated, we serialize it to JSON and save it as a text file.
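The extraction itself happens in ActionScript inside the AIR app, but the data-shaping step above can be sketched in JavaScript. This is an illustrative sketch, not the actual source — function and variable names are assumptions:

```javascript
// Hypothetical sketch of one sampling step: round the playhead time to the
// nearest 100 ms for the object key, keep every 5th of the 256 right-channel
// spectrum points (50 values), and trim each value to 3 decimal places.
function buildSpectrumFrame(spectrum, timeMs) {
  const key = Math.round(timeMs / 100) * 100;
  const points = [];
  for (let i = 0; i < 250; i += 5) {
    points.push(Number(spectrum[i].toFixed(3)));
  }
  return { key, points };
}

// Example with a fake 256-point spectrum of noisy values:
const fakeSpectrum = Array.from({ length: 256 }, (_, i) => Math.sin(i) * 0.123456);
const frame = buildSpectrumFrame(fakeSpectrum, 4730); // playhead at 4.73 s
// frame.key is 4700; frame.points holds 50 trimmed values
```

Run once per 100 ms tick, each returned frame gets merged into one big object (`{ 0: [...], 100: [...], ... }`) that is then serialized to JSON.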

Once we had the data, it was just a matter of implementation. Drop 50 dots into the DOM, each corresponding to one of the 50 data points of the sound spectrum, and move them around as the song plays. Implementation can vary, so we didn't spend any real time cleaning things up for reuse; the purpose was more of a proof of concept. If we were going to get serious about it, we would write a wrapper JS file that you pass an audio element and the path to the JSON file; then you'd hook up your events and let it handle the rest. If anyone has interest in taking this task on, please let us know. We'd actually love to build it out, time permitting.
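A minimal sketch of that playback side, assuming a JSON object keyed by rounded milliseconds as described above (the function names and the translateY scaling are illustrative, not from the actual demo source):

```javascript
// Round the <audio> playhead to the nearest 100 ms and look up the
// precomputed frame; returns null if no frame exists for that key.
function frameForTime(spectrumData, currentTimeSeconds) {
  const key = Math.round((currentTimeSeconds * 1000) / 100) * 100;
  return spectrumData[key] || null;
}

// Wire an <audio> element to 50 DOM dots (one per data point):
function attachVisualizer(audio, spectrumData, dots) {
  audio.addEventListener('timeupdate', () => {
    const frame = frameForTime(spectrumData, audio.currentTime);
    if (!frame) return;
    frame.forEach((value, i) => {
      // Move each dot upward in proportion to its spectrum value.
      dots[i].style.transform = 'translateY(' + (-value * 100) + 'px)';
    });
  });
}
```

Note that `timeupdate` fires at a coarser, browser-dependent rate than 100 ms, so a production version would likely drive the lookup from a `requestAnimationFrame` loop instead.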

You can view the demo here:

http://dino.blitzstaging.com/demos/soundspectrum/

Please note, the demo only works in Safari, Chrome, and iOS (that was the challenge). You could easily rig this to work in Firefox; it just wasn't a priority.

And you can get the source here:

https://github.com/dinopetrone/SoundSpectrum

The source has both the AIR application and a www folder containing the demo. To get the AIR application running you have to open it in Flash and build it; I didn't create an actual .air file. You will also have to run Sass to compile the CSS (sorry, it's just how we do things here at BLITZ).

Coming from the Flash world, I was very much sick of sound visualization. So please don't take this and just create another sound visualizer… that's lame. If you have any practical use, please let me know. Also, one disclaimer: all of this needs a good deal of polish. Again, it was just a POC spike.

If you’ve read all the way to the end of this article and are interested in doing work like this, we’re always looking for people to join the technology team. Check out our Careers Page for a list of open positions.
