HW 6 - Audifly, a Music Visualization Web App
In this assignment you'll be extending Audifly, our very own music visualizer for both MIDI- and MP3-formatted music, though you'll only be writing an MP3 renderer. To start, download the Audifly NetBeans project, import it into NetBeans, and run it (i.e. the index.html page) from NetBeans so that it starts up in Chrome; you should find that it already works. It contains both MIDI and MP3 files, and visualization, while different for each format, works for both. If you examine the index.html page you'll find that it makes use of a whole mess of JavaScript libraries for doing a bunch of different things like loading MIDI files, rendering progress bars, and more. Don't worry about any of those libraries. The only ones you need to worry about are those in the ./js/audifly and ./js/audifly_songscapes directories.
Examine the contents of the ./js/audifly directory and you should find the following JavaScript files, which contain the functions and objects used to run the Audifly app. Note that you should not have to change anything in these files to complete this homework:
- Audifly - this object provides access to all app objects and handles responses to Web page interactions.
- AudiflyMIDIPlayer - this object does all MIDI file loading, music playing, and MIDI note event relaying to the custom renderer.
- AudiflyMp3Player - this object does all MP3 file loading, music playing, and timed frequency and time domain information retrieval to be forwarded to the custom renderer.
- AudiflySongscape - this object simply stores the data for a single songscape, which is a song (MIDI or MP3) with an attached custom renderer. Note that in Audifly, each song can have its own custom renderer if we like, or alternatively, some can share renderers.
- AudiflySongscapeLoader - this object does the work of loading all the songscapes. Note that this is done by reading in all of the data from the AudiflySongscapeList.json file, which is found in the ./js/audifly_songscapes directory.
Now examine the contents of the ./js/audifly_songscapes directory and you should find the aforementioned .json file. Open it and you should find that it lists the songscapes the application is to load and make available. At the moment, it lists a number of Chopin pieces for piano as well as a couple of MP3 files. All of these songscapes currently reference the same renderer, RichardMcKenna_Renderer, which is also stored in the ./js/audifly_songscapes directory. The only code you'll need to write will be in your own renderer. So, to start, add a new JavaScript file named after yourself using the same format as mine (i.e. FirstLast_Renderer). Then, add an MP3 songscape that is an MP3 version of your Black MIDI from HW 5. First, try out the renderer I've provided, but note that when you submit your work, songs should be assigned your own custom renderer as described below.
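For reference, a new songscape entry in AudiflySongscapeList.json might look something like the sketch below. The field names and paths here are illustrative guesses, not Audifly's actual keys; match whatever structure the existing Chopin and MP3 entries use:

    {
        "name":         "My Black MIDI (MP3 Version)",
        "type":         "mp3",
        "songFile":     "./music/FirstLast_BlackMIDI.mp3",
        "rendererFile": "./js/audifly_songscapes/FirstLast_Renderer.js"
    }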
An MP3 Visualizer
As we discussed in lecture, Web Audio has a node type, AnalyserNode, which can give us frequency and time domain information about a song being played using that API. This information is being fed to the provided renderer, so all of the Web Audio plumbing has already been set up for you. All you need to do now is use the data to render something interesting. Again, in this part of the assignment you should be rendering for an MP3 version of your Black MIDI from HW 5. So what should you render? Well, at the moment the renderer just renders frequency columns, and doesn't even make use of the time domain information. In this part I want everyone to be creative in constructing their own renderer so that it looks original. The requirements are as follows (see the sketches after this list):
- It must use frequency data for rendering - with each step, you are given the frequency bin information; this must be used to update what gets rendered.
- It must use time domain data for rendering - with each step, you are also given the time domain data; this too must be used to update what gets rendered.
- Rendered Object Memory - the MP3 renderer I've provided has no memory. When a step is completed, the data is thrown out and the next frame uses all newly acquired data. This is different from the MIDI renderer, which stores an array of notes and updates them and renders them each frame. In your own MP3 renderer, maintain objects that will live for many frames and will be updated according to the musical input. How you choose to update and render them is up to you, but it should look good and be interesting.
- Rotated, Translated, and Scaled Objects - with each step, rendered objects should be rotated, translated (i.e. moved), and scaled in some way as a result of the data gathered. Note that the easiest way to do this is by using the rendering context's rotate(), translate(), and scale() methods.
- Background Rendering - you may do what you like with the background as long as it works well with your renderer.
- Different music, different results - again, note that your renderer must be responding to the music. So, hooking it up to different music must produce different results that reflect the changes in the music.
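To make these requirements concrete, here is roughly where the data handed to your renderer comes from. AudiflyMp3Player already does the equivalent of this for you, so treat it purely as a sketch of the standard Web Audio AnalyserNode calls and what the resulting arrays mean; the variable names are mine, not Audifly's:

    // 'analyser' is a Web Audio AnalyserNode wired into the playback graph.
    var frequencyData = new Uint8Array(analyser.frequencyBinCount);
    var timeDomainData = new Uint8Array(analyser.fftSize);

    function gatherStepData() {
        // Each bin is 0-255: the current loudness of one slice of the spectrum.
        analyser.getByteFrequencyData(frequencyData);

        // Each sample is 0-255, centered at 128: the raw waveform right now.
        analyser.getByteTimeDomainData(timeDomainData);
    }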
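And here is one way the object memory and transform requirements can fit together: keep an array of objects alive across frames, update each using the current frequency and time domain data, and draw each with translate(), rotate(), and scale(). This is a minimal sketch under assumed names (the step() callback and its parameters are my invention); adapt it to the actual callback signature used by RichardMcKenna_Renderer:

    // Particles persist across frames - this is the renderer's "memory".
    var particles = [];
    for (var i = 0; i < 32; i++) {
        particles.push({ x: Math.random() * 800, y: Math.random() * 600,
                         angle: 0, size: 10 });
    }

    // Hypothetical per-frame callback; match the real renderer's signature.
    function step(ctx, frequencyData, timeDomainData) {
        ctx.clearRect(0, 0, ctx.canvas.width, ctx.canvas.height);
        for (var i = 0; i < particles.length; i++) {
            var p = particles[i];

            // Let the music drive the updates: a frequency bin sets the scale,
            // a time domain sample nudges the rotation and drift.
            var freq = frequencyData[i % frequencyData.length] / 255;           // 0..1
            var wave = (timeDomainData[i % timeDomainData.length] - 128) / 128; // -1..1
            p.angle += wave * 0.2;
            p.x = (p.x + freq * 5) % ctx.canvas.width;
            p.size = 5 + freq * 30;

            // Rotated, translated, and scaled rendering via the context.
            ctx.save();
            ctx.translate(p.x, p.y);
            ctx.rotate(p.angle);
            ctx.scale(1 + freq, 1 + freq);
            ctx.fillRect(-p.size / 2, -p.size / 2, p.size, p.size);
            ctx.restore();
        }
    }

Different songs will push different values through frequencyData and timeDomainData, so a renderer built this way naturally produces different results for different music, which is exactly what the last requirement asks for.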
Again, try to be creative with your MP3 visualizer. At the end of the semester, I'll be posting all of them online for everyone to enjoy.