"Weathertone"

Audiovisual Weather Display

by Andrew Wiedner

The goal of “Weathertone” was to create a weather visualiser that generated animation, sound, and music based on weather data retrieved from an API. Different aspects of the weather, such as temperature, precipitation, and cloud cover, as well as the current time of day, would influence the audio and visuals being generated.

[Figure: weathertone.png]

Practical weather apps display very clear-cut readouts of numbers, color gradients, 7-day forecasts, and icons for each kind of weather: rainy, cloudy, sunny, thunderstorms. There might be a better way to convey this data more concisely, but that is NOT the goal of Weathertone. Intended as a supplement rather than a replacement for your daily forecast, this interface is a passive weather display meant to entertain you with at-a-glance information about the current conditions, by translating that data into abstract audiovisual output.

The background for this experiment came mainly from an interest in music and from exploring the OpenWeather API. I didn’t know exactly where I wanted to take the project, so I had to start somewhere, and this seemed like a reasonable launch pad for ideas. Early on, I spent a fair amount of time ideating what kind of audiovisual, representational “display” I might try to make, and found a lot of interesting, not-always-successful, unconventional weather displays on the internet. One that captured my interest was a cube that simulated the current (or possibly next day's?) weather phenomena in a wordless, glanceable way that quickly summed up the information, though without nuance.

Timeline difficulties derailed the original plan of building this project in TouchDesigner; instead, the entire program was created with HTML5 Canvas and JavaScript. Starting with the audio, I reused code from a previous project to dynamically generate the audio buffers needed to create musical notes. For simplicity I generate a single square wave, but I also included parameters to adjust the velocity and decay of notes for more dynamic breadth of sound.

Once I could generate audio buffers, I needed a system to schedule when the notes would play. I settled on a standard tempo of 120 BPM and created a variable tracking the last measure scheduled, plus an animation loop that increments the current measure, current beat, and so on. Each frame, the loop checks what the next measure is and, if it has not yet been scheduled, queues a randomly generated pattern of notes for it. These notes are selected at random from an array representing all the notes in the current key.

Once I could schedule and generate a single melody, I added two more melody schedulers: one an octave higher, and one an octave lower with fewer, more sustained notes. Lastly, I created audio buffer generators that use random noise to make percussion sounds, along with some predefined drum patterns for the scheduler, one of which is selected at random each time a new measure is scheduled. The result is a somewhat haphazard but interesting 8-bit texture.
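The square-wave generation with velocity and decay parameters could look something like the sketch below. The original code isn't shown in the post, so all names here are illustrative; the idea is just a square wave multiplied by an exponential decay envelope, written into a raw sample array of the kind you would copy into a Web Audio `AudioBuffer`.

```javascript
// Sketch (names assumed): one note's samples as a square wave with a
// velocity-scaled, exponentially decaying envelope.
function makeSquareNoteBuffer(freq, velocity, decay, sampleRate = 44100, seconds = 1) {
  const n = Math.floor(sampleRate * seconds);
  const samples = new Float32Array(n);
  for (let i = 0; i < n; i++) {
    const t = i / sampleRate;
    const square = Math.sign(Math.sin(2 * Math.PI * freq * t)); // square wave at freq Hz
    const env = velocity * Math.exp(-decay * t);                // decay envelope
    samples[i] = square * env;
  }
  return samples;
}
```

A higher `decay` value gives a short, plucky note; a low value sustains, which is how a single tone generator can still yield "dynamic breadth of sound."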
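The measure-tracking scheduler described above can be sketched as a pure function of elapsed time plus a "last measure scheduled" guard. This is my reconstruction, not the actual code; at 120 BPM each beat is 0.5 s, and assuming 4/4 time a measure is 2 s.

```javascript
// Assumed constants: 120 BPM as stated in the post, 4/4 time assumed.
const BPM = 120;
const BEATS_PER_MEASURE = 4;
const SECONDS_PER_BEAT = 60 / BPM;

// Given elapsed time in seconds, compute the current measure and beat.
function playhead(elapsedSeconds) {
  const totalBeats = elapsedSeconds / SECONDS_PER_BEAT;
  return {
    measure: Math.floor(totalBeats / BEATS_PER_MEASURE),
    beat: Math.floor(totalBeats) % BEATS_PER_MEASURE,
  };
}

// Called every animation frame: schedule the next measure exactly once.
let lastMeasureScheduled = -1;
function maybeScheduleNext(elapsedSeconds, scheduleMeasure) {
  const next = playhead(elapsedSeconds).measure + 1;
  if (next > lastMeasureScheduled) {
    scheduleMeasure(next); // e.g. queue a random pattern of notes for that measure
    lastMeasureScheduled = next;
  }
}
```

Keeping the guard variable outside the loop is what prevents the same measure from being queued on every frame of the animation loop.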
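Picking random notes from the current key, and shifting a whole melody line up or down an octave, is simple to sketch. The scale and function names below are hypothetical (the post doesn't say which key is used); the octave trick is just doubling or halving frequencies.

```javascript
// Hypothetical key array: C major frequencies starting at middle C.
const C_MAJOR = [261.63, 293.66, 329.63, 349.23, 392.00, 440.00, 493.88];

// Build one measure's worth of notes drawn at random from the key.
// octaveShift of +1 doubles every frequency, -1 halves it, giving the
// higher and lower melody layers described above.
function randomMeasure(scale, notesPerMeasure, octaveShift = 0) {
  const factor = Math.pow(2, octaveShift);
  const notes = [];
  for (let i = 0; i < notesPerMeasure; i++) {
    const pick = scale[Math.floor(Math.random() * scale.length)];
    notes.push(pick * factor);
  }
  return notes;
}
```

Because every layer draws from the same key array, the three melodies stay consonant with each other no matter what the random picks are.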
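The noise percussion and drum-pattern selection might look like this sketch (the patterns and names are invented for illustration): a short burst of white noise with a fast decay reads as a drum hit, and a per-measure pattern marks which beats get a hit.

```javascript
// Sketch: white noise with a steep decay envelope as a percussion hit.
function makeNoiseHit(sampleRate = 44100, seconds = 0.1, decay = 40) {
  const n = Math.floor(sampleRate * seconds);
  const samples = new Float32Array(n);
  for (let i = 0; i < n; i++) {
    samples[i] = (Math.random() * 2 - 1) * Math.exp(-decay * (i / sampleRate));
  }
  return samples;
}

// Predefined patterns (illustrative): 1 = hit on that beat, 0 = rest.
const DRUM_PATTERNS = [
  [1, 0, 1, 0],
  [1, 0, 0, 1],
  [1, 1, 0, 1],
];

// One pattern is chosen at random each time a new measure is scheduled.
function pickPattern() {
  return DRUM_PATTERNS[Math.floor(Math.random() * DRUM_PATTERNS.length)];
}
```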

Overall, I saw this as an effective first foray, a square one leading into further iterations. The most valuable insight from the final demo of this stage was the raindrop-like sound of the notes and the accidental juxtaposition of rain and music, a juxtaposition I decided to explore further in Side Effect and Point Of No Return…