~2015
Ambient | Web Audio | WebGL

I don't remember how I came to see Salt - a photography project by Murray Fredericks - but I liked it, and it was the inspiration for what would become Horizon.

I started playing with ideas in bits of spare time in October 2014, and was 'done' by the following March.

But…

There were parts I wasn't particularly happy with. It didn't quite do what I wanted; it wasn't what was in my head. (Also, the link in that tweet no longer goes anywhere - my hosting has changed since then - but that version is still available.)

I wanted more interesting backgrounds. Though happy with the look of the generated output, I thought it would look better if the effect was animated over time.

As for the audio, well, I didn't really like the manic piano notes driven by the flock - not as much as I thought I would, anyway. Something subtler was needed. Overall, I wanted Horizon to be something calm and unobtrusive.

Generating seascapes

The backgrounds are generated by taking some 1-pixel vertical slices from a source image and interpolating between them horizontally to fill the space. Due to the 'horizon-y' nature of the source images, a lot of subtle variation can be created in the output by moving the slices along the x-axis over time.

Debug output where lines show the slice sampling positions (black) and the target positions they're moving toward (red).
One of the background source images - the images are all the same size and have the horizon in the same position.
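
For the curious, here's a rough sketch of that slice-and-interpolate step on plain ImageData - not the actual Haxe/OpenFL code, and the function name and slice handling are invented for the example:

```ts
// Sketch of the slice-and-interpolate idea: sample a few 1-pixel-wide
// columns from the source and lerp between them to fill the output.
function generateBackground(
  src: ImageData,
  out: ImageData,
  sliceXs: number[] // x positions of the sampled columns, in output order
): void {
  const { width, height } = out;
  const segments = sliceXs.length - 1;

  for (let x = 0; x < width; x++) {
    // Which pair of slices does this output column fall between?
    const f = (x / (width - 1)) * segments;
    const i = Math.min(Math.floor(f), segments - 1);
    const t = f - i;
    const xa = sliceXs[i];
    const xb = sliceXs[i + 1];

    for (let y = 0; y < height; y++) {
      const a = (y * src.width + xa) * 4;
      const b = (y * src.width + xb) * 4;
      const o = (y * width + x) * 4;
      for (let c = 0; c < 4; c++) {
        // Linear interpolation between the two sampled columns.
        out.data[o + c] = src.data[a + c] * (1 - t) + src.data[b + c] * t;
      }
    }
  }
}
```

Moving each entry of sliceXs a little toward a new target before regenerating is what produces the slow drift.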

The first version just used pixel read/write operations on bitmap data to generate the output, and that is not a particularly speedy operation. A new image was generated in a web-worker and streamed across to the main context while the current image was slowly faded in. It was a subtle effect, and I did like it, but knew it could be improved.
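
The hand-off itself is just the usual transferable dance; a minimal sketch, with the file name, message shape, and the generate()/beginFadeIn() helpers all invented for the example:

```ts
// Hypothetical helpers: generate() builds the ImageData as in the sketch
// above, beginFadeIn() draws it on the main thread with increasing alpha.
declare function generate(sliceXs: number[]): ImageData;
declare function beginFadeIn(image: ImageData): void;

// --- main thread ---
const worker = new Worker("background-worker.js");

worker.onmessage = (e: MessageEvent) => {
  const { width, height, pixels } = e.data;
  // Rebuild an ImageData from the transferred buffer and start fading it in.
  const image = new ImageData(new Uint8ClampedArray(pixels), width, height);
  beginFadeIn(image);
};

worker.postMessage({ sliceXs: [0, 120, 340, 512, 800] });

// --- background-worker.js ---
self.onmessage = (e: MessageEvent) => {
  const out = generate(e.data.sliceXs);
  // Transfer the pixel buffer instead of copying it across.
  (self as unknown as Worker).postMessage(
    { width: out.width, height: out.height, pixels: out.data.buffer },
    [out.data.buffer]
  );
};
```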

However, I didn't get around to looking at it again for a few months. I hadn't been thinking about Horizon at all, but one evening I happened to open Processing... and an hour or so later had the basis of a new version.

While playing around with some custom filters, I ended up recreating the effect in a GLSL fragment shader. Rendering on the GPU is an obvious solution really: it handles texture sampling and interpolation with ease, and it means the properties of the effect can easily be animated in real time.

With a few minor changes, the same shader code would work in a WebGL environment, so I started a new version of the project that would use pixijs as the rendering backend. Much like Processing, pixi lets you use a GLSL fragment shader to filter its display objects (when targeting WebGL), so it was a reasonably easy process to convert the shader and re-target the rest of the drawing code to pixijs rather than OpenFL.
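
This isn't Horizon's actual shader, but a sketch of the same idea as a pixi filter gives the flavour (a v5-style Filter constructor, with the uniform name and slice count invented):

```ts
import * as PIXI from "pixi.js";

// Fragment shader sketch: sample a handful of 1-pixel-wide columns from
// the source and mix between the two that bracket the current output
// column. Slice positions (0..1) come in as a uniform array and can be
// nudged from JS every frame.
const frag = `
precision mediump float;

varying vec2 vTextureCoord;
uniform sampler2D uSampler;

const int SLICES = 8;
uniform float uSliceX[SLICES];

void main(void) {
  float f = vTextureCoord.x * float(SLICES - 1);
  int i = int(floor(f));
  float t = fract(f);

  // GLSL ES 1.0 fragment shaders can't index a uniform array with a
  // computed value, so walk the array with a constant-bound loop.
  float xa = 0.0;
  float xb = 0.0;
  for (int k = 0; k < SLICES; k++) {
    if (k == i) xa = uSliceX[k];
    if (k == i + 1) xb = uSliceX[k];
  }

  vec4 a = texture2D(uSampler, vec2(xa, vTextureCoord.y));
  vec4 b = texture2D(uSampler, vec2(xb, vTextureCoord.y));
  gl_FragColor = mix(a, b, t);
}
`;

const slices = new Float32Array([0.0, 0.12, 0.3, 0.38, 0.55, 0.7, 0.86, 1.0]);
const filter = new PIXI.Filter(undefined, frag, { uSliceX: slices });

const sprite = PIXI.Sprite.from("seascape.jpg"); // hypothetical source image
sprite.filters = [filter];
```

The texture coordinates need a bit more care in a real filter - pixi's filter frame isn't exactly the sprite's 0..1 space - but that's the core of it: nudge the slice positions every frame and the whole background slowly shifts.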

Since I was changing the codebase around, I switched the audio to use Tones, a little Web Audio library I've been working on/playing with.

Sound

I wanted the audio in the first version to be driven by the movements of the flock - where notes were triggered by speed, separation, and direction changes. I wouldn't say the result was awful; sections with structure and rhythm would emerge over time, but in general I felt it was a bit too chaotic and didn't really complement what I wanted to be a much calmer scene.

So, when updating everything else, I stripped all of that out and replaced it with a much more relaxed sea soundscape. The end result is built by continuously cross-fading between a set of samples. Sounds are selected at random from a pool of 18, and each incoming sound's fade-in time is matched to the fade-out time of the outgoing one. It's simple, but surprisingly effective.
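
A minimal sketch of that loop with plain Web Audio calls (Tones wraps this kind of scheduling; the fade lengths and buffer pool here are invented):

```ts
const ctx = new AudioContext();

// Crossfade forever between randomly chosen buffers from a pool.
// The incoming sound's fade-in duration matches the outgoing sound's
// fade-out, so the two always overlap symmetrically.
// Assumes every sample is comfortably longer than the two fades combined.
function playNext(buffers: AudioBuffer[], fadeIn: number): void {
  const buffer = buffers[Math.floor(Math.random() * buffers.length)];
  const fadeOut = 4 + Math.random() * 6;           // this sound's own fade-out (s)
  const hold = buffer.duration - fadeIn - fadeOut; // time spent at full volume

  const source = ctx.createBufferSource();
  source.buffer = buffer;
  const gain = ctx.createGain();
  source.connect(gain).connect(ctx.destination);

  const now = ctx.currentTime;
  gain.gain.setValueAtTime(0, now);
  gain.gain.linearRampToValueAtTime(1, now + fadeIn);
  gain.gain.setValueAtTime(1, now + fadeIn + hold);
  gain.gain.linearRampToValueAtTime(0, now + buffer.duration);
  source.start(now);

  // Queue the next sound so its fade-in overlaps this one's fade-out.
  setTimeout(() => playNext(buffers, fadeOut), (fadeIn + hold) * 1000);
}

// e.g. playNext(pool, 5);
```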

I used Reaper to create an audio-sprite - all of the sounds in one file - and region data for the samples was exported as a CSV file. The CSV data is parsed into a typed config object at compile time for use in setting up the sample playback (note: Haxe macros are great).
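
In the project that CSV-to-config step happens at compile time, but a runtime equivalent is only a few lines - the column order here is an assumption about Reaper's region export:

```ts
// A typed region of the audio-sprite: where a sample starts and ends
// within the single combined file.
interface Region {
  name: string;
  start: number; // seconds
  end: number;   // seconds
}

// Runtime sketch of the CSV -> typed config step (done with a macro at
// compile time in the real project). Assumes "id,name,start,end,length"
// columns with a header row.
function parseRegions(csv: string): Region[] {
  return csv
    .trim()
    .split("\n")
    .slice(1) // skip the header row
    .map((line) => {
      const [, name, start, end] = line.split(",");
      return { name, start: parseFloat(start), end: parseFloat(end) };
    });
}
```

Playing one of the 18 sounds then comes down to starting a buffer source at the region's offset, e.g. source.start(when, region.start, region.end - region.start).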

Source material

The photographs that form the basis of the generated backgrounds were taken looking out from the SE coastline of Dumfries and Galloway.

The audio-sprite was assembled using extracts of recordings made on the Brough of Birsay by pike67.




As a little bonus for reaching the end, here's a glitchy mistake that cropped up during some early work on the background effect...

It's quite wrong, but I quite like it.