Feb. 29, 2016

This is the pattern I use for mastering algorithmic electronic music that I write.

Algorave mastering heuristic block diagram

First, each audio source or channel Sn is given a random (sometimes hand-tuned) delay of 0 to 20 milliseconds on either the left or right channel in RPn. My friend Crispin put me on to this technique, which gives each audio source its own psycho-acoustic space in the mix, probably due to the Haas effect.
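In sample terms the idea looks something like this - a minimal Python sketch, not the actual patch; the function name and parameters are hypothetical:

```python
import random

def haas_delay(left, right, sample_rate=44100, max_delay_ms=20):
    """Delay one randomly chosen channel by 0-20 ms (Haas effect sketch).

    left/right are lists of float samples. The other channel is padded
    at the end so both stay the same length.
    """
    delay_ms = random.uniform(0, max_delay_ms)
    delay_samples = int(sample_rate * delay_ms / 1000)
    pad = [0.0] * delay_samples
    if random.random() < 0.5:
        left = pad + left          # delay the left channel
        right = right + pad        # keep lengths equal
    else:
        right = pad + right        # delay the right channel
        left = left + pad
    return left, right
```

Because the delay is under ~20 ms the ear fuses the two channels into one source rather than hearing an echo, which is what creates the stereo placement.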

Next, the resulting sources are summed, separately for the left and right channels. At this point you can also run the combined signal through a high-pass filter set at 5Hz to remove any DC offset present in the signal.
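A common way to get that very low high-pass is a one-pole/one-zero DC blocker - this is a generic sketch of the technique, not the exact filter from the diagram:

```python
import math

def dc_block(samples, cutoff_hz=5.0, sample_rate=44100):
    """One-pole/one-zero DC blocker: y[n] = x[n] - x[n-1] + r * y[n-1].

    A pole radius just under 1 gives a cutoff of a few Hz, which passes
    all audible content but lets any constant DC offset decay to zero.
    """
    r = 1.0 - 2.0 * math.pi * cutoff_hz / sample_rate
    out, x1, y1 = [], 0.0, 0.0
    for x in samples:
        y = x - x1 + r * y1
        out.append(y)
        x1, y1 = x, y
    return out
```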

Then the combined signal is passed through a reverb with a mostly "dry" mix - a maximum of 30% wet as a general rule of thumb. Adjust the reverb to taste. I'll usually make it a bit long and airy, but subtle.
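The dry/wet blend itself is just a weighted sum - a hypothetical helper, where the wet signal would come from whatever reverb you are using:

```python
def mix_reverb(dry, wet, wet_amount=0.3):
    """Blend the reverb return with the dry signal, capped at 30% wet."""
    wet_amount = min(wet_amount, 0.3)
    return [(1.0 - wet_amount) * d + wet_amount * w
            for d, w in zip(dry, wet)]
```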

Next we win the loudness wars. This is dance music and we want people to dance, so it has to sound powerful. To achieve this we do something horrible: measure the RMS - the power of the sound - and then amplify the signal until its power is normalised. Here is the "auto compress" Pure Data patch I use for doing this:

Pure Data auto-compress patch

The env~ object here is a simple envelope follower and the source code is here. The dbtorms function source code is here. The possible magnitude of the power correction is limited by the clip function which does what it says on the box, and the resulting multiplier is smoothed with a 10ms rise and fall time (line~) to get rid of sudden discontinuities. Only 30% of the resulting power-normalized signal is mixed with the original signal.
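The same idea can be sketched offline in Python. This is a rough stand-in for the patch: it measures a single RMS value for the whole buffer instead of following the envelope continuously with env~ and line~, and all parameter values here are illustrative:

```python
import math

def auto_compress(samples, target_rms=0.2, max_gain=10.0, mix=0.3):
    """Measure RMS, compute the gain that would normalise the power,
    clip that gain to a sane range, and blend 30% of the normalised
    signal back in with the original (dry) signal."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    gain = min(target_rms / rms, max_gain) if rms > 0 else 1.0
    return [(1.0 - mix) * s + mix * gain * s for s in samples]
```

The 30% mix is what keeps this from completely flattening the dynamics - quiet passages get lifted, but most of the original envelope survives.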

Finally, run the mixed signal through a soft-clipper before sending it to the speakers. Soft clipping is a good idea because the power normalisation step above will push the peaks up above 1.0 and we don't want harsh hard-clipped distortion to be audible.

The soft clipper I use (probably incorrectly called "sigmoid" in the diagram) also acts as a compressor, which gives that extra punchy sound:

2 / (1 + pow(27.1828, -$v1)) - 1

Where $v1 here corresponds to your vector of incoming audio samples.

Hopefully this method doesn't break any international treaties or anything.

Enjoy!

Jan. 23, 2016

Rhetoric 3.0 flyer

I am playing algorithmic rave music at Rhetoric in Western Australia.

  • February 5th, 2016
  • Game city { Raine Square / Perth Train Station }
  • Doors open 6pm
  • $10 Entry
  • Free arcade games
  • With: chr15m, cbat, marko maric, atomsmasha, kataplexia, amnesia & polite society.


Dec. 24, 2015

Orchids to Dusk screenshot

"An astronaut stranded on an alien planet, with only a few minutes left to live."

Orchids to Dusk had a powerful effect on me.

I dreamed about the game the night after playing it.

The creative power of code is the microwave background radiation of my subconscious and this game made me notice it again in a visceral way.

Inspirational.

Dec. 16, 2015

Over the weekend I built a tiny game for Ludum Dare #34. Here it is:

Instructions: grow the white square's heart by clicking and dragging to the other squares.

Link to the game here.

Source code here.

Play/review/rate it here.

Dec. 11, 2015

Zero Asset Game Mockup

A "zero asset game" is a game that does not use any external art assets.

Game art is instead generated procedurally or by using artifacts of the rendering environment.

The following is a screenshot of a tiny game engine I built a little while ago in ClojureScript.

Tiny CLJS Game Engine Screenshot

The renderer runs on Facebook's React library so it is just a couple of lines of code.

I've spread it over several lines here for readability:

; DOM "scene grapher"
[:div {:id "game-board"}
  (doall
    (map
      (fn [[id e]]
        [:div {:class (str "sprite c" (:color e))
           :key id
           :style (compute-position-style e)
           :on-click (fn [ev] (sfx/play :blip))}
        (:symbol e)])
      (:entities @game-state)))]

The sprites are UTF-8 characters which are instantiated like this:

(make-entity {:symbol "◍"
              :color 0
              :pos [-20 300]
              :angle 0
              :behaviour behaviour-rock})

The function behaviour-rock here gets called once per frame and returns the new immutable entity-state for the next frame.

When you click on something the blip sound is generated procedurally in the browser using jsfxr.