Sunday, October 2, 2016

The Circle Game

“I’ve been here before.  I feel like I’m going in circles.”  Who has not had that experience?  Sometimes it seems like we’re walking the same path, over and over.  On the other hand, it’s also a truism that “you can’t step in the same river twice.”

I’ve come to see how both of these can be true, that we can feel we are going in circles and yet also never be the same.  We can appear to be going around and around, repeating the same behaviors, having the same experiences, but, if you think about it, we are not, because each time around we carry the memory of the times before, and that makes it a new experience.  Our experience of now as being like the past includes that past, layer upon layer.  Rather than going in circles, a better metaphor might be that we travel in a helix, each trip around looking in some ways the same, yet moving inexorably forward.  Sometimes there are big, long patterns in which the helix is high frequency (loops are close together) and high amplitude (loops have a wide diameter), sometimes low frequency and low amplitude, sometimes different combinations.  Life is a series of helices, nested, stacked, winding and unwinding concurrently and in series.

This piece is something I first imagined in the winter of 2013/14; its long gestation has been due both to the technical challenges in realizing it and to an evolution in my musical sensibilities.  It’s not meant to be directly allegorical, as several of my pieces have been; rather, I sought to express musically the way the patterns of our lives can play off of each other, creating new patterns and disrupting others.

It’s best listened to with a good set of headphones or between well-separated speakers.



Building the piece started with an attempt to create as literal a helix as can be sonically represented in Max/MSP (indeed, I purchased the software specifically to make this piece); the result of that effort was the patch used to produce the low drone heard throughout the piece.  Apart from the obvious left-to-right panning and the conceit of volume and timbre working in tandem to approximate distance, careful listening will reveal that the pitch of this drone increases slowly throughout the piece, ending about a minor third higher than it started.  A strong reverb (a plate emulation, built by Tom Erbe in Max/MSP) was applied to the drone as well, creating a kind of shimmering in the overtones that I especially enjoy.  Two other voices were created in Max/MSP, one entering at about the eight-minute mark and another at about 9’ 30”, the first based on the same helix patch that was used to produce the bass drone and the second a sped-up Shepard tone that ended up sounding a bit like an air-raid siren.
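For anyone curious about the mechanics, here is a rough Python sketch of the kind of mapping involved -- not the Max/MSP patch itself, and with made-up loop rate, duration, and base pitch.  The helix angle drives panning and apparent distance, while motion along its axis drives the slow, minor-third rise in pitch.

```python
import numpy as np

SR = 44_100          # sample rate
DURATION = 30.0      # seconds; the real drone runs the length of the piece
LOOP_RATE = 0.05     # helix revolutions per second (illustrative)
BASE_HZ = 55.0       # starting drone pitch (illustrative)

t = np.arange(int(SR * DURATION)) / SR
theta = 2 * np.pi * LOOP_RATE * t            # angle around the helix

# Pitch climbs so the drone ends a minor third (3 semitones) above where it began.
semitones = 3.0 * t / DURATION
freq = BASE_HZ * 2.0 ** (semitones / 12.0)
phase = 2 * np.pi * np.cumsum(freq) / SR
drone = np.sin(phase) + 0.3 * np.sin(2 * phase)   # a crude two-partial drone

# Left-to-right motion: equal-power panning driven by the helix angle.
pan = 0.5 * (1.0 + np.sin(theta))            # 0 = hard left, 1 = hard right
left = drone * np.cos(pan * np.pi / 2)
right = drone * np.sin(pan * np.pi / 2)

# Apparent distance: as the helix swings away, the level drops
# (a matching low-pass for the duller timbre is omitted for brevity).
distance = 0.5 * (1.0 + np.cos(theta))       # 0 = near, 1 = far
gain = 1.0 - 0.6 * distance
stereo = np.stack([left * gain, right * gain], axis=1)
```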

In recent years, I’ve become increasingly interested in bells; I am fascinated by the stretched and otherwise inharmonic overtones they typically generate.  I’m especially interested in alternative means of playing them, such as with “singing bowls” and bowed gongs.  Using mathematical models of the physical properties of various materials, good physical modeling synths (PMSs) can produce surprisingly natural bell tones and allow them to be played in unconventional ways, including some that would be impossible in the real world.  Logic, the DAW I use, has a native PMS, but I found it to be limited and not very natural-sounding (sometimes you don’t want “natural” sounds, but it’s much easier to get a synthetic sound from a good PMS than a natural sound from a poor one).  All of the voices in Helix, other than those mentioned above, were created using a PMS called Chromaphone 2.  It’s not terribly intuitive to use, but I’m very happy with the textures I’ve been able to create with it; for most of them, I started with a metallic bar or plate as the primary “physical” component and then “excited” it with a bow-like function.
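To give a sense of what those stretched overtones look like in practice, here is a minimal additive sketch in Python.  It is not how Chromaphone works internally -- a physical-modeling synth simulates the behavior of the bar or plate itself -- but it shows why bell partials sit “off” the harmonic series.  The ratios and decay times below are loosely in the spirit of classic bell analyses and are purely illustrative.

```python
import numpy as np

SR = 44_100
DURATION = 4.0
FUNDAMENTAL = 220.0

# Inharmonic partial ratios; a harmonic instrument would use 1, 2, 3, 4, ...
RATIOS = [0.56, 0.92, 1.19, 1.71, 2.00, 2.74, 3.00, 3.76, 4.07]

t = np.arange(int(SR * DURATION)) / SR
tone = np.zeros_like(t)
for i, ratio in enumerate(RATIOS):
    freq = FUNDAMENTAL * ratio
    decay = 3.0 / (1 + i)                 # higher partials die away faster
    tone += np.exp(-t / decay) * np.sin(2 * np.pi * freq * t) / (1 + i)

tone /= np.abs(tone).max()                # normalize to [-1, 1]
```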

I’ve generally found Max/MSP to be much more intuitive (which is still not very) in the Max (control functions) objects than in the MSP (digital sound production) objects.  As a consequence, I have, thus far, used it mostly as a kind of robot-musician, playing virtual instruments, rather than as a means of generating timbres, as I intended when I first started working in it.  This is reflected in Helix both in the relative paucity of MSP-generated sounds (and the simplicity of the ones there are) and in my use of a Max-built MIDI controller to give the “bouncing bell” Chromaphone voice its “bounce.”  Other performance-related aspects of the piece were controlled either from a keyboard or through Logic’s automation.
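As a rough idea of what that “bounce” controller was doing, here is the arithmetic as a small Python sketch: repeated notes whose gaps shrink geometrically and whose velocities fade, like a dropped ball.  The actual controller was a Max patch sending MIDI to Chromaphone; the pitch, timing, and velocity values below are just placeholders.

```python
def bounce_events(pitch=72, start=0.0, first_gap=0.8,
                  ratio=0.82, min_gap=0.02, velocity=110):
    """Return (time, pitch, velocity) tuples with geometrically
    shrinking gaps and fading velocities."""
    events, t, gap, vel = [], start, first_gap, velocity
    while gap > min_gap and vel > 1:
        events.append((round(t, 3), pitch, int(vel)))
        t += gap
        gap *= ratio          # each bounce arrives sooner...
        vel *= 0.9            # ...and a little quieter
    return events

for time, note, vel in bounce_events():
    print(f"{time:6.3f}s  note={note}  velocity={vel}")
```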

Finally, I was recently introduced to VintageVerb, a reverb plug-in.  In some of the Chromaphone-generated voices I used a touch of that program’s native reverb, which is very nice, and I used Erbe’s reverb, mentioned above, for the bass drone, but I wanted an output-level reverb to tie the various voices together and give the piece a sense of expanse.  In my attempts to implement this, I became increasingly dissatisfied with Logic’s native reverbs, which generally sound muddy to me.  I learned about VintageVerb through a Max/MSP newsletter and initially considered it for another project -- among many other wonderful features, it has an outrageously luxurious 70-second reverberation period -- but when I began playing with it, I discovered that it was capable of producing a much clearer, smoother, and more natural-sounding reverb than anything I’d been able to get from Logic’s reverbs.  The amount of reverb I added with it is small, but it provided what I felt was a necessary final touch.
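This is not meant to stand in for VintageVerb, but a toy example makes the idea of a “reverberation period” concrete.  In a single feedback comb filter -- one of the basic building blocks of algorithmic reverbs -- the feedback gain is chosen so the echo train falls 60 dB over the decay time; the delay length and decay below are arbitrary.

```python
import numpy as np

SR = 44_100

def comb_reverb(dry, delay_s=0.043, rt60=70.0):
    """Run a mono signal through one feedback comb filter whose
    feedback gain is set from the desired RT60 decay time."""
    delay = int(delay_s * SR)
    g = 10.0 ** (-3.0 * delay_s / rt60)   # ~0.996 for a 70-second decay
    out = np.copy(dry)
    for n in range(delay, len(out)):
        out[n] += g * out[n - delay]
    return out

# An impulse fed through the filter is barely quieter two seconds later
# when the decay time is set to 70 seconds.
impulse = np.zeros(SR * 2)
impulse[0] = 1.0
tail = comb_reverb(impulse)
peak_db = 20 * np.log10(np.abs(tail[-SR // 10:]).max())
print(f"peak level in the last 0.1 s: {peak_db:.1f} dB")
```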

I’ve learned a great deal putting this piece together, both technically and musically.  It’s hard to separate how much of that was a result of my work on it and how much was a result of how long it’s taken.  In the end, though, I’m very satisfied with it and excited about the new directions it has inspired me to take.