Author Benjamin McFarlane created this piece with the bugged-out "lopsided synth" he built using the Audeon UFO virtual instrument.

You can't escape the synthesizer. And why would you want to? The synthesizer is a powerful, mysterious and monolithic tool of unbridled sonic creativity. It looms over the whole musical landscape like an outer-space leviathan, and this article is a UFO experiment. The UFO in question, however, is the Audeon Unique Filters and Oscillators (UFO) soft synth ($139), a VST/stand-alone synth available for Windows and Mac OS X. A UFO Light version ($22) is also available for Windows.

The experiment entails a creative synthesizer-sampling technique — that is, taking a synth and recording it into a sampler. This technique can add a new creative dimension to sampling synthesizers. The concept starts with the observation that most synth patches are designed to sound good in a specific key range and do not sound as good in all ranges. What sounds good at B2 may sound bad at B6. To compensate, many keyboardists use multiple keyboards, or they assign a keyboard split point where one sample set stops and another begins. By assigning different samples to different key ranges, you can pick which samples go where based on how good they sound in each range.

To get a better idea, imagine you resampled six patches from a Minimoog and plugged them into your software sampler to give your keyboard performance a little dynamic variety in addition to the typical pitch and mod-wheel tricks you usually perform on the fly. For example, the ranges C0 to F#1, F#1 to C3, C3 to F#4, etc., each contain a different patch that sounds good in the key range it has been assigned.

Imagine further that you wanted to avoid abrupt changes in the timbre of the samples from one keyboard range to the next. There are two ways of doing that. The first is to crossfade the notes of the adjacent ranges together. That is referred to as a positional crossfade and is widely used. The second way — the subject of this experiment — is to adjust the synth parameter by parameter over the chromatic scale, changing the timbre of the instrument by small increments as you proceed up the keyboard. That creates, in effect, a positional “morph,” where the parameters of one patch mesh into the parameters of the next patch by changing as opposed to fading.
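The positional morph can be sketched in a few lines of Python. This is a minimal illustration, not anything UFO-specific: the two patches are hypothetical parameter sets, and each parameter is interpolated linearly per MIDI note rather than crossfaded between two fixed sample sets.

```python
# Positional morph: interpolate every synth parameter per note across
# the keyboard. The patch values and note range here are hypothetical,
# chosen only to illustrate the idea.

patch_low  = {"cutoff_hz": 400.0,  "resonance": 0.8, "osc2_detune": 12.0}
patch_high = {"cutoff_hz": 4000.0, "resonance": 0.2, "osc2_detune": 2.0}

LOW_NOTE, HIGH_NOTE = 36, 84  # C2..C6, the range being morphed

def morph(note):
    """Return the interpolated parameter set for a given MIDI note."""
    t = (note - LOW_NOTE) / (HIGH_NOTE - LOW_NOTE)  # 0.0 at C2, 1.0 at C6
    t = min(max(t, 0.0), 1.0)                       # clamp outside the range
    return {name: patch_low[name] + t * (patch_high[name] - patch_low[name])
            for name in patch_low}

# Middle C (note 60) sits halfway up the range, so each parameter
# lands midway between its two patch values.
print(morph(60))
```

A positional crossfade would instead mix two fixed recordings; here every note gets its own distinct parameter set.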


To start, I think about the parameters that I want to change across the keyboard. Several of UFO's parameters, including formant and resonant filters and oscillators, can be edited using the sequencer of a VST-compatible host. That is a great organizational and time-saving advantage for the experiment. With a vintage hardware synth, you would need to record the first note, move every knob that you wanted to change, record the second note, move every knob again, record the third note and so on. With a Digital Audio Workstation (DAW), you can set every change for every note before recording audio (see Fig. 1) and then record all the notes at once.

To determine which positional-morph parameters I want to change and how to change them, I spend some time with the synth. After picking one preset, I audition combinations of parameter changes to decide which parameters need to end up at what point and on which note of the scale. For example, in the higher register, oscillator 2 should be steered away from the chaotic and bright and more toward the mellow. I take notes on which parameters should be at what values within which pitch ranges. The notes could be more extensive, but I'm willing to gamble a little bit and morph some parameters more or less arbitrarily.

Now I can start setting changes; the first step is to put the chromatic scale into the sequencer via MIDI. The best way to enter it is to set the sequencer tempo to about 60 bpm and use quarter-note increments to space the notes. The note duration should be about ¾ of a quarter note (a dotted eighth note) to allow each note to decay before the next one sounds (see Fig. 2).
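The timing arithmetic behind this step is simple enough to sketch. The snippet below computes note-on/note-off times for a chromatic scale at 60 bpm with dotted-eighth durations; the note range (MIDI 24–96) is an arbitrary choice for illustration.

```python
# Generate note-on/note-off times (in seconds) for a chromatic scale:
# one note per quarter note at 60 bpm, each held for a dotted eighth
# so it decays before the next note sounds.

TEMPO_BPM = 60
BEAT_SEC = 60.0 / TEMPO_BPM    # one quarter note = 1.0 s at 60 bpm
NOTE_LEN = 0.75 * BEAT_SEC     # dotted eighth = 3/4 of a quarter note

def chromatic_events(start_note=24, end_note=96):
    """Yield (midi_note, on_time_s, off_time_s) for each scale step."""
    for i, note in enumerate(range(start_note, end_note + 1)):
        on = i * BEAT_SEC
        yield note, on, on + NOTE_LEN

events = list(chromatic_events())
```

At this tempo, each note number also doubles as a convenient time index: note number n of the scale starts n seconds in, which makes lining up automation breakpoints later much easier.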


The next step is to set your pencil tool to quarter-note divisions, so you can write in changes without taking too much time to do it. In Ableton Live, it's easiest to zoom out, keeping an eye on Live's adaptive grid, until it reaches quarter-note divisions. Then when using Live's pencil tool to write changes, the changes will line up with the notes automatically (see Fig. 3).

This is when I must plan for certain parameters to align properly. If I've done my homework, I know which parameters need to be set to approximately what values at any given pitch. For example, if, at middle C, a highpass filter at about 1 kHz sounds good combined with a bright oscillator tone rich in odd harmonics, then I need to make sure that when I draw in my changes, those specific settings occur at that pitch. I could just blindly make changes in the sequencer window, but there would be no guarantee that any of those changes would sound good. That's why it's important, before trying that, to give the synth a listen and decide where you want to push the various parameters.
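Pinning a parameter to a specific value at a specific pitch amounts to piecewise-linear automation between breakpoints. Here is a small sketch of that idea; the breakpoint values are hypothetical except for the middle-C target of 1 kHz mentioned above.

```python
# Pin parameter values to specific notes (breakpoints), then read back
# the interpolated value at any note in between. Breakpoints other than
# middle C -> 1 kHz are hypothetical.

import bisect

breakpoints = [(36, 250.0), (60, 1000.0), (84, 6000.0)]  # (midi_note, cutoff_hz)

def cutoff_at(note):
    """Linearly interpolate the cutoff between surrounding breakpoints."""
    notes = [n for n, _ in breakpoints]
    if note <= notes[0]:
        return breakpoints[0][1]
    if note >= notes[-1]:
        return breakpoints[-1][1]
    i = bisect.bisect_right(notes, note)
    (n0, v0), (n1, v1) = breakpoints[i - 1], breakpoints[i]
    t = (note - n0) / (n1 - n0)
    return v0 + t * (v1 - v0)
```

This mirrors what drawing automation with the pencil tool does: you place the anchor values you care about, and the sequencer fills in the slopes between them.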

A rough analogy is animating cartoons. Let's say you're animating a stickman basketball player. You know you want him to be slam-dunking the ball at the 5-second mark because it lines up with your soundtrack nicely, so you plan for it. Think of the notes on the chromatic scale as the animation cells, and think of the synth parameters as the moving parts of the stickman.

When this process is complete for each parameter, take five minutes to listen to the automated changes. Since you can't anticipate how the parameters will interact at every point along the automation, you may notice ringing or clipping where, say, the filters and formants interfere constructively at certain frequencies. Some after-the-fact editing will definitely be needed.


Once you've given the synth a listen and fixed any problems from the morphing of its timbre, you're ready to move on to stereo-image and distance effects. This technique provides a unique opportunity to customize a sampled MIDI instrument to fit tightly into the mix it's part of. This is a distinct strategic advantage of positional morphing: planning where the synth is going to sit in the mix at each pitch. Will it be dry and upfront, or will it be attenuated with some reverb and delay placing it in the background? Will it be panned left or right, or will the sample sweep from one side to the other?

Beyond the stereo, foreground and background placement, changes can also be made to the vertical spacing of the synth. If you've read Bill Gibson's book Sound Advice on Mixing, you'll know the value of splitting up the mix both horizontally and vertically. To change the position of the synth along the vertical axis, choose a delay effect that can handle very short delay times. When automating this delay in the sequencer, try a range of 3 to 10 ms with an even dry/wet mix. Those subtle changes in delay create the illusion of height.
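As a sketch of that vertical sweep, the snippet below maps the keyboard onto the 3–10 ms delay range from the text. The direction of the mapping (long delays at the bottom of the keyboard) is an arbitrary choice for illustration; reversing it works just as well.

```python
# Sweep a short delay across the keyboard to vary apparent height.
# The 3-10 ms range comes from the article; mapping low notes to long
# delays is an arbitrary illustrative choice.

DELAY_MIN_MS, DELAY_MAX_MS = 3.0, 10.0
LOW, HIGH = 24, 96  # keyboard range being automated

def delay_ms(note):
    """Longer delay at the bottom of the keyboard, shorter at the top."""
    t = (note - LOW) / (HIGH - LOW)
    t = min(max(t, 0.0), 1.0)
    return DELAY_MAX_MS - t * (DELAY_MAX_MS - DELAY_MIN_MS)
```

Drawn into the sequencer as automation on the delay-time parameter, this gives every note of the chromatic scale its own fixed spot on the vertical axis.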

So now, in addition to triggering samples with morphing timbral properties, the positional morph includes changes in spatial position. Not only is each note associated with its own unique timbre, but it's also associated with its own unique spot in the mix.


UFO's modulation section almost makes this experiment redundant. I say almost because the parameters in the modulation section can emulate, to a degree, the steps of this experiment. Note the light-blue Keyboard button, for example (see Fig. 4). Many synths have keyboard-dependent parameters — parameters that change according to one's position on the keyboard. With this button, we could achieve a positional morph using any parameters we wanted. The only limitation is that there are only two groups of settings on the synth to morph between, and you can only morph between those at a constant rate for any given parameter. So moving up the keyboard, the synth patch morphs from one state to the other over the entire keyboard range. In our experiment, it's a different scenario: the parameters change forward and backward, and at different rates, so they coincide in just the right way.

Virtually any of UFO's modulation sources can modify the settings I've mentioned here, from the oscillators to the panning fader. Although it's impossible to automate the direction and range of the modulation destinations, it's still possible to modulate the LFO settings. That adds still more variety but should be kept to a minimum, because multiple sources modulating the same parameter can interfere with or cancel one another.


The last step is importing the recorded synth notes into a sampler. I recorded UFO as WAV files and imported them into the E-mu Emulator X2 software sampler. In Emulator X2, the samples can be sliced and assigned key ranges almost automatically. That saves you the trouble of rendering every little sample.
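The auto-mapping step is essentially a zone-splitting calculation, which can be sketched as follows. This is a generic illustration, not Emulator X2's actual algorithm, and the WAV filenames are hypothetical placeholders.

```python
# Split the keyboard into contiguous zones and assign one recorded
# sample per zone -- roughly what a sampler's auto-mapping does.
# Filenames and the note range are hypothetical.

def assign_key_ranges(samples, low=24, high=96):
    """Return (sample, lo_note, hi_note) triples covering low..high."""
    per_zone = (high - low + 1) // len(samples)
    zones = []
    for i, sample in enumerate(samples):
        lo = low + i * per_zone
        hi = high if i == len(samples) - 1 else lo + per_zone - 1
        zones.append((sample, lo, hi))
    return zones

zones = assign_key_ranges([f"ufo_note_{i:02d}.wav" for i in range(6)])
```

For this experiment, where every chromatic note was recorded, each zone can shrink to a single key, so every note triggers its own morphed, spatially placed sample.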

Now that it's in the sampler and ready for a trial, I give it a play. UFO is an inherently weird-sounding synth, so, appropriately, the result sounds a little weird. I try it over a bass line and some hip-hop beats, and it delivers the result I had expected. Sweeping my finger up the keyboard, I get an effect reminiscent of a wah combined with the expected panning effects, which sounds really cool. Chords — especially wider, sprawling chords — sound very strange; they don't sound like they're coming from the same source, for the obvious reason that the notes have been panned to different positions and processed to different apparent distances.

The experiment, though not scientific, seems to have demonstrated that, in the right hands, there is great potential here for novelty and ingenuity. Instead of a perfectly symmetrical synth that sits in one spot in the mix, this lopsided resampling has created a bizarre beast that does not stay in one place. Whether it opens up a fresh creative dimension for synth artists depends only on whether sound designers are patient enough to see it through.

To hear an MP3 of the “lopsided synth” from this article, go to
