for orchestra, music by Ian Williams (orchestrated and realized by Andrew McKenna Lee)
first performance: 21 November 2014; Zankel Hall at Carnegie Hall, New York, NY; American Composers Orchestra; George Manahan, conductor
detailed orchestration: solo electronics, 1(=pic).1(=corA).1(bcl).bsn – 22.214.171.124 – timp – 2perc – hp – pf – str
I met Ian Williams through a mutual friend, Derek Bermel. Derek, knowing that both Ian and I are guitarists with roots steeped in rock music, thought that we might get along, and asked us to work together to create a piece for the American Composers Orchestra's 2014-2015 season.
Prior to meeting him, I was familiar with Ian's music as part of the band Battles, but having never really collaborated with anyone before, I was somewhat anxious about the prospect. Derek's intuition, however, was spot on: Ian and I ended up getting along quite well, and we quickly established a productive and enjoyable rapport.
Making Clear Image involved Ian writing the music and sending it to me as both MIDI and audio stems from Logic. These stems contained his ideas for the solo electronics part as well as mockups of the rest of the orchestra. I would then work from these stems, writing and fleshing out the orchestration, and finding occasional room throughout to make a few of my own contributions to the fabric and texture of the music.
—Andrew McKenna Lee
What's the difference between finely edited material on a multi-track recording and scored material in sheet-music form to be played live? Edits and unnatural transitions are actually natural to computers. Jumps in tone and texture, even when bearing the marks of the artificial (a person couldn't do that!), are easily done, and modern ears are used to hearing such things. If you consider a multi-track recording, where the music isn't actually played but merely assembled, to be a lie or an illusion, but you don't think that's actually a bad thing, then you might want that artificial quality and consider it an "enhanced" reality. How, then, would you preserve that improvement in music that is to be played by real musicians in real time and space? You might use traditional threads of melody and harmony to make the jump from one of these realms to the other and see what happens.