Recently, I was the associate media designer for Magic/Bird, which opened last night on Broadway. Working under media designer Jeff Sugg, we developed a plan to use the NBA's vast archive of television recordings, combined with an abstract and sentimental point of view in the design. We knew we'd have lots of highlight reels to choose from, but we also wanted to create a more poetic and rhythmic take on things.
We quickly discovered during tech that a lot of material would need to be married to sound and lighting perfectly in time, and that we'd need a great deal of flexibility. We wanted to be able to cue any of the playback machines precisely from any of the other playback machines, and we also wanted to play back multi-channel audio from the video computers to the audio computers, because so much of the footage already contains its own perfectly synced audio.
For the video team's end of things, we used an AudioFire 8 FireWire digital audio recording interface, which allowed us to receive MIDI and audio timecode and to send eight channels of audio out. Usually, on a large show, the projection cues come from the lighting desk via MIDI Show Control: the lighting team sends us a cue number that the Dataton Watchout system receives and acts upon. This lets us take cues without much additional work from anyone else; we can program them on our end and act only on the cues we want to fire.
For example, we could take cues from electrics cues 10, 20, and 25 but ignore all the numbers in between, and the lighting team doesn't have to do anything extra; they just have to turn MIDI Show Control on. This helps keep things in sync, but sometimes it requires adding a "dummy" cue that only triggers video, without any lighting change.
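To make that filtering idea concrete, here's a minimal Python sketch of the kind of logic involved: parse an incoming MIDI Show Control GO message (a SysEx message of the form F0 7F device_ID 02 command_format command data F7, with the cue number as ASCII digits in the data) and act only on a whitelist of cue numbers. The cue list, function names, and the idea of handling raw bytes directly are my own illustrative assumptions, not the show's actual software:

```python
# Illustrative sketch: respond only to whitelisted MIDI Show Control cues.
WANTED_CUES = {"10", "20", "25"}  # hypothetical electrics cue numbers

def parse_msc_go(sysex: bytes):
    """Return the cue number string from an MSC GO message, or None."""
    if len(sysex) < 7 or sysex[0] != 0xF0 or sysex[1] != 0x7F or sysex[-1] != 0xF7:
        return None            # not a universal real-time SysEx message
    if sysex[3] != 0x02 or sysex[5] != 0x01:
        return None            # not MSC (0x02), or not a GO command (0x01)
    data = sysex[6:-1]
    # Cue number is ASCII digits/dots, 0x00-delimited if a cue list follows.
    cue = data.split(b"\x00", 1)[0]
    return cue.decode("ascii")

def should_fire(sysex: bytes) -> bool:
    """Fire our video cue only if the cue number is on our list."""
    return parse_msc_go(sysex) in WANTED_CUES

# Example: an MSC GO for cue 20, command format 0x01 (lighting)
go_20 = bytes([0xF0, 0x7F, 0x7F, 0x02, 0x01, 0x01]) + b"20" + bytes([0xF7])
```

With this shape, cue 20 fires us and cue 15 passes through untouched, which is exactly why the lighting team doesn't have to do anything beyond turning MSC on.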
Sometimes, however, it makes more sense to be tied to an audio cue instead of a lighting cue; similarly, sometimes it makes more sense for lighting to be triggered by an audio cue instead of being called simultaneously by the stage manager. Or maybe a light flashes at exactly seven seconds into an audio clip, and it's just more consistent to have a computer fire it.
We worked closely with the audio designer, Nevin Steinberg, and his associate, Jason Crystal, to sync audio, lighting, and projections. The audio team used Meyer Sound's LCS Matrix3 platform, which can also communicate via the MIDI protocol as well as send audio timecode. At points during the show, audio triggers both lighting and video, then sends timecode, then shutters the projectors, then controls individual brightness values on our videos, then plays an audio cue, all in less than a second: a lot for one system to do in one cue. Some sequences of video effects alternate between lighting and audio commands. We'd wait for a lighting cue to fade us up, and then wait for an audio cue to fade us out. It made for a stronger, more consistent show.
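As a rough sketch of what timecode-driven triggering looks like, the snippet below keys hypothetical events to absolute timecode frames (like the light flash exactly seven seconds into a clip) and fires each one once the running timecode reaches it. The frame rate, event names, and functions are assumptions for illustration only, not the Matrix3 or Watchout API:

```python
# Illustrative sketch: fire events at exact offsets into a timecoded clip.
FPS = 30  # assumed timecode frame rate

def to_frames(h, m, s, f, fps=FPS):
    """Convert an h:m:s:f timecode to an absolute frame count."""
    return ((h * 60 + m) * 60 + s) * fps + f

# Hypothetical event list: the frame at which each action should fire.
events = [(to_frames(0, 0, 7, 0), "flash light"),
          (to_frames(0, 0, 12, 15), "shutter projectors")]

def due_actions(current_frame, fired):
    """Return actions whose frame has been reached and not yet fired."""
    out = []
    for frame, action in events:
        if frame <= current_frame and frame not in fired:
            fired.add(frame)
            out.append(action)
    return out
```

The point is consistency: once the clip's timecode is the clock, the flash lands on the same frame every night, which is hard to guarantee when a human presses GO.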
I think it is important to stay flexible with cuing practices and to remember that there might be a more logical way to do something. I'm particularly interested in a new feature in Dataton Watchout 5.2: DMX recording. It means the video console can now record a DMX sequence and play it back perfectly in time with the video content. Sometimes it just might make sense to play a lighting effect from the video console instead of waiting to press GO.