I'm gonna live, till I die!
I'm gonna laugh, ’stead o’ cry!
I'm gonna take this town,
And turn it upside down,
I'm gonna live, live, live, until I die!
I don't know why this particular piece of lyric stuck with me so closely throughout Sinatra, but the song really resonated. Sinatra “the production” felt like a high-stakes game of survival — survival for a show, an idea; survival for a working relationship between collaborators; survival for individuals committing insane hours to create something that was more than the sum of their parts. So, a song about cherishing the moments of vital, hair-standing, living-it-up life became my anthem, sung sotto voce day in and day out. Did anybody say OCD?
Sinatra, the man, needs no introduction. His was a life and career that defined the high-water mark for success in popular music, film, and culture. And his death was not enough to sate the public's hunger for Sinatra live, nor the producers' desire to provide that experience. With the collaboration of the Sinatra Estate, producers James Sanna and Joshua Rosenblum had set out to do just that in the glittering crown of live venues, Radio City Music Hall. The show, Sinatra: His Voice. His World. His Way, which ran there in 2003, enjoys its own rich history of tragedy and triumph. At its conclusion, one thing was certain for Sanna and Rosenblum: they were going to try again.
The selection of David Leveaux, a director with solid credentials in London, and his frequent collaborator, scenic designer Tom Pye, came first. The two were conveniently situated for a production in the West End. Leveaux and Pye had also just come off a series of high-profile Broadway shows, making them bankable for such an upscale property. Also bankable and well located was the inimitable Andrew Bridge, who agreed to design the lighting.
Now we swap our world view almost 180° to Seattle. I had a message on my desk from Josh Weisberg asking me to call a gentleman named Josh Rosenblum. We were working with Scharff Weisberg on a piece of corporate theatre for Nokia at the time, and I thought this conversation was going to relate to that.
When I got on the phone with Rosenblum, the beginning of the conversation had that strange vertigo that goes on when two people begin talking about very different things within the same context. I was shocked when he launched into an explanation of the Radio City production of Sinatra; that there was to be a new production; and that we were invited to design the projection and multimedia elements. Wow.
Rosenblum described a show that could be designed from the outset to be simpler, more mobile (for later touring purposes), and could scale to run first at the London Palladium and later on tour or in other sit-down situations. He was looking at the long view of things, and it seemed sensible. I was excited and wanted to do it immediately but demurred for the moment to discuss it with my peeps at MODE Studios.
I hadn't seen the Radio City production and didn't want to at this point (it was available on video). I felt confident that we could, and would, take the show in a new direction. The other sobering factor was time. We had begun our discussion in September. Rehearsals would commence at the end of December. We would begin previews mid-February. This was terrifying. We knew that the production would use multiple screens, necessitating multiple layers of content for the show, not to mention the sheer volume of available footage from the estate that we had to get to know and select from.
Nevertheless, when Rosenblum finally came down to asking, “So do you have the stomach to try it?” I answered yes. It was a big challenge, but it wasn't the first. We all knew this would be unlike anything we'd done before.
Getting Sinatra On Stage
The first task was to figure out the production's core. While I pushed Nokia through to the finish line, my wife Colleen had a series of marathon, all-day meetings with Leveaux, Pye, and Eli Gonda, the associate director. Pye had constructed a model of an elegant, reflective black space, dominated by a tracking bandstand with rear lifts. Now Colleen, Pye, and Leveaux began sorting out the possibilities for media, scene to scene. The fundamental question was whether the challenges of this design could best be addressed with projection. Settling on the physical screen “vocabulary” would dictate much of the rest of the process.
Leveaux was convinced that this show could not be about Frank or a tribute to him. Rather, it had to actually be him performing. We had to find a way to convince the audience to forget that Frank was only there on screen and to transcend to a place where Sinatra was present, in the room, doing the job himself again.
To that end, we had an enormous volume of Frank footage. The fundamental premise with Sinatra's footage would be to remove him from the original backgrounds and place him in a new “space” of our own devising. The prior production had done similar work, and we had the benefit of about six numbers that had been filmed on 35mm and rotoscoped.
The films had been found and painstakingly restored by Keith Robinson. He had worked with projection designer Linda Batwin of Batwin and Robin and the estate on the Radio City production. He had rejoined the circus for a second try, and his already comprehensive knowledge of available Frank Sinatra footage proved critical.
Leveaux had decided on a show order and had selected which Frank Sinatra performances he wanted to use. The source footage was an amalgam of old broadcast kinescopes, digi-beta masters of various Sinatra TV specials, as well as scanned frame stacks of 35mm footage. Obviously, the higher quality sources were better for rotoscoping, but we would need to find a solution that also worked well with the lower quality, older recordings. Enter Imagineer Systems and Pixel Logic.
The Art Of Rotoscopy
The art of rotoscopy has existed for almost as long as moving pictures. Essentially, it is a technique where certain elements are cut out of their original filmed backgrounds and placed in others. For most of the time the technique has existed, it required each frame to be individually “painted.” Getting the outline right so it doesn't jitter and jive requires painstaking attention to detail. Recent software innovations have made this somewhat less torturous, allowing the practitioner to work on every 10th or 15th frame, with a computer doing the heavy lifting in between. Still time-consuming and difficult, the process had become efficient enough that we thought it could actually be done in time.
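The keyframe approach those newer tools take can be sketched simply: an artist draws the matte outline only on widely spaced frames, and the software computes the in-between shapes. Here is a minimal illustration of the idea; real tools use planar tracking and spline shapes rather than the plain linear interpolation shown, and all names here are mine, not any vendor's:

```python
# Sketch of keyframe-interpolated rotoscoping mattes.
# The artist draws the matte polygon only on keyframes (e.g. every
# 10th frame); intermediate frames are interpolated automatically.
# Real tools use planar tracking and splines; this uses simple
# linear interpolation between corresponding polygon points.

def interpolate_matte(keyframes, frame):
    """keyframes: {frame_number: [(x, y), ...]} with matching point counts.
    Returns the matte polygon for any frame between the first and last key."""
    keys = sorted(keyframes)
    if frame <= keys[0]:
        return keyframes[keys[0]]
    if frame >= keys[-1]:
        return keyframes[keys[-1]]
    # Find the two keyframes bracketing the requested frame.
    for k0, k1 in zip(keys, keys[1:]):
        if k0 <= frame <= k1:
            t = (frame - k0) / (k1 - k0)
            return [
                (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
                for (x0, y0), (x1, y1) in zip(keyframes[k0], keyframes[k1])
            ]

# Keyframes at frame 0 and frame 10; frame 5 is computed, not hand-painted.
keys = {0: [(0.0, 0.0), (10.0, 0.0)], 10: [(4.0, 2.0), (14.0, 2.0)]}
print(interpolate_matte(keys, 5))  # [(2.0, 1.0), (12.0, 1.0)]
```

The economics follow directly: hand work on one frame in ten or fifteen instead of every frame, with the computer filling the gaps.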
To do this, MODE Studios outfitted itself with several new compositing, rotoscoping, and matte tools. First was Mokey, an application tailored to remove objects from backgrounds. Imagineer Systems, the creators of Mokey, felt that we were best served by a new application they had created, Motor. Motor had the core rotoscoping features of Mokey. With the Motor code being so new, we were essentially alpha testing it. Our lead editor, Brandon Oosterhof, had flown to London for training, but it was evident that this was going to be tough. We needed more resources focused on this problem. Imagineer offered to take on the bulk of the new rotoscopy under direct contract, which was immediately welcome, but there were still some specific rotoscopy challenges that needed solving.
One solution was to place a particularly difficult song in the hands of Hollywood veterans Pixel Logic. Pixel Logic would take “All of Me,” a number in which Frank flew around on a camera crane, and cut out both Frank and the crane. We would then be flying this footage around onstage with one of the LED screens. We knew that Pixel Logic could do this well, and it would be a substantial amount of work we wouldn't have to accomplish.
We also invested in Apple's fantastic, high-powered compositing environment, Shake. Shake proved to have rotoscoping and masking tools that were very powerful as well, and we benefited from a wider user base to find helpful personnel. This quest led us to the talented Larry Butcher, a veteran of feature films like Titanic and The Day After. Butcher became our lead compositor on the project, and Shake began to carry a heavy rotoscoping load as well.
At the conclusion of the New York meetings, a presentation was made to the producers illustrating the design and show concepts. The show would use an array of moving screens, most of them irregular rectangles. The screens would be used kinetically and in multiples, drawing on a layered aspect that paid some homage to Mondrian and Rothko. Ultimately, five of the screens would utilize projection, one of them being a massive-scale, full-stage rear-projection sheet. There would be two 6mm Barco ILite LED screens that would fly around and could join to form a larger surface. We would also be placing projection on other unusual surfaces, including a set of sleek silver silk sheers, some truly giant balloons, and full-stage sliding panels dressed in black velvet. Finally, upstage behind the bandstand would be an enormous wall of Barco MiPIX.
The producers had asked Josh Weisberg to design the projection, display, and playback systems for the show. We consulted closely with Weisberg throughout, explaining our goals with the moving screens, the LED equipment, cameras, and the like. His first task was to judge whether what we wanted to do was achievable within certain economies of scale and capability. After he had seen the presentation and we had discussed the show, he did, in fact, deem it “doable,” and we began to think about the physical design. We left the selection of projection, routing, and display gear alone; it was knowledge that Weisberg had in profusion.
We were anxious to settle on a control system. Its selection would define a literal semantics for the design process: how the show was programmed, how the content was produced to work, and how encoding would happen.
The control system had to meet several peculiar criteria. It had to play nicely with odd aspect ratios and resolutions. It had to natively be HD or better. It had to be able to utilize feedback data from the scenic automation computers in order to track content to moving screens. It had to accommodate live inputs, and it had to be cueable and triggerable coherently and cohesively. We quickly narrowed the field to several options, and ultimately, Weisberg suggested eight Green Hippo Hippotizer HD units, controlled by an MA Lighting grandMA console.
Green Hippo had been working aggressively to incorporate scenic surface tracking into the software, and the platform proved easily tweakable to accommodate the odd shapes and resolutions of the images. The Hippos also performed projection blending seamlessly and beautifully on the big RP sheet, an added bonus. The Hippos all had composite video inputs that could receive a variety of live camera feeds via a matrix switcher. With the selection of Hippotizer, Green Hippo pledged to ensure that the screen tracking worked out perfectly and that the software would perform up to the standards of prime time.
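The principle behind the screen tracking is simple, whatever the actual protocol between the automation computers and the media servers: automation continuously reports each screen's position, and the server slides the content layer within the fixed projector raster so the image stays glued to the moving surface. A toy sketch of that coordinate conversion, with every name and calibration value invented for illustration rather than taken from the real Hippotizer interface:

```python
# Illustrative sketch of tracking content to a moving screen.
# The projector raster is fixed; the automation system reports the
# screen's position in stage coordinates, and we slide the content
# layer so the image follows the screen. Names and calibration
# values are hypothetical, not the actual show's interface.

def stage_to_raster(screen_x_ft, screen_y_ft, calib):
    """Convert a screen position (feet, stage coordinates) into a
    pixel offset for the content layer inside the projector raster."""
    px = (screen_x_ft - calib["origin_x_ft"]) * calib["px_per_ft"]
    py = (screen_y_ft - calib["origin_y_ft"]) * calib["px_per_ft"]
    return round(px), round(py)

# Example calibration: raster origin at stage position (-20', 0'),
# 32 raster pixels per foot of screen travel.
calib = {"origin_x_ft": -20.0, "origin_y_ft": 0.0, "px_per_ft": 32.0}

# Automation reports the screen at center stage, 4' up:
print(stage_to_raster(0.0, 4.0, calib))  # (640, 128)
```

Get the calibration right once, and the content follows the scenery for free on every performance.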
To accommodate the moving front projection screens, Weisberg had defined four separate projection planes, or rasters, in the first 15' of downstage space. Each raster utilized Digital Projection Lightning 30SX units to provide images. Because of the size of the projectors and available spaces, the actual locations of the projectors varied. The A, B, and ‘Silks’ rasters all came from the front projection booth. The C projectors were positioned in the Grand Circle at the extreme sides. This necessitated that dual projectors stacked on both sides be keystone-corrected, converged, and blended to get a seamless raster.
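Blending stacked projectors into one seamless raster comes down to soft-edge ramps: in the overlap zone, each projector's output is attenuated so the two contributions sum to uniform brightness, which means the ramps must be computed in linear light and compensated for display gamma. A minimal sketch of one such ramp (the overlap width and gamma value are illustrative, not the show's actual settings):

```python
# Sketch of a soft-edge blend ramp for overlapping projector rasters.
# In the overlap zone, each projector's output is attenuated so the
# pair sums to uniform brightness. The ramp is defined in linear
# light, then raised to 1/gamma to compensate for display gamma.
# Overlap width and gamma here are illustrative values only.

def blend_ramp(width_px, gamma=2.2):
    """Per-pixel attenuation across one projector's blend edge:
    ramps from full output (1.0) down to black (0.0)."""
    return [
        ((width_px - 1 - i) / (width_px - 1)) ** (1.0 / gamma)
        for i in range(width_px)
    ]

ramp = blend_ramp(5)
# The opposing projector uses the mirrored ramp; converted back to
# linear light, the pair sums to 1.0 at every pixel of the overlap:
mirrored = list(reversed(ramp))
for a, b in zip(ramp, mirrored):
    assert abs(a ** 2.2 + b ** 2.2 - 1.0) < 1e-9
```

Keystone correction and convergence get the two rasters onto the same geometry; the ramps then hide the seam.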
Crazy? Yes, but that was only the beginning…