Much of the content seen during Roger Waters’ The Wall tour is based on the original 35mm animations created by Gerald Scarfe for the 1982 film, The Wall. What makes these images different is that, rather than drawing on WWII imagery, as many animations in the film did, they’ve been updated to offer a 3D look and contemporary political and social commentary. For “The Thin Ice,” for example, hundreds of real images of loved ones lost at war came from a call for submissions on Waters’ website.

The projection content has to match the wall as it’s being constructed live. “A brick is put into place, and then it lights up with an image,” says creative director Sean Evans of Deadskinboy Design, who, along with Waters, “camped out” for months at Breathe Editing, a New York-based creative studio co-owned by editor Andy Jennison. Graphics workstations were built inside the studio to accommodate the team’s Adobe After Effects compositors, and, after outside render farms turned them down because their needs were so great, they also built two render farms of their own, totaling 17 machines at one point. “We had to do so much to pull off the resolution we needed, so we turned Breathe into a 3D house/render farm so our pipeline would work,” says Evans. Each frame covering the 240'x30' wall, as well as a 30' rear-projected circular screen behind Waters, is 8,560x1,620 resolution—five HD frames with overlap.
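The canvas width is consistent with five 1,920-pixel HD frames joined by edge-blend zones. The per-seam overlap below is inferred from the published numbers, not stated in the article, so treat this as back-of-envelope arithmetic rather than the production’s actual blend settings (the 1,620-pixel height doesn’t follow from this simple model):

```python
# Rough check: five 1,920px-wide HD frames blended into an 8,560px canvas.
# The 260px-per-seam overlap is inferred, not confirmed by the production.
HD_WIDTH = 1920
FRAMES = 5
CANVAS_WIDTH = 8560

total_overlap = HD_WIDTH * FRAMES - CANVAS_WIDTH   # 1,040px absorbed overall
per_seam = total_overlap // (FRAMES - 1)           # four seams between five frames

print(per_seam)  # 260
```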

Richard Turner, screens technical director for the tour, has the daunting job of making sure all the video works, no small task with 20 Barco FLM projectors—five stacks of three overlaid HD20s for a total of 15 on the wall and up to five R22+ units overlaid for the circular screen—to be hung and aligned for the brick pattern. Turner also oversees playback at each show. “It’s a very, very horizontal area,” he says. “Any bit of the film we used had to be reformatted in some way. We had access to the original negative and inter-positive from the movie and from the 1980 tour, so we were able to have 4K scans made. That was our starting point—very grainy but acceptable. We adjusted the aspect ratio in different ways.”

For “Empty Spaces,” for the iconic flowers sequence, compositors extended the stems in 3D so they creep in from the far left and right sides of the wall. For “What Shall We Do Now,” the team blew up and stretched the image across the wall, but left it unstretched on the circular screen. “As we went through editing, we tried several things—stretching the image, using multiple 16x9 frames—and none of it seemed to work,” says Turner. “We wound up framing the main part of the image as large as we could get it, so it was very bold but still readable, and then painted out the sides to fill the rest of the space—a time-consuming process.”

Evans taught himself how to use Maxon CINEMA 4D (C4D) software for this project. Among the many challenges he faced when creating imagery was coming up with something that felt somewhat modern while staying true to the original look of the album and film. For the marching hammer animation sequence, for example, the editors originally updated the hammer design to make it “very photographic and dramatic,” says Evans, only to find that it just wasn’t right. “Sketch & Toon [in C4D] saved our lives on this, because I ended up taking a loop from the movie and modeling a hammer that looked just like the ones in the film,” he explains. “I dropped it into Cinema and matched it, frame to frame, and got the final, vintage version with Sketch & Toon.”

For “Waiting for the Worms,” the background is a columned building inspired by German architecture that Evans modeled in C4D. As the song begins, huge worms move into view and begin writhing in and out of the columns. The worms were all extruded on a spline and animated with the Add the Sea plug-in to create random waves, giving the worms their undulating look. “It looks scary, really, and we have the camera push into the columns and follow the worms as they disappear,” says Evans.
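The principle behind that undulation can be sketched with a travelling sine wave displacing points along a spline. This is a stand-in illustration, not the Add the Sea plug-in itself; the function name and parameters are invented for the example:

```python
import math

def undulate(points, t, amplitude=0.5, wavelength=4.0, speed=2.0):
    """Offset each spline sample with a travelling sine wave.

    A rough stand-in for the kind of per-point wave displacement that
    gives the worms their writhing motion. 'points' is a list of
    (x, y) samples along the spline; 't' is the animation time.
    """
    out = []
    for i, (x, y) in enumerate(points):
        phase = (i / wavelength) * 2 * math.pi
        out.append((x, y + amplitude * math.sin(phase + speed * t)))
    return out
```

Stepping `t` forward each frame makes the wave crests travel along the spline, so the body appears to writhe rather than simply bob in place.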

“Comfortably Numb” is perhaps the show’s most dramatic number. Waters walks over to the wall covered in multicolored fractal images, touches it, and it explodes into a sea of flying pieces. “I used the Xplode plug-in and MoDynamics [in C4D] to make a hole the size of Roger’s hand grow and expand to slowly create this feeling like there’s a lot of volume behind it,” Evans says, adding that, once the wall “falls,” another building “rises up” out of the ground in its place via projections (a physical falling of the wall happens later).
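The growing hole can be thought of as a radius test: as the radius expands from the touch point, any brick fragment inside it is released to the dynamics simulation. A minimal sketch of that selection logic, not the actual Xplode/MoDynamics setup:

```python
import math

def activated_pieces(pieces, hand, radius):
    """Return indices of wall fragments inside a growing circular hole.

    'pieces' is a list of (x, y) fragment centers, 'hand' the touch
    point, and 'radius' the hole size at the current frame. In the
    real rig the released fragments would then be handed to dynamics.
    """
    return [i for i, (x, y) in enumerate(pieces)
            if math.hypot(x - hand[0], y - hand[1]) <= radius]
```

Animating `radius` upward over time reproduces the hand-sized hole that “grows and expands” across the wall.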

“With a show such as this—a single immense canvas, which is a continuous, locked timeline—the video programming is essentially done to the cut of the content, so the edit stage is the programming stage,” says Turner. The content, created frame for frame to the incoming timecode from the band playback (operated by Mike McKnight), is locked, cut for cut, for edge-of-frame accuracy.

“We feed Mike our system genlock signal, which he references his system to, so the hardware time-bases are locked, which is very important when dealing with the extended playback periods—only one live stop per half,” says Turner. “Mike then distributes the show timecode, which most departments reference in some way.”
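Locking content to timecode ultimately means mapping an incoming timecode value to an absolute frame in the playback timeline. A simplified non-drop-frame conversion is shown below; the show’s actual frame rate and drop-frame handling aren’t stated in the article:

```python
def timecode_to_frame(tc, fps=30):
    """Convert an 'HH:MM:SS:FF' timecode string to an absolute frame count.

    Non-drop-frame only; fps=30 is an assumption for illustration.
    """
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

timecode_to_frame("01:00:00:00")  # 108000 frames at 30 fps
```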

Even the enormous puppets, based on animated characters from the film and created by Brilliant Stages, are lit by the projections. “A key source graphic was created to hit the bricks around the puppet and cover the puppet itself. This is then a live key using the Barco Encore screen processing system to allow for timing changes, etc.,” says Turner. McLaren Engineering designed and implemented the flying and lift systems for the production, including those used for the massive puppets.

Stay tuned for more about this tour, including more on the projection and lighting.