What is in this article?
- In Control For Bon Jovi's Because We Can: The Tour, Part 2
- Integrating Lighting and Video
Integrating Lighting and Video
LD: Any other thoughts about the integration of lighting and video on this control system?
DS: It is impressive, the amount of data moving across that Art-Net network, and it offers us a lot of options, since both lighting and video are controlling it at the same time. It allows us to put live IMAG in those headlights with a minimal amount of delay, considering it is a live input coming from the media server, merged in the lighting console, and then sent on to the LED fixtures. You can't really tell; the lines are incredibly blurred between those being lighting fixtures or an LED wall.
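The interview doesn't spell out how the console combines the video-sourced and lighting-sourced data, but a common scheme for merging two DMX sources is highest-takes-precedence (HTP). A minimal sketch of that idea, with all names hypothetical:

```python
# Hypothetical sketch: highest-takes-precedence (HTP) merging, one common way
# a console can combine a video-sourced and a lighting-sourced DMX universe
# before the result goes out over a network such as Art-Net.
DMX_CHANNELS = 512

def htp_merge(lighting: list[int], video: list[int]) -> list[int]:
    """Per-channel merge of two universes: the higher 8-bit value wins."""
    return [max(a, b) for a, b in zip(lighting, video)]

lighting = [0] * DMX_CHANNELS
video = [0] * DMX_CHANNELS
lighting[0], video[0] = 128, 200      # both sources drive channel 1
merged = htp_merge(lighting, video)
print(merged[0])                      # the brighter source wins
```

Other merge modes (latest-takes-precedence, priority-based) exist; which one a given console uses per parameter is a patching decision, not shown here.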
With the Chromlech Elidy fixtures specifically, we use Mbox to pixel-map them and send an image out. So if we want to run complex animation or timecode-driven content on the headlights, we can do that, and that merge happens inside the lighting desk, so lighting can paint on top of them, use them as a traditional blinder, or run another layer of chasing tied in with whatever else they are doing.
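Pixel-mapping, as described here, amounts to sampling the media server's output raster down to one RGB value per fixture. A rough illustration of that idea (not Mbox's actual implementation), using nearest-neighbour sampling over a fixture grid:

```python
# Hypothetical sketch: downsample an image to one RGB value per fixture.
# `image` is a row-major grid of (r, g, b) tuples; the fixture array is
# `cols` x `rows`. Nearest-neighbour sampling keeps the example minimal.
def pixel_map(image, img_w, img_h, cols, rows):
    """Return a flat list of (r, g, b) values, one per fixture."""
    out = []
    for r in range(rows):
        for c in range(cols):
            x = c * img_w // cols     # source pixel for this fixture column
            y = r * img_h // rows     # source pixel for this fixture row
            out.append(image[y][x])
    return out

# A 2x2 image mapped onto a 2x2 fixture grid passes straight through.
img = [[(255, 0, 0), (0, 255, 0)],
       [(0, 0, 255), (255, 255, 255)]]
print(pixel_map(img, 2, 2, 2, 2))
```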
The reverse of what I just explained is that the lighting guys have the ability to control the CFS Multi Tap Server, which opens up some interesting ideas. Multi Tap is a core piece of software that CFS originally developed for The Beastie Boys. It is built around the idea of a tap: a bunch of small screens residing in one raster, with an effective way to control those individual screens where a typical media server might run out of layers. We used the Color Block mode inside Multi Tap extensively for Bon Jovi; it lets us treat each grille tile as an RGB fixture. Each one of the V-9 Lite LED tiles in the grille, and each one of the tiles in the turn signals, can be targeted and used like a light, like an actual RGB fixture.
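Treating each tile as an RGB fixture comes down to giving every tile in the raster its own three-channel address. The interview doesn't describe Multi Tap's actual patch or channel layout, so this is purely a hypothetical sketch of the addressing arithmetic:

```python
# Hypothetical sketch: address each tile in a cols-wide raster as an RGB
# fixture occupying three consecutive DMX channels (0-indexed here).
def tile_channels(col: int, row: int, cols: int) -> tuple[int, int, int]:
    """Return the (R, G, B) channel numbers for the tile at (col, row)."""
    index = row * cols + col          # tile number within the raster
    start = index * 3                 # 3 channels per RGB fixture
    return (start, start + 1, start + 2)

def set_tile_color(universe: list[int], col, row, cols, rgb):
    """Write one tile's color into a flat DMX universe."""
    r, g, b = tile_channels(col, row, cols)
    universe[r], universe[g], universe[b] = rgb

universe = [0] * 512
set_tile_color(universe, col=2, row=1, cols=8, rgb=(255, 0, 64))
```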
That is what led to the idea of creating the virtual GLP X4 impression lights, where the lighting guys color them and then, inside Mbox, we do a multiply-layer operation, taking the color coming in off the live input and merging it onto the texture coming from Mbox. That look is for “Born To Be My Baby,” where a picture of an X4 light is created on all the V-9 tiles, and the lighting guys send the color information to the media server; basically, they control the media server to change the color. It makes it look like a wall of lights, of virtual X4 fixtures. The actual grille configuration is a V-9 tile next to a real X4 light, next to a V-9 tile, next to another real X4 light, so when you create these virtual X4s on the tiles, it becomes this huge light wall. It was a really cool look, and it is a great example of the high level of integration between the video and lighting control.
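The multiply operation described above can be illustrated per pixel: each channel of the texture is scaled by the incoming live color, so the white parts of the X4 image take on the tint exactly while darker parts stay dark. A simple 8-bit sketch of multiply blending (standard math, not Mbox's specific pipeline):

```python
# Sketch of an 8-bit multiply blend: scale each texture channel by the
# corresponding channel of an incoming tint color, normalized to 0..255.
def multiply_blend(texture_px: tuple, tint: tuple) -> tuple:
    """Per-channel multiply of a texture pixel by a live-input color."""
    return tuple((t * c) // 255 for t, c in zip(texture_px, tint))

white = (255, 255, 255)        # a bright pixel in the X4 image
gray = (128, 128, 128)         # a dimmer pixel
live_color = (0, 128, 255)     # color sent from the lighting console

print(multiply_blend(white, live_color))   # white takes the tint exactly
print(multiply_blend(gray, live_color))    # darker pixels darken the tint
```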