The present building of London's Royal Opera House is the third theatre on the Covent Garden site: its two predecessors were each destroyed by fire, in 1808 and 1856. The site's rich history began in 1728, when John Rich commissioned The Beggar's Opera from John Gay, and the success of the venture provided the capital to build the first Theatre Royal. For its first hundred years or so, the theatre was primarily a playhouse; the first serious musical works performed there were operas by Handel, who composed for the theatre from 1735 until his death in 1759. It became the Royal Opera House in 1892.
During World War I, the theatre was requisitioned by the Ministry of Works for use as a furniture repository, and in World War II, it became a Mecca dance hall. It reopened on February 20, 1946, with a performance of The Sleeping Beauty in a sumptuous new production designed by Oliver Messel.
The Royal Opera House, as it stands today, opened on December 4, 1999. Reconstruction began on site in 1996, and the last performance in the old house took place in July 1997. In three years, the most inadequate of the world's great opera houses was transformed, not only for audiences but equally for performers and the hundreds of other people who work here. By remaining on its historic site, the Royal Opera House has enriched Covent Garden and reinforced its status as part of London's cultural heartland. It is a world-famous center for the performing arts and home to the UK's prestigious Royal Opera and Royal Ballet companies. The greatest names in ballet and opera have performed on its stage, but behind the scenes, an equally talented army of technicians, designers, and producers brings each production to life.
The theatre may be steeped in tradition, but behind the Royal Opera House's Victorian façade lies a thoroughly modern innovation: it is the first opera house in the world to make virtual reality a key part of its production design process.
One challenge in achieving this was the modernization of key production processes, including lighting and lighting design. In the past, setting up the lights required a full crew and hours of intricate work on stage to focus the 300 or so conventional lights over and around the stage. As productions became more complex and schedules grew tighter, the theatre's technicians needed a new way to work, and a partial solution in the new house was an overhead rig composed mainly of moving lights. That started to solve the problem, but rehearsals and setup still consumed a great deal of time.
Another solution was innovative: a virtual version of the Royal Opera House in which lighting and set changes can be designed and animated at the click of a mouse.
The house team, led by head of technical Geoff Wheel, David Harvey, and me, invested one year in producing a highly accurate 3D model of the Royal Opera House's auditorium using advanced software including Autodesk Maya and AutoCAD. This allowed us to clone the main stage within the virtual reality (VR) facility, creating a dedicated environment in which production designers are now able to focus on perfecting the look of their shows.
For new productions, as soon as the AutoCAD drawings are finalized and ready to be handed to the set builders, I get a copy of the files and build an accurate 3D model of the set in Maya, complete with all its textures and requirements. The model then has to be optimized for the 3D virtual environment: it must be polygon-efficient so that our pre-visualization runs at realtime speed. With too many polygons, the system can grind to a halt under the extra calculation. After experimenting and testing, we concluded that keeping the entire model to fewer than 100,000 polygons gives us a very efficient model. Much of the 3D detail can be achieved by using texture mapping to create the illusion of three-dimensional shape.
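As a rough sketch of that budget check (the set-piece names and counts below are invented for illustration; in practice Maya's own polygon counter does this job for us):

```python
POLY_BUDGET = 100_000  # the ceiling we settled on for smooth realtime playback

def check_poly_budget(pieces):
    """pieces maps set-piece name -> polygon count.
    Returns (total, within_budget)."""
    total = sum(pieces.values())
    within = total <= POLY_BUDGET
    # List the heaviest pieces first: they are the prime candidates for
    # baking geometric detail into texture maps instead of polygons.
    for name, count in sorted(pieces.items(), key=lambda kv: -kv[1]):
        print(f"  {name:<12} {count:>7,} polygons")
    print(f"Total: {total:,} / {POLY_BUDGET:,} -> {'OK' if within else 'simplify'}")
    return total, within

check_poly_budget({"staircase": 48_000, "portal": 31_000, "props": 12_000, "cloth": 2_000})
```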
Once the entire 3D model is finalized and optimized, it is exported into ESP Vision, where it joins our virtual 3D model of the house and its entire lighting system. There, it can be saved as individual scenes or acts, or, as we have recently started to do, we can assign DMX channels to pieces of the set and use the lighting board to drive the scene changes. We can then run lighting cues and scene changes at the same time, giving us a view of how the show will look.
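The idea behind patching scenery to the lighting board can be sketched like this. Everything here is illustrative: the channel numbers and piece names are invented, and the real patch lives inside ESP Vision and the console, not in code.

```python
# DMX levels run 0-255; in this sketch, 0 means a piece is fully off its
# mark and 255 fully on, so the console can "fade" scenery like a light.
PATCH = {101: "front_cloth", 102: "staircase", 103: "garden_flat"}  # hypothetical patch

def apply_dmx_frame(frame):
    """Translate raw DMX levels into each patched piece's position (0.0 to 1.0)."""
    return {PATCH[ch]: level / 255 for ch, level in frame.items() if ch in PATCH}

# A cue that flies the front cloth out while the staircase tracks fully on:
print(apply_dmx_frame({101: 0, 102: 255}))
```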
We chose ESP Vision software for our pre-visualizations because it gives us instant feedback through realtime rendering of how the lights behave, and seeing the shadows rendered gives us, and especially lighting designers, a real sense of reality. We often find ourselves immersed in this virtual world, and it's easy to forget that it is virtual. The most satisfying moments come when, after saving the show and loading it on the main house lighting console, we bring up all the lights focused in our VR studio and find the match almost perfect in 90% of cases. The other 10% comes down to the fact that, in the virtual world, every set piece sits exactly where the set designer placed it. In the real world, that piece might stand a few inches away, or the lighting bar might hang a few inches higher, and a light throw just a few inches off at the source can land a foot or two away from its target. That is exactly where we get that 10%.
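A back-of-envelope calculation shows why those few inches matter. The angle and offsets here are illustrative numbers of my own, not measurements from the house:

```python
import math

def landing_shift(bar_error_inches, beam_angle_deg):
    """Horizontal shift of a beam's landing point when the lighting bar
    hangs bar_error_inches higher (or lower) than modeled, for a beam
    striking the deck at beam_angle_deg from the vertical."""
    return bar_error_inches * math.tan(math.radians(beam_angle_deg))

# A bar hung 6 in. high, on a long 70-degree throw, moves the hit point
# by well over a foot:
shift = landing_shift(6, 70)
print(f"{shift:.1f} in ({shift / 12:.1f} ft)")
```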
It's sometimes satisfying to sit in the lighting control box, watching the lights in the real world moving around the set and comparing that image with the virtual lights moving around the virtual model on our second ESP Vision system in the room. The match is so close that our board operators often use the monitor to focus their moving lights. And the big bonus? The operators can “fly” virtually above the stage and even go inside the luminaires to see where the light is focused. That is invaluable when they try to focus one of our many moving lights while the fire curtain is in or some other obstruction is in the way. Often, the operator simply asks whoever is in charge of focusing where a light should go and is able to put it there “blind.”
With our revivals, the process is much the same. If we don't already have the show modeled in 3D, we build a model, bring up the cues, and see where the lights are focused. We take screen grabs of the light positions to archive for future reference. As our rig changes over the years, some old productions were lit on an earlier rig layout, so we can load the show with the old rig in the virtual environment and see where the since-replaced lights were focused. We take screen grabs, and once all the lights are archived, we load the show with the latest lighting rig. By comparing against the screen grabs, we can match the focus accurately.
The ability to archive the positions of our moving lights gives us a good database of focus positions, as no lighting board yet offers this facility when printing show information. This is very useful when we do co-productions: we can give our partners a full package with comprehensive focus notes for the entire show. In the future, as more theatres jump on the virtual-world bandwagon, we could simply email the file with all the lighting cues and the 3D model to be loaded on a workstation, and another team could run the show virtually, taking notes and seeing what the lights are doing. Spooky, but it can be done.
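A minimal sketch of what such a focus database could look like as a file we might email to a partner theatre; the cue structure mirrors how we think about it, but the fixture names and pan/tilt values are invented:

```python
import json
import os
import tempfile

# Hypothetical focus archive: per cue, each moving light's pan/tilt in degrees.
archive = {
    "show": "The Sleeping Beauty",
    "rig_version": "example-rig",
    "cues": {
        "1":  {"spot_1": {"pan": 42.5, "tilt": -18.0}},
        "12": {"spot_1": {"pan": -10.0, "tilt": 35.5},
               "wash_3": {"pan": 5.0, "tilt": 60.0}},
    },
}

def save_archive(path, data):
    """Write the archive out as human-readable JSON."""
    with open(path, "w") as f:
        json.dump(data, f, indent=2)

def load_cue_focus(path, cue):
    """Return the archived focus positions for one cue."""
    with open(path) as f:
        return json.load(f)["cues"][cue]

path = os.path.join(tempfile.gettempdir(), "sleeping_beauty_focus.json")
save_archive(path, archive)
print(load_cue_focus(path, "12"))
```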
So, why light opera virtually? The answer is simple when you look at the advantages. Time saving is a huge one: ours is a repertoire house, with a very large number of operas and ballets coming in and out all year. Sometimes we have four or five shows running concurrently, and time on stage is precious and short. With a virtual opera house, we can play with our 3D models in realtime, experimenting with what can be achieved on stage, all before the real set is even built. If the designer spots a problem with sightlines or any lighting issue, we can quickly modify the model and send the new version of that piece of the set to the set builders to be rectified. The last thing we want is to get on stage, find that some part of the set is useless or in the way, and then pay more to have it modified.
Saving labor costs, of course, is another immense benefit: you no longer waste time on stage focusing the moving-light rig with 40 technicians hanging around while only the lighting board operator and the lighting designer are actually working. I think pre-viz also fuels creativity, as lighting designers can arrive with most of the rig already focused and spend more time concentrating on the design side of the lighting concept.
Although the system at the Royal Opera House was originally devised as a lighting design tool, the VR facility has also proved ideal for coordinating scenery changes and moving set pieces. The virtual stage can be viewed from any seat in the house, allowing designers to evaluate their work from the audience's perspective before it goes live.
Our system uses an Alienware Area-51 ALX computer with an Intel® Core™ 2 Extreme QX9770 (3.2GHz, overclocked to 4.0GHz; 12MB cache; 1,600MHz FSB) and a high-performance liquid cooling system; dual 1GB NVIDIA GeForce GTX 280 graphics processing units; an NVIDIA nForce 790i Ultra SLI motherboard; and 4GB of Corsair Dominator dual-channel DDR3 SDRAM overclocked to 1,600MHz.
We're constantly pushing to achieve more fluid animations and more detailed models, so the increased speed that SLI gives us is very important. Although SLI is not yet implemented in ESP Vision, we use it a lot with our animations in Maya. “This is not technology for its own sake. It would literally not be possible for the opera house to function as it does now without our VR facility,” said David Harvey in response to a question from one of our partners.
Maya and ESP Vision technology mean our virtual stage has all the properties of its real counterpart. We can move lights, create special effects, and see shadows, all in realtime, and this high level of realism means the design process is still very intuitive. Where the lighting and set movements for a complex production like The Sleeping Beauty would once have been hugely time-consuming, the design team can now come to the VR room and realize a vision in just a few hours.
This technology doesn't just save time and money; it has actually improved the artistic content of our productions. “I have actually programmed in George's cinema, focused a light on a stool, put that disc downstairs, and hit the stool spot-on!” laughs lighting designer Mark Jonathan. “We also ran a major sequence of The Sleeping Beauty like a movie. We added soundtrack and cued in a rolling succession of backcloth and lighting changes. We could show this to the choreographers, the stage managers, and the fly operators and agree on the end result. We didn't have 40 people standing around the stage while we sorted it all out. It's the most colossal saving of time, and I don't know of anyone else in the world who has developed a system so advanced.”
Steve Nolan, lighting designer for British Academy of Film and Television Arts (BAFTA), notes that the VR suite helped his production and his design. “I was able to work out which lights went where and to do what job, where they could focus, and actually do some preset focusing of many of the instruments onto our virtual stage set,” he says. “This gave me a very big head start on the limited programming time that I had, as I had already discovered and solved things that were problematic and established a lot of my focusing. This meant that I was able to immediately get on with the business of plotting cues when the lighting system came online. The production team also found the suite useful in working out audience sightlines, moving cameras and lecterns around to get the best shots, and establishing the height of video screens and masking.”
And another benefit? It doesn't hurt that more pre-viz means less power consumption, and we all know that's better for Mother Earth.