Sometimes things happen at the center, and at other times the edges. We find innovations happening in the cities: the Londons, the Las Vegases, the (dare we say it?) Seattles. We also find innovation at the fringe: in Wise, VA, or Tupelo, MS, or Bay City, TX, where lack of technology and lack of knowledge are the mothers of invention. It can also be found at a different sort of fringe, in Edinburgh, or Ibiza, or Hamburg, where the edges of popular culture bring exposure to new music, new visions, and new mindsets. We also find new ways and means at the edges of technology, a place where PlayStations are used to switch surveillance cams at major concerts, homebrew computers and open access software are creating the projected visuals, and display and lighting technologies more often found in live entertainment are finding their way into the permanent expression of cutting-edge architecture.

So, we'll focus our lens on the fringe and see if we can't dig out some of the gems that are helping to shape our form and palette.


It's hard to avoid the behemoth cultural ripples that spread whenever U2 chooses to take to the road. Now in considering U2, we are decidedly at one of the “centers” of development, yet it is here in the middle that we find an extraordinary manifestation of the edge (and we're not just talking about a guitar player). In visiting Willie Williams, show designer for U2, we were a little surprised not to see any of the recently trumpeted media server technologies involved in the Vertigo Tour's content- and effects-intensive production. Neither was there the usual IMAG-associated switching and processing. Instead, a custom software environment — coded by London-based VJ and performance visualists UVA (United Visual Artists) — anchored the video show. The control interface sprawls expansively across two large plasma monitors, showing a 3D view of the array of amazing video “hippie beads” that are the Barco MiSpheres. Sitting next to it is a Sony PlayStation 2 controller, from which Willie is able to switch and move his low-resolution surveillance cameras. That's right, a PlayStation controller. And software coded by a tiny, very smart company.

“It's a timeline device that triggers generative modules and video sequences,” Williams says. “The application is very clever and extremely user friendly once you make the required mental shift away from the media server assumptions. Ash Nehru is the guy who wrote the system. And every day, he's in here fixing the code. Except he's not here. He has remote control of the desktop, and you'll see him poking about sometimes, cursor moving, code appearing on screen, but he's in San Francisco, or possibly London. Actually, I have no idea where he is, other than the fact that he's not on the tour. Having that VJ spirit in the building is good for the show. We might be able to do a similar thing with a media server, but it wouldn't have the same free-form feeling. On one level this is madness: to reach this level of flexibility, we have built a control setup where there's a slight but genuine potential for the entire system to collapse. With U2, at least we know that as long as we still have the PA, and the house lights, we still have a show.”
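For readers curious what “a timeline device that triggers generative modules” might look like under the hood, here is a purely illustrative sketch in Python. This is not UVA's code; the `Timeline` class and the stand-in modules are invented for illustration only.

```python
import time

class Timeline:
    """A minimal show timeline: cues fire generative modules at set times."""
    def __init__(self):
        self.cues = []  # list of (time_in_seconds, callable), kept sorted

    def add_cue(self, at_seconds, module):
        self.cues.append((at_seconds, module))
        self.cues.sort(key=lambda cue: cue[0])

    def run(self, duration, tick=0.05):
        """Play the timeline in real time, firing each cue once."""
        start = time.monotonic()
        pending = list(self.cues)
        while pending and time.monotonic() - start < duration:
            elapsed = time.monotonic() - start
            # fire every cue whose time has arrived
            while pending and pending[0][0] <= elapsed:
                _, module = pending.pop(0)
                module()
            time.sleep(tick)

# Hypothetical generative modules -- stand-ins for video effects.
def sphere_ripple():
    print("ripple across the sphere array")

def camera_switch():
    print("cut to surveillance cam 3")

show = Timeline()
show.add_cue(0.0, sphere_ripple)
show.add_cue(0.2, camera_switch)
show.run(duration=0.5)
```

The appeal of this structure is exactly what Williams describes: new modules can be dropped in and rewired live, something a fixed media-server cue stack makes much harder.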

This is nothing new for the practitioners at UVA. The collaborative combination of video artistry, coding skills, and technologically innovative use of gear has already provided a glimpse of the future of production design. Their implementation of media with touring acts like Massive Attack, Basement Jaxx, and Oasis has served as a laboratory for exploring new software approaches to “performing” visual design live, as well as providing some of the first implementations of the low-resolution display phenomenon. UVA was one of the first out the door with designs utilizing technologies like MiPIX and Pixxeline Battens to display media in scenic broad strokes. Their freedom from preconceived approaches to video has made them a sought-after expression of the edge.


Let's travel to a different frontier of expression. If pictures can speak a thousand words, then the projected and displayed art of Jenny Holzer represents language taken to the meta level. Her pieces explore the use of language as image, and her implementation of display technologies runs the gamut from large-format film, to DLP projectors, to mass arrays of the oh-so-pedestrian monochromatic LED reader-board strip.

Holzer's genius is often found in the tension that comes from the confluence of a thought-provoking message and an ironically appropriate location. Seeing luminous text regarding the origins of life and spirit projected on the side of a 200-year-old church exposes a huge disconnect. It automatically triggers a dissonance, lending strength and impact to the projected message. The medium becomes an inherent element of the piece's impact. Some of her newest work, at the SoHo Guggenheim, redefines the possibilities and communicative potential of total media immersion. She has reopened the can of worms known as virtual reality, but in her medium/message interweaving, the viewer finds themselves perhaps in virtual, visual metaphor rather than “reality.” We think this sort of experience also blends what is art and what is theatre, and perhaps points the way to what theatre is becoming.


Drifting back to the center, to the aforementioned missing media servers, we can see some of the edge at work again. All of the media servers benefit from vigorous industry beta testing. But at Green Hippo, they have taken it one step further in terms of the collaborative development of their Hippotizer platform. The Hippotizer benefits from open source development. The programming team for the Hippo is scattered across the globe, with participants adding patches and functionality from London, the US, Asia, and Continental Europe.

The result? The Hippo's summer software update has taken giant leaps forward. The workload is divided and conquered by a team guided by principles of efficient, clean code and ultra-cool functionality, and the end product benefits. Since the effects in the Hippo are created using an open-source development platform that works with many different image-editing applications, another phenomenon is a new plethora of video modes and effects being implemented. Some are generated from within Green Hippo. But many are not, instead coming from the ultra-fertile ground of the VJs and other live-performance visual artists.
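The collaborative model described above — scattered contributors each adding effects through a common interface — is the classic plugin-registry pattern. Here is a minimal sketch in Python; the registry and the toy effect functions are invented for illustration and bear no relation to Green Hippo's actual code.

```python
# Minimal plugin registry: contributors add video effects by decorating
# a function; the host enumerates and applies them by name.
EFFECTS = {}

def effect(name):
    """Decorator that registers an effect under a public name."""
    def register(func):
        EFFECTS[name] = func
        return func
    return register

# Two toy "effects" operating on a frame represented as a list of
# pixel brightness values (0-255).
@effect("invert")
def invert(frame):
    return [255 - p for p in frame]

@effect("threshold")
def threshold(frame, cutoff=128):
    return [255 if p >= cutoff else 0 for p in frame]

def apply_effect(name, frame, **params):
    return EFFECTS[name](frame, **params)

frame = [0, 64, 200, 255]
print(apply_effect("invert", frame))                 # [255, 191, 55, 0]
print(apply_effect("threshold", frame, cutoff=100))  # [0, 0, 255, 255]
```

Because new effects only have to register themselves against a shared interface, a VJ in one city can contribute a module without ever touching the host application's core code — which is precisely what makes the distributed development described above workable.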

This amazing methodology for developing and utilizing cool code chunks is also fundamental to lots of other arts groups. Look to Troika Ranch (among this year's EDDY Award winners), as well as the amazing work of projection designer Holger Forterer. Forterer's use of procedurally triggered animations for Cirque du Soleil is yet another example of art collaborating with code.


We've been working in Japan lately. The Japanese may be best known for conquering the US markets for automobiles and televisions, but these days, Japan has been flattened out of the manufacturing market, following the US into a global market position where its most valuable exports are its ideas and culture. Tokyo and Osaka are becoming cities of the future in many ways. You wander around seeing extraordinary juxtapositions of very old and very new, mud bricks and bamboo flowing into LEDs and composite surface geometry.

Our trips to Japan revolve around the implementation of lighting and display technologies as architectural features. The Chanel building in the Ginza district shows an entire façade covered by monochromatic pixels at low resolution, a display that makes the NASDAQ look meek in comparison. It doesn't matter that the NASDAQ is full color and showing the latest financial news. What's most powerful about the Chanel building is that it's not all that. It's powerful because it's fundamental, it's textural, it's simple, and it's huge.

Walking around Tokyo's Shinjuku, Roppongi, and Shibuya districts, it becomes clear that it's all been done here. Big water features with massive LED lighting? Check; that went in two years ago. Projection on water or waterfalls? Oh yeah. Building-sized implementation of low-res LEDs? Mmm hmm. Massive projected images on glass walls, surrounded by tons of automated lighting? Yup. And this is not gratuitous application, it's not primitive, hang-a-TV-on-the-side-of-your-building implementation. This is media and light woven together with steel and wood, pixels mixed with mortar, lumens with glass, into one powerful expressive whole. It is scenic design at a breathtaking scale. And it is visual theatre. And it is branding reinforcement. And it is architecture. And it is the future! We've been so preoccupied with the micro level of technology convergence within our industry that perhaps we are missing the macro-sized convergence happening between entertainment design, art, and architecture.


All Designers, Technicians, Manufacturers, Distributors & Entertainment Technology Geeks:

Got an important industry issue to address? Or something you just want to get off your chest? Entertainment Design is always looking for more contributors to its monthly columns. If you can write and want to share your views, send your ideas to: