Program Terminated. Fatal Exception Fault. Process Error. Not enough memory to complete operation.

Translation: I've croaked!

Don't act so naïve. You know darn well what these statements mean, and you've seen them all before. The real problem starts when your lighting console begins to spit such epitaphs at you. At an inconvenient time, they are mildly annoying. Seeing them after a 19-hour programming session can make you want to perform some sort of ancient Japanese sword ritual on yourself.

In a world where flattery will get you everywhere, these dire warnings usually send a lighting desk on a one-way trip down the nearest flight of stairs. However, such error messages are in reality a form of flattery. They congratulate the winning recipient on the fact that he or she has cracked the secret combination of key presses that renders this fine piece of high technology absolutely useless. The task of whittling down any and all possible secret combinations most often lies with a very special group of people known as “the beta testers.”

Beta testers are a unique breed of geek. While most people try to dodge dangerous button presses, the beta tester basks in the glow of a “blue screen of death.” They seek out computer crashes under simulated show situations so that the virtual reality remains virtual and doesn't rear its ugly head during your show. For a beta tester, it isn't enough just to suggest the implementation of a tricky new software tool. They get their kicks from stopping a console dead in its tracks; the true prize is finding the elusive, repeatable bug. That's the series of button presses that will immobilize a console's software time and time again. The main reason this is so enjoyable is that repeatable bugs are usually the easiest to document and fix.

Beta testing is very much a two-way street, because finding bugs means absolutely nothing if no action is taken to fix them. In this regard, the software developer, more often than not, bears the heavy burden of cranking out new software like an assembly line. The real hassle in all of this lies within the lines of digital code that comprise the actual software, which can potentially number in the millions. Change a single line of code, and there's no telling whether the fix for one bug will cause a domino effect of other problems elsewhere in the code.

Staying on top of all of this is not only a daunting task but also the answer to the most obvious question: if there is so much beta testing being done, then why has every programmer and designer on the planet witnessed multiple console crashes on their shows? Understanding that task is essential to having some sympathy for software developers. It is almost mathematically impossible to predict what could cause a software crash under every possible situation.

I've had the pleasure of being involved in the beta-testing process of many lighting consoles developed over the past decade, and two console manufacturers stand out in my mind as having been incredibly savvy in directing the whole process of testing software. Both MA Lighting and Martin Professional have done “exceptional” jobs.

I put the word exceptional in quotes because I don't want to catch flak from people about other manufacturers' performances with regard to beta testing software. Save the political correctness for when you're up for the Nobel Peace Prize. The fact of the matter is that some people pull off tasks better than others, and in my opinion these two groups have really shone on the beta-testing front; the software they produce is a testament to that hard work.

In both cases, the undisputed key to success has come not only from listening to the market's needs but also from recruiting professional programmers to beta-test the software under simulated show situations. There is a huge difference between having some geek in a cubicle test something versus having some geek at front-of-house testing it. Beta testers who actually work with the equipment find the bugs that are wildly elusive to software writers.

MA Lighting took an interesting approach to its beta-testing process when creating the grandMA console. At first, it brought in programmers from all around the world to test-drive the software. In turn, these programmers beat the living hell out of it. MA Lighting responded to all of this input by putting its software development team in direct contact with the beta testers. The result not only got existing bugs fixed faster but also helped to find and eliminate that domino effect I mentioned earlier. Not only did the company listen to the people who would be running the console in real-world situations, it also listened to and heavily supported its distributors as to what other customers were asking for.

The true stroke of simple genius came about a year later, when MA gathered all the beta testers together at an undisclosed location for a think-tank meeting. This weeklong getaway would later result in the grandMA version 4.0 software. In the end, version 4.0 brought forth some absolutely revolutionary console concepts, which are sure to be copied and manipulated by the competition for many years to come.

Martin Professional has also done an outstanding job of directing the beta-test process of its new Maxxyz desk. The company took a very similar approach in bringing in programmers from around the world, and it has produced what I consider an exceptional product. It's incredible to see just how much Martin has asked for feedback directly from the marketplace and implemented that feedback into a tangible product. However, it's not really the relationship between software and lighting programmer that comes to mind here. What is truly insane is the sheer speed at which software is being produced over in Legoland.

I was given a brief opportunity to see the Maxxyz software in its infancy and was very unimpressed with its direction. Apparently I wasn't the only one who had expressed less-than-positive enthusiasm, because not more than two weeks later I was asked to take a look at its revision. The difference was not only drastic but startling. The writers of the software had completely rewritten the base philosophy of how the console would operate. I was quite taken aback by the sheer speed of this task, and Martin has kept that pace going through every step of its beta testing. Now, when I look back at the version of software that I saw at LDI last year versus what I'm seeing today, there is simply no comparison. The release of Maxxyz version 1.0 is light-years ahead of what was shown 12 months ago.

So the next thing I'm going to catch flak about is what the big deal is with the Maxxyz. People already say to me, “It's only in version 1.0, and that means that it can't do this, that, and the other thing that I need it to do.” Those people are absolutely right. But I heard the same thing about the grandMA, too, and those boys continue to come shining through with their software updates.

The Maxxyz may very well be missing a feature or five that you consider critical. However, when you look back at the shaky state of the version 1.0 Wholehog II software a few years back and compare that to where the Maxxyz is today, there is reason to be optimistic about the next six months and beyond.

And if you're going to argue with me on that point, then you might as well go all the way and tell me what a failure you think the Hog II was as well. Somehow, I don't think you'll do that.


All Designers, Technicians, Manufacturers, Distributors, Groupies, Hangers-On, & Entertainment Technology Geeks:

Got an idea you want to share with your peers? An important industry issue you want to address? Or something you just want to get off your chest? Entertainment Design is always looking for more contributors to its monthly On Lighting, On Audio, and On Projection columns. If you can write and want to share your views with ED readers, please send your ideas to David Johnson at