Visitors to the uniquely shaped Hall 11.0 booth enjoyed a breathtaking view created by six projectors shooting perfectly mapped video onto a four-sided surface. Total screen area was nine square meters, with only 1.5 meters between the projectors and the screen surface. The seamlessly blended projection was made possible by the software's new 3D Mapping Engine and its patented automatic projection-adjustment method based on inverse transformation.
The 3D Mapping Engine solves common video projection problems, such as projecting video or images onto objects of any geometric complexity. The process can be divided into several simple steps: set up virtual projectors in LIGHTCONVERSE exactly as they are installed in the real world; use the software's Material Editor to map video or images onto a replicated 3D model of the real-world object; and, finally, send DVI signals from the computer to the real-world projectors (3-15 outputs, depending on the LIGHTCONVERSE version). Each virtual projector simultaneously functions as a virtual camera, allowing it to "see" the correctly mapped virtual world and output that view to its real projector. Video sources can be fed live into LIGHTCONVERSE from any media server or played from on-board .avi files.
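The projector/camera duality behind this workflow can be illustrated in miniature. The sketch below is not LIGHTCONVERSE code; it is a hedged, self-contained example (all function names and parameters are hypothetical) of the underlying geometry: a virtual camera placed exactly where the real projector sits views the textured 3D model, and the image it "sees" is what the projector must emit for the content to land correctly on the real object.

```python
# Minimal sketch (assumed, not LIGHTCONVERSE's implementation) of placing a
# virtual camera at a projector's real-world position and projecting a 3D
# point into its image plane.
import numpy as np

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    """World-to-camera (view) matrix for a projector at `eye` aimed at `target`."""
    eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
    f = target - eye
    f /= np.linalg.norm(f)                         # forward axis
    s = np.cross(f, up); s /= np.linalg.norm(s)    # right axis
    u = np.cross(s, f)                             # true up axis
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = s, u, -f
    view[:3, 3] = -view[:3, :3] @ eye              # translate world into camera space
    return view

def project_point(point, view, fov_deg=60.0, aspect=1.0):
    """Map a 3D world point to normalized image coordinates (x, y in [-1, 1])."""
    p = view @ np.append(np.asarray(point, dtype=float), 1.0)
    depth = -p[2]                                  # camera looks down -Z
    scale = 1.0 / np.tan(np.radians(fov_deg) / 2.0)
    return np.array([scale * p[0] / (aspect * depth), scale * p[1] / depth])

# A projector 1.5 m in front of the screen center (as at the booth) sees
# the screen center at the middle of its image.
view = look_at(eye=(0.0, 0.0, 1.5), target=(0.0, 0.0, 0.0))
print(project_point((0.0, 0.0, 0.0), view))        # → [0. 0.]
```

Because each virtual projector is modeled this way, rendering the mapped 3D scene from its exact pose yields a pre-distorted image that appears undistorted on the physical surface.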
Also on display was the LIGHTCONVERSE SERVER-STUDIO, a new hardware solution that, in conjunction with the 3D SHOW PLATFORM software, can visualize up to 1536 fixtures (96 universes) in real time.
The Light Converse team would like to thank all attendees who visited the booth and looks forward to meeting again next year.
For further info please visit:
Find us on Facebook.