Matthew Nielson's sound design draws attention to itself for Round House Theatre's A Wrinkle in Time
Yamaha's 02R sound console still sat in the back of the house but managed to take center stage anyway when Bethesda, Maryland's Round House Theatre staged an adaptation of Madeleine L'Engle's space-warp novel, A Wrinkle in Time. The 1962 novel seems something like a precursor to the Harry Potter phenomenon: a blend of Star Wars-style adventure and stories about and for young readers in the vein of The Hardy Boys and Nancy Drew. Its stage adaptation called for characters with unique vocal characteristics that could only be delivered with the help of the 02R's onboard effects processor.
The play presented the theatre's resident sound designer, Matthew M. Nielson, with an unusual opportunity. “Most of the time, my job is to augment or sweeten sounds as subtly as possible so the audience hears what they need to hear without noticing the technology,” he says. “Here, I got to make audio manipulation a feature in the play.”
Two of the characters in the play—“Mrs. Which” and “Mrs. Who,” played by actors who doubled in other roles—needed to be vocally distinctive. The most distinctive of all had to be Mrs. Which, represented onstage by a metallic sphere held by actor KenYatta Rogers, who spoke the lines. “The script calls for a sphere of light but doesn’t talk about the quality of the voice. We made that up,” says Nielson. “KenYatta wears a standard wireless mic of course, and it goes into the Yamaha 02R digital sound board. The board has a built-in effects processor that allows two different settings per microphone. So we had a standard amplification setting for him for his other scenes and a second setting for Mrs. Which, which was a slight delay plus a pitch shift upward and the ‘chorus reverb’ effect from the board’s SFX library.”
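The Mrs. Which patch Nielson describes (a slight delay, an upward pitch shift, and a chorus-type effect) can be sketched as a few lines of offline DSP. The chain below is an illustrative approximation, not the 02R's actual algorithms; the sample rate, shift interval, delay time, and mix levels are all assumptions.

```python
import numpy as np

SR = 48_000  # sample rate in Hz (assumption)

def pitch_shift(x, semitones):
    """Naive pitch shift by resampling; duration changes along with pitch."""
    ratio = 2 ** (semitones / 12)
    return np.interp(np.arange(0, len(x), ratio), np.arange(len(x)), x)

def delay(x, seconds, mix=0.5):
    """Single-tap delay blended back with the dry signal."""
    n = int(seconds * SR)
    dry = np.concatenate([x, np.zeros(n)])
    wet = np.concatenate([np.zeros(n), x])
    return dry + mix * wet

def chorus(x, depth_ms=8.0, rate_hz=0.8, mix=0.5):
    """Basic chorus: one slowly modulated-delay copy mixed with the dry signal."""
    t = np.arange(len(x)) / SR
    mod = (depth_ms / 1000) * SR * (1 + np.sin(2 * np.pi * rate_hz * t)) / 2
    src = np.clip(np.arange(len(x)) - mod, 0, len(x) - 1)
    return x + mix * np.interp(src, np.arange(len(x)), x)

def mrs_which(voice):
    """Slight delay, pitch up, chorus-style spread (illustrative chain)."""
    return chorus(delay(pitch_shift(voice, +3), 0.04))
```

In a live rig this processing would run per-block with bounded latency; an offline sketch like this only shows the order and character of the effects in the chain.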
The other character treated with live digital manipulation was Mrs. Who, played by Dawn Urula. “Dawn’s Mrs. Who was done the same way, but her pitch shift was downward, and we used a different reverb effect from the library,” Nielson says. “We used the ‘flange reverb’ for that. We also blended in some of her own natural voice to give the effect more body. She has a wonderfully rich, full voice, and it would have been a shame to isolate it out of existence.”
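The wet/dry blend Nielson mentions for Mrs. Who, folding some of the actor's natural voice back in under the processed signal, can be sketched the same way. Again this is only an approximation of the idea; the shift amount, flanger settings, and blend ratio are assumptions, not the show's actual values.

```python
import numpy as np

SR = 48_000  # sample rate in Hz (assumption)

def pitch_shift(x, semitones):
    """Naive resampling pitch shift (duration changes too)."""
    ratio = 2 ** (semitones / 12)
    return np.interp(np.arange(0, len(x), ratio), np.arange(len(x)), x)

def flange(x, depth_ms=3.0, rate_hz=0.25, mix=0.7):
    """Very short modulated delay mixed back in: a basic flanger."""
    t = np.arange(len(x)) / SR
    mod = (depth_ms / 1000) * SR * (1 + np.sin(2 * np.pi * rate_hz * t)) / 2
    src = np.clip(np.arange(len(x)) - mod, 0, len(x) - 1)
    return x + mix * np.interp(src, np.arange(len(x)), x)

def mrs_who(voice, dry_blend=0.4):
    """Pitch the voice down, flange it, then fold some of the
    unprocessed voice back in so its natural body survives."""
    wet = flange(pitch_shift(voice, -3))
    n = min(len(wet), len(voice))  # lengths differ after resampling
    return (1 - dry_blend) * wet[:n] + dry_blend * voice[:n]
```

The `dry_blend` parameter is the part of the trick the quote highlights: at 0.0 the natural voice disappears entirely; raising it restores body at the cost of a less alien effect.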
There were other major voice effects as well, but these were the two that involved realtime manipulation. “There were a couple of other times that we were tempted to use realtime manipulation, but it came down to not wanting to have too many effects,” Nielson says. “We didn't want to overdo it.”
Actress Tonya Beckman Ross played the Man with Red Eyes, a role performed draped in a floor-length cowl with lighted eyes at face level. “Her mouth wasn't visible for that role,” says Nielson, “so she didn't do her line readings live. Instead, she was silent while we played back previously recorded and manipulated tracks of her voice.”
One role was played at different performances by one of two young actors. For most of the show the character’s voice was given normal amplification, but for a big scene in which the character confronts an extraterrestrial intelligence called “IT,” his voice needed to take on a special, nearly spectral quality. “For the big IT effect when the boy is under the control of the evil force, we prerecorded the lines read by both of them in unison and manipulated it for playback during the show,” says Nielson. For the voice of the evil force itself, Nielson recorded Urula, Rogers, and Beckman Ross speaking the lines in unison and then applied a reverb effect to the blend. JJ Kaczynski's projections on scenic designer Misha Kachman’s backdrop gave the voice a visual form.
The show also gave Nielson “a great opportunity to try out new effects with Antares’ vocal modeling harmony generator, the Harmony Engine. The software is mainly used for backup vocals for singers. Here, I could use all four voice channels, patch the same voice into each one, vary it slightly, and then recombine them.”
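The four-channel approach Nielson describes, patching the same voice into each channel, varying it slightly, and recombining, can be illustrated with a simple detune-and-stack. This is not Harmony Engine's actual processing (the plug-in does formant-aware vocal modeling); the detune amounts here are arbitrary assumptions chosen only to show the layering idea.

```python
import numpy as np

SR = 48_000  # sample rate in Hz (assumption)

def detune(x, cents):
    """Resample to shift pitch by the given number of cents."""
    ratio = 2 ** (cents / 1200)
    return np.interp(np.arange(0, len(x), ratio), np.arange(len(x)), x)

def four_voice_stack(voice, cents=(-12, -5, +5, +12)):
    """Send one voice to four channels, detune each copy slightly,
    then average them back together into a thicker composite."""
    copies = [detune(voice, c) for c in cents]
    n = min(len(c) for c in copies)  # trim to the shortest copy
    return sum(c[:n] for c in copies) / len(copies)
```

Averaging the copies keeps the composite at roughly the original level while the small pitch offsets create the beating and width that make one voice sound like several.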
Nielson enjoyed the opportunities presented by the show. “I’ve done effects over voice before, but most often, it is to create an improvement in the voice reaching the audience that isn’t supposed to be noticeable, rather than creating an obvious special effect,” he says. This time out, sound design took center stage.