Theater Control

An Immergo system can be used to localize sound sources in theater productions and theme parks. The location of sounds can be controlled as the show progresses, or soundscapes can be recorded and played back during a show. 

1. What you will need

The system has three components:

  • Mobile devices for user interaction 
  • A server that translates user localization requests into the commands that carry them out (see the sketch after this list)
  • An Ethernet AVB network with AVB devices (speakers/amplifiers) that respond to server commands
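
As an illustration of the server's role, the sketch below translates a localization request from a mobile device into per-speaker gain commands. The request format, the speaker layout, and the simple inverse-distance panning law are assumptions made for the example; they are not a description of Immergo's actual protocol or rendering algorithm.

    # Minimal sketch: translate a localization request into per-speaker gains.
    # The request format, speaker layout, and panning law are illustrative only.
    import math
    from dataclasses import dataclass

    @dataclass
    class LocalizationRequest:
        track: int        # track selected on the mobile device
        position: tuple   # requested (x, y, z) source position in metres

    # Hypothetical speaker positions in the venue, in metres.
    SPEAKERS = {
        "front_left":  (-3.0,  4.0, 2.0),
        "front_right": ( 3.0,  4.0, 2.0),
        "rear_left":   (-3.0, -4.0, 2.0),
        "rear_right":  ( 3.0, -4.0, 2.0),
    }

    def gains_for_request(req: LocalizationRequest) -> dict:
        """Return a speaker -> gain map using simple inverse-distance weighting."""
        weights = {}
        for name, (sx, sy, sz) in SPEAKERS.items():
            dx, dy, dz = req.position[0] - sx, req.position[1] - sy, req.position[2] - sz
            weights[name] = 1.0 / max(math.sqrt(dx * dx + dy * dy + dz * dz), 0.1)
        total = sum(weights.values())
        return {name: w / total for name, w in weights.items()}

    # Example: a request placing track 1 near the front-left speaker.
    print(gains_for_request(LocalizationRequest(track=1, position=(-2.0, 3.0, 2.0))))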

2. Localization control

An easy-to-use mobile interface allows a user to control the transport of a Digital Audio Workstation (DAW) on the remote server and to select tracks for localization.
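
As a purely illustrative sketch, the commands sent from the mobile interface to the server could look like the messages below. The field names, the port number, and the use of JSON over UDP are assumptions for the example, not Immergo's actual protocol.

    # Illustrative only: JSON command messages from the mobile interface to the
    # server. Field names, transport (UDP), and port number are assumptions.
    import json
    import socket

    SERVER = ("127.0.0.1", 9000)   # hypothetical server address and port

    def send_command(command: dict) -> None:
        """Encode a command as JSON and send it to the server over UDP."""
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.sendto(json.dumps(command).encode("utf-8"), SERVER)

    # Select track 3 for localization, then start DAW playback.
    send_command({"type": "select_track", "track": 3})
    send_command({"type": "transport", "action": "play"})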

The Immergo system can also control the localization of sound sources that emanate from other devices with Ethernet AVB interfaces, such as mixing consoles and sound synthesizers. In this case, the track button selections correspond to selections of audio channels transmitted by those devices.
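
In that configuration, the mapping from track buttons to transmitted channels can be thought of as a simple lookup table, as in the sketch below; the device names and channel numbers are placeholders.

    # Illustrative mapping from mobile-interface track buttons to audio channels
    # transmitted by external AVB devices. Names and channel numbers are placeholders.
    TRACK_SOURCES = {
        1: {"device": "mixing_console", "channel": 7},
        2: {"device": "mixing_console", "channel": 8},
        3: {"device": "synthesizer",    "channel": 1},
    }

    def source_for_track(track: int) -> dict:
        """Look up which AVB device and channel a track button selects."""
        return TRACK_SOURCES[track]

    print(source_for_track(3))   # {'device': 'synthesizer', 'channel': 1}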

3. Advantages

Using a mobile device to control localization provides a number of advantages:

  • There is no restriction on the sound engineer's physical location: the engineer can move to different positions in the audience space and test the sound localization from each of them.

  • For full 3D localization, the sound engineer can use the touch screen for horizontal localization while simultaneously tilting the mobile device for vertical localization (see the sketch after this list).

  • Alternatively, all localization can be performed by tilting the mobile device in three planes. Orientation control is selectable via three check boxes.
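
The sketch below shows one way a touch position and the device's pitch angle could be combined into a single 3D source position. The normalized touch coordinates, the tilt-to-height mapping, and the room dimensions are assumptions made for the example.

    # Illustrative mapping from touchscreen position and device tilt to a 3D
    # source position. Axis conventions, ranges, and scaling are assumptions.
    import math

    ROOM = {"width": 10.0, "depth": 10.0, "height": 5.0}   # hypothetical venue size, metres

    def position_from_controls(touch_x: float, touch_y: float, pitch_deg: float) -> tuple:
        """touch_x and touch_y are normalized 0..1 screen coordinates (horizontal
        plane); pitch_deg is the device tilt in degrees (vertical localization)."""
        x = (touch_x - 0.5) * ROOM["width"]    # left/right
        y = (touch_y - 0.5) * ROOM["depth"]    # front/back
        pitch = max(-90.0, min(90.0, pitch_deg))
        z = (math.sin(math.radians(pitch)) + 1.0) / 2.0 * ROOM["height"]
        return (x, y, z)

    # Touch near the front-right corner while tilting the device upward.
    print(position_from_controls(0.9, 0.8, 30.0))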

4. Soundscape for later playback

If a soundscape is to be created for later playback, a typical sequence of operations would be (a sketch of this sequence follows the list):

  • Choose a track to localize from the track buttons on the mobile device interface
  • Request recording via the Record button
  • Ask the DAW to play by clicking the Play button
  • Move the sound of the selected track in 3D space. This can be done by:
    • Using the touchscreen to move the sound in the horizontal plane
    • Tilting the mobile device to move the sound vertically
  • Stop the recording, then proceed to further tracks
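
Read as a client-side script, the same sequence might look like the sketch below. The command names are hypothetical, and each command is simply printed rather than sent over any particular transport.

    # Illustrative walkthrough of the recording sequence. Command names are
    # hypothetical; each command is printed instead of being sent to a server.
    import time

    def send_command(command: dict) -> None:
        print("->", command)   # stand-in for a real network send

    send_command({"type": "select_track", "track": 2})      # choose a track
    send_command({"type": "record", "action": "start"})     # request recording
    send_command({"type": "transport", "action": "play"})   # ask the DAW to play

    # Move the sound of the selected track while the DAW plays.
    for step in range(5):
        x = -2.0 + step                                     # sweep left to right
        send_command({"type": "position", "track": 2, "xyz": (x, 3.0, 2.0)})
        time.sleep(0.1)

    send_command({"type": "record", "action": "stop"})      # stop, then proceed to further tracks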

Time code is displayed on the mobile device during playback and recording, so that it is easy to re-position and re-record.

During the recording of user-controlled sound localization, 3D coordinates are sent from the mobile device to the server. The server time-stamps these 3D coordinates using MIDI Time Code (MTC) from the DAW. These track recordings can be played back and can be saved in a number of file formats.
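
For illustration, the sketch below pairs an incoming 3D coordinate with the most recent MTC time, converted to seconds. The 25 fps frame rate and the record layout are assumptions; real MTC arrives as quarter-frame MIDI messages, which are not modelled here.

    # Illustrative time-stamping of 3D coordinates against MIDI Time Code (MTC).
    # The 25 fps frame rate and the record layout are assumptions for the example.

    def mtc_to_seconds(hours: int, minutes: int, seconds: int, frames: int,
                       fps: float = 25.0) -> float:
        """Convert an MTC time (hh:mm:ss:ff) to seconds."""
        return hours * 3600 + minutes * 60 + seconds + frames / fps

    def stamp_coordinate(mtc_time: tuple, track: int, xyz: tuple) -> dict:
        """Pair an incoming 3D coordinate with the current MTC time."""
        return {"time_s": mtc_to_seconds(*mtc_time), "track": track, "xyz": xyz}

    # A coordinate received while the DAW reports 00:01:30:12.
    print(stamp_coordinate((0, 1, 30, 12), track=2, xyz=(-1.5, 3.0, 2.2)))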