1. What you will need
The system has three components:
- Mobile devices for user interaction
- A server that translates user localization requests into the commands that implement them
- An Ethernet AVB network with AVB devices (speakers/amplifiers) that respond to server commands
The Immergo system can control the localization of sound sources emanating from various hardware and software devices, for example:
- Mixing consoles
- Digital synthesizers
- Digital Audio Workstations (DAWs)
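The three-component flow above can be sketched in code. This is an illustrative sketch only: the message types, the normalized coordinate ranges, and the distance-based gain rule are assumptions, not the actual Immergo protocol or rendering algorithm.

```python
from dataclasses import dataclass

@dataclass
class LocalizationRequest:
    """Hypothetical request sent from the mobile device to the server."""
    channel: int   # audio channel selected on the mobile interface
    x: float       # horizontal position (assumed normalized -1..1)
    y: float       # horizontal position (assumed normalized -1..1)
    z: float       # vertical position (assumed normalized 0..1)

@dataclass
class SpeakerCommand:
    """Hypothetical per-speaker command sent to the AVB network."""
    speaker_id: int
    gain: float    # linear gain applied to this speaker's feed

def translate(request: LocalizationRequest,
              speakers: list[tuple[int, float, float, float]]) -> list[SpeakerCommand]:
    """Translate a localization request into per-speaker gains.

    Uses a simple inverse-distance weighting as a stand-in for whatever
    panning law the real server implements.
    """
    commands = []
    for sid, sx, sy, sz in speakers:
        d = ((request.x - sx) ** 2 + (request.y - sy) ** 2 + (request.z - sz) ** 2) ** 0.5
        commands.append(SpeakerCommand(sid, 1.0 / (1.0 + d)))
    return commands
```

A speaker coincident with the requested position receives full gain, and more distant speakers receive progressively less.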
2. Localization control
An easy-to-use mobile interface allows a sound engineer to control the localization of the sound sources that are transmitted by these various device types. Buttons on the interface allow the engineer to select particular audio channels and localize the sound from these channels. If the sound source is a DAW, then the transport of the DAW can be controlled from the mobile device.
Using a mobile device to control localization provides a number of advantages:
- There is no restriction on the sound engineer's physical location.
- The sound engineer can move to various positions in the audience space and test the sound localization at these positions.
- For full 3D localization, the sound engineer can use the touch screen for horizontal localization while tilting the mobile device for vertical localization. Indeed, all localization can be performed by tilting the mobile device in three planes. Orientation control is selectable via three check boxes.
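The tilt-based vertical control can be illustrated with a small sketch. The pitch range, the clamping, and the 0..1 elevation scale are assumptions for the example; the actual mapping used by the mobile app may differ.

```python
def tilt_to_elevation(pitch_deg: float) -> float:
    """Map device pitch to a vertical localization coordinate.

    Assumes a usable pitch range of -90..+90 degrees (device face-down
    to face-up) mapped linearly onto an elevation of 0..1.
    """
    pitch_deg = max(-90.0, min(90.0, pitch_deg))  # clamp to the usable range
    return (pitch_deg + 90.0) / 180.0
```

Holding the device flat (0 degrees) places the sound at mid height; tilting past the range limits simply pins the sound at the floor or ceiling.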
3. Soundscape for later playback
If a soundscape is to be created for later playback, then a typical sequence of operations would be:
- Choose a track to localize from the track buttons on the mobile device interface
- Request recording via the Record button
- Ask the DAW to play by clicking the Play button
- Move the sound of the selected track in 3D space. This can be done by:
- Using the touchscreen to move the sound in the horizontal plane
- Tilting the mobile device to move the sound vertically
- Stop the recording, then proceed to further tracks
Time code is displayed on the mobile device during playback and recording, so that it is easy to re-position and re-record.
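The recording sequence above can be sketched as a minimal session object. The class and method names are hypothetical; the real system exchanges these actions with the server and the DAW rather than holding them locally.

```python
class SoundscapeSession:
    """Illustrative sketch of the record-a-track workflow."""

    def __init__(self):
        self.events = []       # recorded (track, timecode, x, y, z) tuples
        self.recording = False
        self.track = None

    def select_track(self, track: int):
        """Step 1: choose a track via the track buttons."""
        self.track = track

    def start_recording(self):
        """Step 2: request recording via the Record button."""
        self.recording = True

    def move(self, timecode: str, x: float, y: float, z: float):
        """Steps 3-4: while the DAW plays, capture each 3D move."""
        if self.recording and self.track is not None:
            self.events.append((self.track, timecode, x, y, z))

    def stop_recording(self):
        """Step 5: stop, then proceed to further tracks."""
        self.recording = False
```

Moves issued outside a recording pass are simply ignored, mirroring the fact that localization can be auditioned without being captured.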
During the recording of user-controlled sound localization, 3D coordinates are sent from the mobile device to the server. The server time stamps these 3D coordinates using MIDI Time Code (MTC) from the DAW. These track recordings can be played back, and can be saved in a number of file formats.
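The time-stamping step can be illustrated as follows. MTC carries a position as hours:minutes:seconds:frames; the 25 fps frame rate and the record layout here are assumptions for the example, since MTC supports several frame rates and the server's actual storage format is not specified.

```python
FPS = 25  # assumed MTC frame rate for this example

def mtc_to_frames(h: int, m: int, s: int, f: int, fps: int = FPS) -> int:
    """Convert an MTC position (hours, minutes, seconds, frames)
    to an absolute frame count from zero."""
    return ((h * 60 + m) * 60 + s) * fps + f

def stamp(coords: tuple, h: int, m: int, s: int, f: int) -> dict:
    """Attach an MTC-derived frame stamp to a 3D coordinate triple,
    producing one record of the kind a track recording might hold."""
    return {"frame": mtc_to_frames(h, m, s, f), "xyz": coords}
```

Storing an absolute frame count rather than the raw fields makes re-positioning and re-recording straightforward: records from a new pass can overwrite old ones by frame range.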