The ImmerGO system comprises a server connected to an Ethernet AVB network, and a mobile device for control. The server is typically an Apple Mac Mini or an Apple MacBook. The mobile device can be any device capable of running a web browser. The mobile device allows a user to control the three-dimensional (3D) position of sound sources. The sound sources can be tracks of a Digital Audio Workstation (DAW) running on the server, or channels of audio transmitted by any other audio device on the network (a sound synthesizer, mixing console, etc.). When the sound sources are derived from a DAW running on the server, the mobile device can:
- Control the transport of the DAW
- Select tracks to be localized in 3D space
- Record and play back the localization movements
On a mobile device, a user can control the horizontal position of a sound source by moving it via the touch screen or by 'swivelling' the device. The vertical position of the sound is controlled by tilting the mobile device up and down. This use of the mobile device's orientation sensors allows intuitive control over the 3D positioning of sound sources.
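As a sketch of how device orientation could map to a source position, the swivel (yaw) and tilt (pitch) angles can be projected onto a hemisphere around the listener. The angle names, ranges, and hemisphere mapping below are illustrative assumptions, not the ImmerGO implementation:

```python
import math

def orientation_to_position(yaw_deg, pitch_deg, radius=1.0):
    """Map device yaw (swivel) and pitch (tilt) to a 3D source
    position on a hemisphere of the given radius.

    yaw_deg:   0 = straight ahead, positive = clockwise swivel
    pitch_deg: 0 = level, 90 = pointing straight up
    (assumed conventions for illustration)
    """
    yaw = math.radians(yaw_deg)
    pitch = math.radians(max(0.0, min(90.0, pitch_deg)))  # clamp the tilt
    x = radius * math.cos(pitch) * math.sin(yaw)   # left/right
    y = radius * math.cos(pitch) * math.cos(yaw)   # front/back
    z = radius * math.sin(pitch)                   # height
    return (x, y, z)

# Tilting the device fully upward places the source directly overhead:
x, y, z = orientation_to_position(0.0, 90.0)
print(round(x, 6), round(y, 6), round(z, 6))  # → 0.0 0.0 1.0
```

Clamping the pitch keeps sources at or above the horizontal plane; a system with speakers below ear level would extend the range accordingly.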
3D coordinates are transmitted by the mobile device to the server. The server uses a sound localization algorithm to determine mix levels for each of the speakers, and transmits these mix levels over the Ethernet AVB network. When recording sound source movements, the server reads MIDI Time Code (MTC) from the DAW, and uses the MTC to time-stamp the 3D coordinates. On playback, the MTC time stamps are used to determine the transmission times of new sound source positions.
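The localization algorithm itself is not specified here. As an illustrative stand-in, a simple inverse-distance amplitude panning scheme can derive per-speaker gains from the 3D coordinates; the speaker layout and gain law below are assumptions for the sketch:

```python
import math

# Hypothetical speaker positions (x, y, z) in metres: four corners plus one overhead
SPEAKERS = [(-2, 2, 0), (2, 2, 0), (-2, -2, 0), (2, -2, 0), (0, 0, 3)]

def mix_levels(source, speakers=SPEAKERS, rolloff=1.0):
    """Return one gain per speaker: louder for speakers nearer the
    source (inverse-distance law), normalized to constant total power."""
    gains = []
    for spk in speakers:
        d = math.dist(source, spk)
        gains.append(1.0 / (d ** rolloff + 1e-6))  # epsilon avoids divide-by-zero
    norm = math.sqrt(sum(g * g for g in gains))
    return [g / norm for g in gains]

# A source at the overhead speaker's position is dominated by that speaker:
gains = mix_levels((0.0, 0.0, 3.0))
print(gains.index(max(gains)))  # → 4
```

A production system would more likely use a panning law such as VBAP, but the data flow — coordinates in, one gain per speaker out — is the same.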
1. What you will need
The system has three components:
- Mobile devices for user interaction
- A server that translates user localization requests into commands that implement them
- An Ethernet AVB network with AVB devices (speakers/amplifiers) that respond to server commands
An easy to use mobile interface allows a user to control the transport of a Digital Audio Workstation (DAW) on the remote server and to select tracks for localization.
The ImmerGO system can also control the localization of sound sources emanating from other devices with Ethernet AVB interfaces, such as mixing consoles and sound synthesizers. In this case, the track button selections correspond to selections of audio channels transmitted by those devices.
Using a mobile device to control localization provides a number of advantages:
- There is no restriction on the sound engineer's physical location. The sound engineer can move to various positions in the audience space and test the sound localization at each of them.
- For full 3D localization, the sound engineer can use the touch screen for horizontal localization while simultaneously tilting the mobile device for vertical localization.
- Indeed, all localization can be performed by tilting the mobile device in three planes. Orientation control is selectable via three check boxes.
4. Soundscape for later playback
If a soundscape is to be created for later playback, then a typical sequence of operation would be:
- Choose a track to localize from the track buttons on the mobile device interface
- Request recording via the Record button
- Ask the DAW to play by clicking the Play button
- Move the sound of the selected track in 3D space. This can be done by:
  - Using the touchscreen to move the sound in the horizontal plane
  - Tilting the mobile device to move the sound vertically
- Stop the recording, then proceed to further tracks
Time code is displayed on the mobile device during playback and recording, so that it is easy to re-position and re-record.
During the recording of user-controlled sound localization, 3D coordinates are sent from the mobile device to the server. The server time-stamps these coordinates using MIDI Time Code (MTC) from the DAW. These track recordings can be played back, and can be saved in a number of file formats.
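A minimal sketch of the record/playback cycle described above, assuming MTC is available as a time in seconds and that a track's movement is stored as a list of time-stamped coordinates (both representations are assumptions for illustration):

```python
# Each recorded movement is a list of (mtc_seconds, (x, y, z)) pairs.

def record_point(track, mtc_seconds, xyz):
    """Time-stamp an incoming 3D coordinate with the current MTC reading."""
    track.append((mtc_seconds, xyz))

def positions_due(track, last_mtc, current_mtc):
    """On playback, return the coordinates whose time stamps fall in the
    interval since the previous MTC reading, ready for transmission."""
    return [xyz for t, xyz in track if last_mtc < t <= current_mtc]

track = []
record_point(track, 0.0, (0.0, 1.0, 0.0))
record_point(track, 0.5, (0.5, 1.0, 0.5))
record_point(track, 1.0, (1.0, 1.0, 1.0))

# Between MTC readings at 0.0 s and 0.6 s, only the 0.5 s point is due:
print(positions_due(track, 0.0, 0.6))  # → [(0.5, 1.0, 0.5)]
```

Because positions are keyed to MTC rather than wall-clock time, playback stays aligned with the DAW even when the transport is relocated, which is what makes the re-position and re-record workflow straightforward.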