By Eduardo Patricio
In this post, we present two videos in different formats, both edited from the same source material captured on 20 June 2018 at Barigui Park (Curitiba, Brazil).
The audio was recorded with the ZYLIA ZM-1, a 3rd-order Ambisonics spherical microphone array, while the video was captured with a 360-degree camera (Gear 360).
Below, you can watch both videos and find information on how to achieve the two different results, with a focus on preparing the audio recorded with the ZM-1 microphone for each scenario.
Scenario A: Interactive, immersive video with full 3D sound
(media components: 360-degree video + Ambisonics audio)
Scenario B: Non-interactive video with fixed-perspective 3D sound
(media components: 'tiny planet' video + binaural audio)
The microphone and the camera were placed on a single camera stand with a small clamped extension arm (see picture below). Both devices were aligned vertically with a small horizontal offset. We made sure the microphone and the camera always had the same relative facing direction (front of the microphone aligned with the camera side where the recording button is found).
For scenario B, we used the video from Gear 360 in ‘tiny planet’ format and a binaural audio track.
Since the source material is the same as in scenario A, we will list here only the steps that differ.
Scenario B steps:
Choosing the binaural preset in ZYLIA Studio PRO within REAPER
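For intuition about what decoding an Ambisonics signal to two channels involves, here is a deliberately crude sketch that renders a first-order B-format sample through two virtual cardioid microphones. This is not ZYLIA Studio PRO's binaural algorithm (which uses HRTF-based rendering); the function name and angle parameter are illustrative only:

```python
import math

def virtual_stereo_pair(w, x, y, angle=math.pi / 4):
    """Crude stereo downmix from first-order B-format (FuMa-weighted W)
    using two virtual cardioid microphones aimed at +/- `angle` radians.
    Real binaural rendering uses HRTFs; this only sketches the idea of
    decoding a sound field to two ear signals."""
    left = 0.5 * (w * math.sqrt(2.0) + x * math.cos(angle) + y * math.sin(angle))
    right = 0.5 * (w * math.sqrt(2.0) + x * math.cos(angle) - y * math.sin(angle))
    return left, right
```

A source on the left ends up louder in the left virtual microphone, which is the essential behavior any stereo or binaural decode preserves.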
#ambiencerecording #ambisonics #binaural #soundscapes #immersiveaudio #360recording
Zylia: What is the story behind your project?
Yao: Last spring, I was 9 months away from graduating from Berklee College of Music, and the panic of post-graduation uncertainty was becoming unbearable. I was struggling to plan my career and I wanted to do something different. I spent a whole summer researching the ins and outs of spatial audio and decided to build my Senior Portfolio Project around my research. What I found is that spatial audio mostly appears in VR games and films, recreating a 3D environment. It is rarely used as a tool for music composition and production. I saw my opportunity.
With the help and hard work of my team (around 60 students involved), we succeeded in creating ‘Unraveled’, an immersive 360 audio and visual experience, where the audience finds themselves at the center of all elements, surrounded by choir, strings, synths and imagery. My role was project leader, composer, and executive producer. I found a most talented team of friends to work on this together: Gareth Wong and Deniz Turan as co-producers, Carlos Del Castillo as visual designer, Ben Knorr as music contractor, Paden Osburn as music contractor and conductor, Jeffrey Millonig as lead engineer and Sherry Li as lead vocalist and lyricist. Not to mention the wonderful musicians and choir members. I am truly grateful for their hard work, dedication and focus.
‘Unraveled’ also officially kickstarts my company ICTUS, a company that provides music and sound design content specializing in spatial audio solutions. For immersive experiences such as VR, AR and MR, we are your one-stop audio shop for a soundscape that completes the reality. We provide music composition, sound design, 360 recording, mixing, mastering, post-production, spatialization and visualizing services tailored to your unique project.
We are incredibly humbled that 'Unraveled' has been officially selected for the upcoming 44th Seattle International Film Festival, which runs May 17 to June 10, and to have been accepted for the Art and Technology Exhibition at the Boston Cyberarts Gallery, from Saturday May 26 to Sunday July 1.
"Get a pair of headphones. Cave in somewhere quiet. Alone. Empty your thoughts and… Allow yourself to dive into something new. A pristine place that will truly disconnect you from the daily frenzy of life."
‘Unraveled’ has been officially selected for the upcoming 44th Seattle International Film Festival, which runs May 17 to June 10 (25 days), with more than 400 films from 80 countries and over 155,000 attendees!
Zylia: Recording so many people at once must have been challenging. How did you organize this?
Yao: I worked very closely with Paden Osburn, the conductor and music contractor, to schedule, revise, coordinate and plan the session. Paden is a dear to work with, basically allowing me to focus on the music while she coordinated with the rest of the amazing choir members. We developed a great workflow.
I also had many meetings with the team of engineers as well as many professors to figure out the simplest, most efficient way to record. It was indeed very challenging and stressful to pull off, but it was also one of the most magical nights of my life.
Behind-the-scenes photo by @jamiexu0528.
Zylia: Tell us more about the technical part of this project.
Yao: On October 27, 2017, we had a recording session of the choir parts with 40 students from Berklee College of Music. The recording was done using three Ambisonics microphones (Zylia, Ambeo, TetraMic). We tried forging a 320-piece choir by asking the 40 students to shift their positions around the microphones for every overdub. We also recorded 12 close-miked singers to have some freedom spatializing individual mono sources.
The spatialization was achieved through Facebook360 Spatial Workstation in REAPER. Many sound design elements were created in Ableton and REAPER. The visuals were done in Unity. We basically created a VR game and recorded a 360 video of the performance. Carlos Del Castillo did an outstanding job creating an abstract world that had many moments syncing with the musical cues.
Zylia: What are your plans for the future? Any interesting projects on your mind?
Yao: My long-term goal would be to establish my company ICTUS as one of the leading experts in the field of spatial audio. We are currently working on an interactive VR music experience called ‘Flow’ with an ethnic ensemble, GAIA, and the visuals are influenced by Chinese water paintings. The organic nature of this project will be a nice contrast to ‘Unraveled’s futuristic space vibe.
Another segment of the company is focused on creating high-quality, cinematic spatial audio for VR films and games. We are producing a 3D audio series featuring short horror/thriller stories with music, descriptive narration, dialogue, SFX and soundscapes. Empathy is truly at the heart of this project: some of our stories will have a humanitarian purpose, and we will be associated with many organizations that are fighting to end domestic abuse, human trafficking, rape, and other violent crimes. We hope to bring more awareness and traffic to these causes with our art. Spatial audio is incredibly powerful; it really allows you to be in the shoes of the victims, and without the visuals, I swear your imagination will go crazy!
We are happy to announce a new version of ZYLIA Ambisonics Converter. We introduced a few changes based on your input and suggestions.
We added batch processing: it is now possible to process multiple 19-channel WAV files within a single session.
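A batch step like this is also easy to script around. The sketch below merely scans a folder for 19-channel WAV files (the ZM-1's raw channel count) that could then be handed to the converter; the helper name and folder layout are illustrative assumptions, not part of ZYLIA Ambisonics Converter:

```python
import wave
from pathlib import Path

def find_19_channel_files(folder):
    """Return the names of WAV files in `folder` that have exactly
    19 channels, i.e. candidates for a ZM-1 batch conversion.
    (Hypothetical helper for pre-sorting a recordings folder.)"""
    matches = []
    for path in sorted(Path(folder).glob("*.wav")):
        with wave.open(str(path), "rb") as wav:
            if wav.getnchannels() == 19:
                matches.append(path.name)
    return matches
```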
There are also quality improvements and bug fixes for 2nd- and 3rd-order HOA. This update significantly improves the perceptual effect of rotation in the HOA domain and corrects the spatial resolution for 2nd and 3rd order. We recommend updating to this new version.
By Jakub Zamojski & Lukasz Januszkiewicz
Recording and mixing surround sound is becoming more and more popular. Among popular multichannel representations of surround sound such as 5.1, 7.1, or cinematic 22.2, the Ambisonics format is especially worthy of note: it is a full-sphere spatial audio technique that provides a truly immersive 3D sound experience. You can find more details about Ambisonics here (What is the Ambisonics format?).
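As a quick illustration of what an Ambisonics representation encodes, here is a minimal first-order (B-format) encoder for a mono sample arriving from a given direction. The 1/sqrt(2) weighting on W follows the traditional FuMa convention; modern tools often use ACN/SN3D instead, so treat the exact weights as an assumption:

```python
import math

def encode_first_order(sample, azimuth, elevation):
    """Encode a mono sample into first-order B-format (W, X, Y, Z).
    Angles are in radians; W carries the omnidirectional component
    (FuMa-style 1/sqrt(2) weighting), X/Y/Z the figure-of-eight
    components along the front, left, and up axes."""
    w = sample / math.sqrt(2.0)
    x = sample * math.cos(azimuth) * math.cos(elevation)
    y = sample * math.sin(azimuth) * math.cos(elevation)
    z = sample * math.sin(elevation)
    return w, x, y, z
```

A source straight ahead puts all of its directional energy into X, which is what lets a decoder later reconstruct where the sound came from.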
Our previous blog post, “2nd order Ambisonics Demo VR”, described the process of combining audio and the corresponding 360-degree video into a finished 360 movie on Facebook. The presented approach takes the 8-channel TBE signal from ZYLIA Ambisonics Converter and converts the audio into the Ambisonics domain. As a result, we get a fine 3D sound image that rotates and adapts along with the virtual movement of our position. However, it is still not possible to adjust parameters (gain, EQ correction, etc.) or change the relative positions of the individual sound sources present in the recorded sound scene.
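For intuition, the sound-field rotation just described reduces, for the first-order horizontal components, to a 2-D rotation of the X and Y channels. This is a sketch only: the sign convention is an assumption, and real tools such as the FB360 plug-ins apply full rotation matrices for higher orders:

```python
import math

def rotate_yaw_first_order(w, x, y, z, yaw):
    """Rotate a first-order B-format sample by `yaw` radians about the
    vertical axis. W (omni) and Z (vertical) are unaffected by yaw;
    X and Y rotate in the horizontal plane."""
    x_r = x * math.cos(yaw) - y * math.sin(yaw)
    y_r = x * math.sin(yaw) + y * math.cos(yaw)
    return w, x_r, y_r, z
```

Applying this per sample, driven by head-tracking angles, is conceptually how the sound scene stays locked to the world as the viewer turns.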
In this tutorial we introduce another approach to using the ZYLIA ZM-1 to create a 3D sound recording, one that gives much more flexibility in sound source manipulation. It allows us not only to adjust the positions of instruments in the recorded 3D space around the ZYLIA microphone, but also to control the gain or apply additional effects (EQ, compression, etc.). In this way, we are able to create a rich spatial mix using only one microphone instead of several spot mics!
Spatial Encoding of Sound Sources – Tutorial
At the end of July 2017, we recorded a band called “Trelotechnika” using the ZYLIA ZM-1 microphone. All band members were positioned around the ZM-1: four musicians plus one additional sound source, drums played back from a loudspeaker. During post-production, we applied the ZYLIA Studio PRO VST plug-in (within the REAPER DAW) to the recorded 19-channel audio track. This allowed us to separate the recorded instruments into individual tracks in the DAW. Those tracks were then routed to the FB360 plug-ins, where encoding into the Ambisonics domain was performed.
“Spatial Encoding of Sound Sources” - a step-by-step description
Below, you will find a detailed description of how to run a demo session presenting our approach to recording and spatial encoding of sound sources. The demo works on Mac OS X and Windows.
After opening the session, you will see several tracks:
3. Separated signals from ZYLIA Studio PRO are routed to 5 individual tracks. You can adjust the gain, mute or solo instruments, or apply audio effects. A good practice is to use a high-pass filter on non-bass instruments and a low-pass filter on bass instruments to reduce spill between them. We applied these filters to our session:
4. Spatialiser track – receives the 5 signals from the tracks with separated instruments. The Spatialiser lets you place sound sources at the desired positions in 3D space.
a) Click on FX and choose FB360 Spatialiser.
d) Go back to the Spatialiser view. You will see an equirectangular picture and five numbered circles. Each circle represents a sound source position in space. By default, the sources are located at positions corresponding to the real positions of the instruments in the picture, but you can adjust them by clicking on a circle and dragging it around the picture.
6. The video is now synchronized with the audio. Moving the play-head in REAPER’s timeline will change the video’s position. Tap the space bar to play audio and video. The decoded, binauralized Ambisonics sound follows the rotation of the video in the player.
7. A good practice is to play the video from the beginning of the file to keep the synchronization. In some cases it is necessary to close the VideoClient + VideoPlayer and load the 360 video again to recover synchronization.
8. You can now rotate the video around the pitch and yaw axes. Your demo is ready to run.
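The high-pass/low-pass practice mentioned in step 3 can be sketched with a simple complementary one-pole filter pair. This is an illustration of the idea only; in REAPER you would use a plug-in such as ReaEQ, and production mixes typically use steeper filter slopes:

```python
import math

def low_pass(samples, cutoff_hz, sample_rate):
    """Simple one-pole low-pass, e.g. for a bass instrument track."""
    a = math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    out, state = [], 0.0
    for s in samples:
        state = a * state + (1.0 - a) * s
        out.append(state)
    return out

def high_pass(samples, cutoff_hz, sample_rate):
    """Complementary high-pass for non-bass tracks: the input minus
    its low-passed version, so low-frequency spill is removed."""
    low = low_pass(samples, cutoff_hz, sample_rate)
    return [s - l for s, l in zip(samples, low)]
```

Filtering each separated track this way before spatialization keeps, say, the kick drum out of the vocal track's low end, which is exactly the spill-reduction goal described above.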
By Łukasz Januszkiewicz
Step two - On The Stage
- Microphone placement.
The main ZM-1 microphone was positioned approximately 20 cm below the camera.
Backup ZM-1s were positioned to the right and to the left of the main ZM-1 (at a distance of 4-5 meters). There was also one microphone right in front of the choir.
- The final setup
Step Three – Recording our perfect 360 movie
Step Four - Video And Audio Post-processing
*At the moment, YouTube 360 supports only 1st-order Ambisonics audio.
360 movie with 3D sound - The final effect
The video will be available soon.
How to turn a room into a recording studio?