Not sure if you have seen this, but iOS 16 will let you scan your head and ears using the TrueDepth camera on your iPhone/iPad to get a better binaural experience with Dolby Atmos tracks in Apple Music.
https://www.techradar.com/news/apple-is-adding-personalized-audio-to-airpods-by-scanning-your-ears
I think this is why the recommendation is to mix an album in a 7.1.4 studio: the binaural rendering will keep changing and improving until it feels like you are sitting in a 7.1.4 room.
Still, this is rendered with Apple's Spatial Audio renderer rather than the Dolby Atmos renderer, and the binaural experience differs between the two. I welcome some competition here, as it should quickly improve the end-user experience. I also hope this will be made available in Logic Pro soon.
What's your take?