Listening to Dolby Atmos Streaming, via Tidal/Apple/Amazon

https://music.apple.com/gb/album/the-tears-of-hercules/1585386908
can the rest of Rod's catalogue in Atmos be very far away..? 🤔 i kinda wondered if Tonight's The Night was just a Surround taster.. 💡 we shall see.. 👀

Edit: i'm thinking of the Atlantic Crossing and beyond years, on Warner, not his earlier UMG/Mercury stuff

https://music.apple.com/gb/album/tonights-the-night-gonna-be-alright/320268825?i=320268864
This new one sounds good, but I just can’t listen to these mediocre, soft songs….
 
This new one sounds good, but I just can’t listen to these mediocre, soft songs….

i thought the Atmos mix was good, with cool stuff in the Rears and Heights etc, maybe a bit low on Bass at times idk?

some of the melodies felt a bit 'by the numbers' and his voice was at times tricky to truly enjoy, as he often sounded somewhat in difficulty.. but it's a nice surprise to see it in Atmos and hopefully it means Warners might be of a mind to revisit some of his earlier work in Spatial as well 🤞
 
Hello everyone, seems like a great community here, happy to join.
I'm a mix engineer who is just starting to get into Atmos mixing. In my studio I have a 9.2.4 setup and am able to stream via an Apple TV 4K into a Tonewinner AT-300 receiver, going into an Avid Matrix router to my speakers.

When listening to ADM files straight from the Dolby Renderer, all mixes sound pristine in all channels. However, I've noticed that when streaming from Apple and Tidal, soloing certain surround channels gives me a sort of garbled, digital, artifacty sound, similar to how a low-res MP3 sounds. I notice it most in the heights, but it's sometimes bad in the surrounds for certain songs; usually the L, R and C channels sound clean. I've tried using the Amazon Fire Cube as well, and different HDMI cables, with exactly the same result. My internet connection is very strong.

I have an ADM file of a mix that is also on streaming, so I was able to compare directly, and the difference is a pretty dramatic loss in quality. I'm wondering if anyone else has noticed this, or has any ideas about what is causing it? I have heard it could be related to the Dolby Digital Plus JOC encoding applied when these files are streamed.
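For anyone wanting to put numbers on that kind of channel-by-channel difference, here is a minimal sketch of one approach, assuming you can get both versions as channel-identical multichannel WAV files (the file names and the 10 kHz cutoff are placeholders, and high-frequency energy is only a crude proxy for lossy-codec damage, not a substitute for listening):

```python
# Compare per-channel high-frequency energy between an ADM render and a
# capture of the streamed version. Computing an independent statistic on
# each file sidesteps the need for sample-accurate time alignment.
import numpy as np
import soundfile as sf

def hf_energy_per_channel(path, cutoff_hz=10_000):
    """Fraction of each channel's energy above cutoff_hz."""
    audio, sr = sf.read(path)                      # shape: (frames, channels)
    spectrum = np.abs(np.fft.rfft(audio, axis=0)) ** 2
    freqs = np.fft.rfftfreq(audio.shape[0], d=1.0 / sr)
    return spectrum[freqs >= cutoff_hz].sum(axis=0) / spectrum.sum(axis=0)

adm = hf_energy_per_channel("adm_render_714.wav")        # hypothetical names
stream = hf_energy_per_channel("stream_capture_714.wav")
for ch, (a, s) in enumerate(zip(adm, stream)):
    print(f"ch {ch:2d}: ADM {a:.4f}  stream {s:.4f}  retained {s / a:.0%}")
```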
 
The answer might be as simple as comparing uncompressed music to compressed music.
 
The answer might be as simple as comparing uncompressed music to compressed music.

Yeah, I actually just read something stating that Atmos streams at a 768 kbps bitrate and that most of the data is reserved for the L, C, R and Ls/Rs channels, so that would explain why the heights and other surrounds sound crappy. That's a shame. I hope it improves soon, as it really is a huge difference in quality and will hold the format back.
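This is back-of-envelope arithmetic only (DD+ JOC actually carries a core mix plus object side-information, so the real allocation is smarter than an even split), but a naive division shows how thin 768 kbps is once it's spread across a full speaker layout:

```python
# Naive per-channel budget at the DD+ JOC rate quoted in this thread,
# versus a typical 256 kbps stereo AAC stream. Illustrative only.
ATMOS_STREAM_KBPS = 768
CHANNELS_714 = 12                    # a 7.1.4 layout

print(f"naive Atmos per-channel: {ATMOS_STREAM_KBPS / CHANNELS_714:.0f} kbps")  # ~64
print(f"stereo AAC per-channel:  {256 / 2:.0f} kbps")                           # 128
```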
 
Yeah, I actually just read something stating that Atmos streams at a 768 kbps bitrate and that most of the data is reserved for the L, C, R and Ls/Rs channels, so that would explain why the heights and other surrounds sound crappy. That's a shame. I hope it improves soon, as it really is a huge difference in quality and will hold the format back.
Yes, great post, and I'm not the only one noticing the drop in sound quality; this really should be Apple's priority to fix.
 
I'm a mix engineer who is just starting to get into Atmos mixing. In my studio I have a 9.2.4 setup...
Hi, that sounds nice!
 
Yes, great post, and I'm not the only one noticing the drop in sound quality; this really should be Apple's priority to fix.
TIDAL uses a 768 kbps bit rate for DD+ with Atmos streaming, so no improvement is possible here while still using DD+. I assume this is the maximum average bit rate allowed by the Dolby codec.
I do not know what Apple Music uses.

There are audio quality differences between lossy (streaming DD+ JOC) and lossless (Dolby TrueHD with Atmos on Blu-ray).

The differences are clearly noticed by many of us who compare an Atmos streaming album with the corresponding Blu-ray edition.

I wonder how much of this audible difference is due to the codec format (lossy vs. lossless), and how much could be due to a different mastering process that makes the hi-res edition sound clearly better.

With respect to the use of DD+ in streaming services, I have read somewhere that the choice is a technical constraint: the high bit rate could not be adequately transferred through the available bandwidth of the average internet user.

I do not quite understand this, because:
  • The bit rate of a Dolby TrueHD with Atmos track is 640/760 kb/s variable, with a maximum peak of about 7 or 8 Mb/s, according to MediaInfo on some of my own Blu-rays. A Netflix user needs up to 20 Mb/s to watch UHD content at 4K.
  • If a user's internet link drops below that bandwidth, the video streaming device normally buffers or switches to a lower resolution (HD or less) to cope with the lower bit rate. We notice a temporary degradation of the video quality that improves automatically later; I see this a lot with Amazon Prime Video.
  • BUT what about music streaming with Dolby TrueHD Atmos? Possibly adaptive decoders that lower the resolution/sound quality on the fly simply do not exist.
I wonder whether this is currently a technical constraint or just more market segmentation, to sell the better-quality hi-res Blu-ray separately from the "normal quality" streaming.
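Putting the thread's own figures side by side makes the point plainly (all numbers as quoted above; TrueHD peaks vary per disc):

```python
# Bandwidth comparison using the figures quoted in this thread.
ddp_joc_mbps = 768 / 1000        # lossy Atmos streaming rate
truehd_peak_mbps = 8             # MediaInfo peak seen on some Blu-rays
netflix_uhd_mbps = 20            # typical 4K video recommendation

print(f"DD+ JOC stream:    {ddp_joc_mbps:.2f} Mb/s")
print(f"TrueHD Atmos peak: {truehd_peak_mbps:.1f} Mb/s")
print(f"Netflix 4K budget: {netflix_uhd_mbps} Mb/s")
# Even the TrueHD *peak* sits well under a typical 4K video budget, which
# is the crux of the bandwidth question raised above.
```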
 
TIDAL uses a 768 kbps bit rate for DD+ with Atmos streaming... I wonder whether this is currently a technical constraint or just more market segmentation, to sell the better-quality hi-res Blu-ray separately from the "normal quality" streaming.

I posted about all this a couple of years ago here, here and here, but the short version is that the current DD+ bitrate Tidal and Apple use is nowhere near the maximum, nor is the argument valid that the average home internet connection wouldn't support higher.
 
I posted about all this a couple of years ago here, here and here, but the short version is that the current DD+ bitrate Tidal and Apple use is nowhere near the maximum, nor is the argument valid that the average home internet connection wouldn't support higher.

i can't remember where i heard it now but Netflix supposedly have a setup with multiple different quality versions of each show or movie, and the app detects the end user's internet connection speed and sends the appropriate quality stream accordingly. idk how accurate that is, but if it's doable for Atmos music streaming maybe Apple/Tidal could try it?
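That is essentially how DASH/HLS-style adaptive streaming works: the service publishes a "ladder" of encodes and the player picks a rung based on measured throughput. Here is a minimal sketch of the selection idea, with hypothetical audio-quality rungs (real players re-measure continuously from segment download times):

```python
# Pick the highest-quality encode that fits the measured link speed,
# leaving some safety headroom. Ladder values are hypothetical.
LADDER_KBPS = [256, 768, 1536, 4608]

def pick_rung(throughput_kbps, headroom=0.8):
    budget = throughput_kbps * headroom
    usable = [r for r in LADDER_KBPS if r <= budget]
    return usable[-1] if usable else LADDER_KBPS[0]

for link in (500, 2_000, 10_000, 100_000):
    print(f"{link:>7} kbps link -> {pick_rung(link)} kbps stream")
```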
 
i can't remember where i heard it now but Netflix supposedly have a setup with multiple different quality versions of each show or movie, and the app detects the end user's internet connection speed and sends the appropriate quality stream accordingly. idk how accurate that is, but if it's doable for Atmos music streaming maybe Apple/Tidal could try it?

Whatever technology Neil Young's website uses to adjust audio quality based on your internet connection speed works great; I would love to see it adapted by the big streaming services.

Are there any other services that allow Atmos streaming through Apple TV? And if so, are they any better? I know Amazon streams Atmos, but it doesn't seem you can do it through the app (wtf??)
 
TIDAL uses a 768 kbps bit rate for DD+ with Atmos streaming... I do not know what Apple Music uses.
FWIW, Apple is the same - DD+ @ 768 kbps - except the ATV4K converts it to 7.1 24/48 PCM (with embedded Atmos metadata) for transmission to an AVR.

My guess is Apple does this for 2 reasons.

One, many people - even many on this forum - already get completely confused by the process. Providing more options - even seemingly idiot-proof options - will only make that worse.

Two...control. More options means less control.
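Some rough arithmetic on that DD+-to-LPCM conversion, assuming the 7.1 channels at 24-bit/48 kHz described above:

```python
# What the ATV4K's decoded 7.1 LPCM output occupies on the HDMI link,
# versus the 768 kbps DD+ JOC stream it was decoded from.
channels, sample_rate, bit_depth = 8, 48_000, 24

pcm_mbps = channels * sample_rate * bit_depth / 1_000_000
print(f"7.1 LPCM over HDMI: {pcm_mbps:.2f} Mb/s")   # ~9.22 Mb/s
print(f"DD+ JOC source:     {768 / 1000:.2f} Mb/s")
# Note: decoding to PCM can't restore detail the lossy DD+ encode discarded.
```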
 
I posted about all this a couple of years ago........ and here, but the short version is that the current DD+ bitrate Tidal and Apple use is nowhere near the maximum, nor is the argument valid that the average home internet connection wouldn't support higher.
A little confused about your explanation in the link above.

A bit rate for the audio track of 6,500 kb/s means 6,500 kilobits/sec, thus 6.5 Mb/s of internet connection (not accounting for overhead).
But you say the internet connection needed is ~52 Mbps, as if the audio track bit rate were expressed in bytes, not bits.

When I read MediaInfo's "Maximum bit rate: 7.500 kb/s", I interpret it as 7,500 kilobits per second, i.e. 7.5 Mbps of internet bandwidth.

Am I wrong?
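For what it's worth, the ~8x gap between those two readings is exactly the bits-versus-bytes factor. A quick worked example (MediaInfo reports kilobits per second; the "6.500" uses a European thousands separator):

```python
# Where ~52 Mbps comes from: reading 6,500 kb/s as kiloBYTES per second.
rate_kbps = 6_500                       # MediaInfo figure, kilobits/s

print(f"as kilobits/s:  {rate_kbps / 1_000:.1f} Mb/s")       # 6.5
print(f"as kilobytes/s: {rate_kbps * 8 / 1_000:.1f} Mb/s")   # 52.0
```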
 
FWIW, Apple is the same - DD+ @ 768 kbps - except the ATV4K converts it to 7.1 24/48 PCM (with embedded Atmos metadata) for transmission to an AVR...

The primary reason they do it is so they can mix in system/navigation sounds.
 
Some somewhat positive news: new Apple TV and iOS updates (and watch, HomePod, Mac, etc. etc., except those aren't really relevant).

Some users are reporting resolved issues with Apple Music, such as tracks being randomly skipped, popping between tracks, etc.

This, sadly, doesn't seem to be the case for Atmos, with its gapless playback still having a slight pop between tracks, but it's… marginally better for me. A few tracks in Max Richter's Sleep (my go-to test album for this, since it's been chopped into 204 small tracks) do now flow naturally the way you'd expect. Other times it's still… poppy.

It's a step in the right direction for the service, but I'm assuming the upcoming June WWDC (Worldwide Developers Conference), where Apple usually announces new software/OS updates, will bring some overhauled Apple Music news, especially with the classical-only streaming service Apple bought.
 
A bit rate for the audio track of 6,500 kb/s means 6,500 kilobits/sec, thus 6.5 Mb/s... Am I wrong?

You may be right; I'd need to revisit all the figuring. I have a pet peeve about any expression in 'bits per second' with these (relatively) large file and bandwidth numbers; I just find bytes easier to get my mind around. There is some argument that 'bits' make more sense when talking about bandwidth versus file sizes, but I can't remember how it goes, and I suspect it's rooted in reasons from the past that may not make sense anymore.
 
Whatever technology Neil Young's website uses to adjust audio quality based on your internet connection speed works great; I would love to see it adapted by the big streaming services.

Absolutely, it works very well, and you're right that it would be good to see it used elsewhere. It's possible that Neil paid for its development and wants to keep it for himself, not sure.

Are there any other services that allow Atmos streaming through Apple TV? And if so, are they any better? I know Amazon streams Atmos, but it doesn't seem you can do it through the app (wtf??)

I believe most/all use the same DD+ 768 kbps Atmos audio encoding, i.e. the same for movies too.
 