Dolby Digital Plus? Surround decoders? How does this all work?


key_wiz

I downloaded a trial subscription of CBS All Access to watch the Super Bowl, and the signal I'm hearing is stereo only. When I look it up on my AVR, it says the format is Dolby Digital Plus and the output is 2.1.

So I engaged the “Dolby Surround” mode on my Yamaha AVR and now it’s in 5.1 and sounds like football games usually sound on my system.

My question is: what is this format, and what is the AVR's "decoder" doing with it? Is it doing the same thing it would do with any stereo TV signal? Or is this something different because it's "Dolby Digital Plus"? Is this an encoded surround signal they send over a two-channel stream?
 
I haven't experienced what keywhiz mentioned above, but I'd sure like to know the explanation for it.

It'd be great if the broadcasters were more forthcoming with this sort of info. It definitely wouldn't interest everyone, but it could easily be communicated via a paragraph or bullet-pointed list on their website; then everyone who's interested knows what's what, and those who don't care can just not read it.
 
I saw that surround mode pop up on my brother's system while watching TV (Dolby Digital +). I didn't know what it was either. The system was already outputting 5.1, and when he asked me about it, my reply was "I'm not sure. Maybe a higher-resolution version of Dolby Digital?"

I wasn't too far off. See here.
 
Great find, LuvMyQuad!

So, it's metadata: https://www.dolby.com/us/en/technologies/dolby-metadata.html and less info than Dolby TrueHD (to be expected, I guess): https://www.dolby.com/us/en/technologies/dolby-digital-plus.html.

Dolby: Dig it, Al.
 

Thanks for the link! This is what I suspected (and hoped): that it's an encoded surround signal, and that the "surround" decoder unlocks a true 5.1 mix, as opposed to what that same mode does with a regular stereo signal.

At least that's what it's supposed to be doing, right?
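To put the difference I mean in concrete terms, here's a toy sketch in Python. It's just the textbook L+R / L-R passive matrix, not Dolby's actual steering logic, and purely illustrative: an upmixer can only derive center/surround content from whatever is already baked into the two stereo channels.

```python
import numpy as np

def passive_matrix_upmix(left: np.ndarray, right: np.ndarray):
    """Toy passive-matrix upmix from a plain stereo pair.

    This is the classic L+R / L-R trick, nothing like a real Dolby Surround
    upmixer: it manufactures center and surround feeds from the sum and
    difference of the two channels it is given.
    """
    center = 0.5 * (left + right)     # in-phase (sum) content steers to the center
    surround = 0.5 * (left - right)   # out-of-phase (difference) content steers to the rears
    return center, surround

# A discrete 5.1 decode, by contrast, hands back six channels that were encoded
# separately at the mixing stage; nothing has to be derived from a downmix.
fs = 48000
t = np.arange(fs) / fs
left = np.sin(2 * np.pi * 440 * t)                 # test tone in the left channel
right = np.sin(2 * np.pi * 440 * t + np.pi / 3)    # same tone, partly out of phase, in the right
center, surround = passive_matrix_upmix(left, right)
```

If the "Dolby Surround" mode is only doing something like this to a stereo feed, it can still sound convincing on a football broadcast, but it isn't the same thing as getting the discrete 5.1 mix.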
 
BTW, funny this thread got moved over here. I wouldn't have thought of putting it here, because it's not a disc-based format question but rather one about the software in my AVR.
 
Not me!
 

At first glance the behavior seems odd. The page you linked to is full of marketing speak and practically useless. The fact that the OP's AVR natively 'knows' the signal is 'Dolby Digital Plus', yet the output is '2.1', makes me wonder why the AVR didn't automatically output it as 5.1. Why did he have to force Dolby Surround decoding manually?

The answer might be here:

"Much consumer gear, and even some professional gear, does not recognize Dolby Digital Plus as an encoded format, and will treat DD+ signals over a S/PDIF or similar interface, or stored in a .WAV file or similar container format, as though they were linear PCM data. This is not problematic if the data is passed unchanged, but any gain scaling or sample rate conversion, operations which are aurally harmless to PCM data, will corrupt and destroy a Dolby Digital Plus stream. (Older codecs such as DTS or AC-3 are more likely to be recognized as compressed formats and protected from such processing). "

https://en.wikipedia.org/wiki/Dolby_Digital_Plus#Physical_transport_for_consumer_devices

So, the OP's AVR thinks it's getting PCM (fortunately unaltered*) -- and 2.1 channel PCM at that? -- rather than something that needs to be decoded? It makes me wonder if the OP is hearing 'real' DD+ 5.1, or if turning on Dolby Surround is just upmixing a signal that the AVR thinks is stereo. I'd have to ponder how this could even be tested. I think it's an encouraging sign that the codec is recognized as DD+. What's strange is there's no indication in literature that the 'default' would be 2.1 if the data is perceived as PCM.


*I take the tech note to mean that if the DD+ bitstream was altered in any way before it hits the AVR, the result would be white noise (destroyed) -- just as happens with DD or DTS files if they aren't passed bit-perfectly.
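On the "how could this even be tested" point: if the stream could somehow be captured to a file before it reaches the AVR (which may or may not be practical with CBS All Access), ffprobe would at least show whether the source audio is discrete 5.1 E-AC-3 or only a 2-channel track. A minimal sketch, assuming ffmpeg/ffprobe is installed; 'capture.ts' is a hypothetical capture file, not something the service hands you:

```python
import json
import subprocess

def probe_first_audio_stream(path: str) -> None:
    """Ask ffprobe what the first audio stream in a captured file contains."""
    cmd = [
        "ffprobe", "-v", "error",
        "-select_streams", "a:0",
        "-show_entries", "stream=codec_name,channels,channel_layout",
        "-of", "json",
        path,
    ]
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    stream = json.loads(result.stdout)["streams"][0]
    # 'eac3' = Dolby Digital Plus, 'ac3' = plain Dolby Digital, 'aac'/'pcm_*' = something else
    print("codec:   ", stream.get("codec_name"))
    # 6 channels (5.1) means a discrete surround source; 2 means the AVR really would be upmixing stereo
    print("channels:", stream.get("channels"))
    print("layout:  ", stream.get("channel_layout"))

probe_first_audio_stream("capture.ts")  # hypothetical capture taken before the AVR
```

That still wouldn't prove what the AVR itself does with the bitstream, but it would settle whether a discrete 5.1 source exists upstream at all.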
 

Yes, this is my primary question: am I actually decoding a 5.1 signal, or simply upmixing a stereo one? I hadn't thought about it, but I agree it's odd that the AVR recognizes the signal as "Dolby Digital Plus" yet doesn't automatically engage the decoder, so that I have to do it manually.

But that might just be a quirk of the make/model of AVR?
 