HDMI Audio Only Inputs & Outputs

QuadraphonicQuad

In the case of the 205, the HDMI-audio output's video signal is black so that the clock rates can be set more appropriately for audio.

I smell snake oil. You can set that same clock rate regardless of what colour or content the video signal has, within reason. It's the video bit rate that matters.
 
I keep my BDP-80 going for the same reason you keep your BDP-93. Plays SACD-R & BD-R, which is important to me. I also keep my 970HD going with a firmware flash that lets me play SACD-R, IF I need it. My old Sammie HD840 plays SACD-R as well, but it has DVI instead of HDMI.
The BDP-103 will play BD-R but of course no SACD-R, but I guess you know that anyway. I'm so used to my BDP-80 that I often slip a backup copy of an SACD into the 103 only to have it not play. :cry:

Are you saying the Oppo 203/205 can't play home produced BD-Rs? If so that's outrageous! There is nothing wrong with recording HD video with a camera and writing it to your own BD-Rs, no copyright infringement involved.

SACD-R is the odd one out, because by the very weirdness of the format it is highly likely a BD-R is copyright infringing.

But that doesn't apply to any other format. I expect to be able to record 4K video with a camera and write my own UHD Blu-rays. Who do these format associations think they are to prevent us doing perfectly legal things with the equipment we paid good money for? And then they wonder why piracy happens, not realising that their restrictions drove us to it.
 
I smell snake oil. You can set that same clock rate regardless of what colour or content the video signal has, within reason. It's the video bit rate that matters.
Exactly. The video clock can run at various rates and, via this output which has only black-screen video, the rate is set to benefit the audio data. I mentioned it in my review with a link to the explanation on Oppo's website:
"One of the UDP-205's two HDMI outputs is labeled Audio Only. For that output, the standard video clock is replaced with a dedicated, high-stability, ultra-low-jitter, 148.5MHz master clock, based on an SAW oscillator, that uses a pattern-generator chip to create full black content. This is intended to eliminate the multiplicity of phase-locked loops (PLLs) to provide the clock frequencies for different video formats, and to eliminate the influence of different video formats on the audio, which is embedded in the video blanking interval. "
 
All my Oppos have both a regular HDMI output and an Audio Only output. My amp (a Sony) has 6 HDMI inputs, and one of them is designated 'best' for audio only. The Sony's owner's manual says to use that input for "better quality sound" but doesn't go into any detail as to why that particular HDMI input is "better". ...

Are there advantages, and if so, what are they?

Quality-wise, it should be the very same.
Connection-wise, it can make a BIG difference. Let me explain.

Separating the audio connection onto its own cable/port has no influence on the quality of the signal - it's digital, not analog - but it can change a lot about WHAT the HDMI connection negotiates as a valid format supported by the HDMI chain.

There's a well-known BUG in lots of Intel chipsets that limits the playable audio format to the maximum COMMON to the entire HDMI chain. It may be present in other chipsets too, I don't know.

The problem is, following the logic above, if you have on the same HDMI chain:
- player that can do 8 channel / 192000 Hz / 24 bit
- amplifier that can do 8 channel / 192000 Hz / 24 bit
- screen (tv/projector...) that can do 2 channel / 48000 Hz /16 bit

you end up unable to play anything more than 2 channel / 48000 Hz / 16 bit.

Funny?

Separating the audio and video chains allows max resolution for both audio and video independently: the audio on the HDMI video output is removed or resampled if needed, and the audio on the HDMI audio output can go at full blast regardless of what crappy audio capabilities your screen may have.
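The "lowest common denominator" negotiation described above can be sketched as a toy model. Real EDID capability exchange is per-format and more involved; the elementwise minimum below, with made-up device tuples, just illustrates the buggy behaviour and why a separate audio chain escapes it:

```python
# Toy model of "lowest common denominator" HDMI capability negotiation.
# A buggy source that intersects capabilities across the whole chain
# behaves roughly like the elementwise minimum below.

def common_caps(*devices):
    """(channels, sample_rate, bits) every device in the chain supports."""
    return tuple(min(vals) for vals in zip(*devices))

player = (8, 192_000, 24)
amp    = (8, 192_000, 24)
screen = (2,  48_000, 16)

# One chain with the screen in it: audio collapses to the screen's limit.
print(common_caps(player, amp, screen))   # -> (2, 48000, 16)

# Separate audio chain (player -> amp only): full resolution survives.
print(common_caps(player, amp))           # -> (8, 192000, 24)
```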
 
Are you saying the Oppo 203/205 can't play home produced BD-Rs? If so that's outrageous! There is nothing wrong with recording HD video with a camera and writing it to your own BD-Rs, no copyright infringement involved.

SACD-R is the odd one out, because by the very weirdness of the format it is highly likely a BD-R is copyright infringing.

But that doesn't apply to any other format. I expect to be able to record 4K video with a camera and write my own UHD Blu-rays. Who do these format associations think they are to prevent us doing perfectly legal things with the equipment we paid good money for? And then they wonder why piracy happens, not realising that their restrictions drove us to it.
NO, no, I have no idea what the 203/205 capabilities are. I missed that whole production release, I'm afraid. :cry:
But as for BD-R, I will continue to back up any and all optical discs I have.
 
I currently have a Sammie HD840 with the hack I have been using to play SACD-Rs, but it is coming out of the system (as is a Denon DVD and some other components) to make way for the Oppos.
Yes, at the time the Sammies weren't bad. I bought the 840 and later a 941 from some shop on the internet a good while back. The 941 was an improvement over the 840 and had HDMI & analog instead of DVI. I made a horrible mistake and gave the 941 to my stepson, and I suspect he toted it to the pawnshop, as he never had it again. Sigh. But I had already bought an Oppo, so it didn't hurt too bad.
 
Quality-wise, it should be the very same.
Connection-wise, it can make a BIG difference. Let me explain.

Separating the audio connection onto its own cable/port has no influence on the quality of the signal - it's digital, not analog - but it can change a lot about WHAT the HDMI connection negotiates as a valid format supported by the HDMI chain.

There's a well-known BUG in lots of Intel chipsets that limits the playable audio format to the maximum COMMON to the entire HDMI chain. It may be present in other chipsets too, I don't know.

The problem is, following the logic above, if you have on the same HDMI chain:
- player that can do 8 channel / 192000 Hz / 24 bit
- amplifier that can do 8 channel / 192000 Hz / 24 bit
- screen (tv/projector...) that can do 2 channel / 48000 Hz /16 bit

you end up unable to play anything more than 2 channel / 48000 Hz / 16 bit.

Funny?

Separating the audio and video chains allows max resolution for both audio and video independently: the audio on the HDMI video output is removed or resampled if needed, and the audio on the HDMI audio output can go at full blast regardless of what crappy audio capabilities your screen may have.
In my particular case, I don't send audio to my TV (turned off in the receiver's settings) - there are plenty of better speakers in the room. So, in your example, not having audio sent to the TV should eliminate any limitations the TV would create. Right?
Possibly in the case of my original question - "Why does the manual say HDMI #5 is 'better' quality sound" - the Sony is generating a high resolution video signal (black) if it doesn't detect a video signal from the incoming device in input #5. (Totally guessing here.)
 
In my particular case, I don't send audio to my TV (turned off in the receiver's settings) - there are plenty of better speakers in the room. So, in your example, not having audio sent to the TV should eliminate any limitations the TV would create. Right?
Possibly in the case of my original question - "Why does the manual say HDMI #5 is 'better' quality sound" - the Sony is generating a high resolution video signal (black) if it doesn't detect a video signal from the incoming device in input #5. (Totally guessing here.)

Some TVs cut costs by giving fewer HDMI inputs maximum compatibility. So rather than having ALL inputs on the latest standard - today that would be HDMI 2.1 - they provide, in your case, only one.

So your input #5 has a later version HDMI with the POSSIBILITY of better sound and video based on a higher specification. If you aren't maximizing that input's specs, then it isn't going to matter and any input will be the same quality.
 
To be clear, the multiple HDMI inputs are on the receiver (and the one in question - #5), not my TV.

My system is designed for maximum sound performance. Video is secondary to me - 1080p is plenty (or 720 or even 480 for old stuff). 3D, 4K, curved screens - none of that gives me any motivation to replace my current TV. If we are only watching TV programs I usually don't bother turning on the audio system. Hologram-vision might...
 
To be clear, the multiple HDMI inputs are on the receiver (and the one in question - #5), not my TV.

My system is designed for maximum sound performance. Video is secondary to me - 1080p is plenty (or 720 or even 480 for old stuff). 3D, 4K, curved screens - none of that gives me any motivation to replace my current TV. If we are only watching TV programs I usually don't bother turning on the audio system. Hologram-vision might...

It is probably marketing BS with the one magical HDMI input. I haven't really noticed receivers mixing HDMI versions across their different inputs like TVs regularly do, but maybe I haven't been paying enough attention to HDMI receivers.

Although video is secondary, remember that with HDMI audio and video are linked together, and to get the best audio the video setting needs to be set to maximum.
 
In my particular case, I don't send audio to my TV (turned off in the receiver's settings) - there are plenty of better speakers in the room. So, in your example, not having audio sent to the TV should eliminate any limitations the TV would create. Right?

Yes.
 
There's a well-known BUG in lots of Intel chipsets that limits the playable audio format to the maximum COMMON to the entire HDMI chain. It may be present in other chipsets too, I don't know.


Separating the audio and video chains allows max resolution for both audio and video independently: the audio on the HDMI video output is removed or resampled if needed, and the audio on the HDMI audio output can go at full blast regardless of what crappy audio capabilities your screen may have.
Due to the joys of non-backwards-compatibility between HDCP 1.4 & 2.2, I have to use an audio/video splitter/extractor along with an HDCP converter with my amp/NUC, and since the TV says it can do 5.1, that is all the NUC will allow; if I go through the Oppo, the NUC allows 7.1 audio!
 
(Forgive me if this has already been addressed elsewhere on the forum. - A search here did not turn up any answers for me.)
All my Oppos have both a regular HDMI output and an Audio Only output. My amp (a Sony) has 6 HDMI inputs, and one of them is designated 'best' for audio only. The Sony's owner's manual says to use that input for "better quality sound" but doesn't go into any detail as to why that particular HDMI input is "better". It also says you can use that input the same way as the other 5, meaning it can also process a video signal if you need it for something else instead (another device, player, etc.). I couldn't find anything in an Oppo manual that addressed the advantages (if any) of using the Audio Only output. Neither stated a higher resolution, bit rate, etc. from those outputs and inputs.

Are there advantages, and if so, what are they?

And if there are advantages, is using the audio only ins & outs better (or the same) than using the 7.1 RCA outputs? (I have always thought the signals were the same, just easier with one plug instead of 8 audios and X# video) The Sony amp has 7.1 RCA inputs, so I can go either way - HDMI or separates.

(Unfortunately I am in the process of re-wiring/re-configuring and changing out devices in my whole system at the moment and can't do 'try it both ways and see'-type tests.)
You have to deal with any restrictions on HDMI ports case by case. Usually it's obvious stuff like audio being disabled across the board. Something like a quality/fidelity reduction would be a programmed processing thing (beyond just something to do with the port). Some of the 'copy protection gone wild' schemes do quality reduction as a form of copy protection. The 'gone wild' part is that it also reduces the quality for playback. (Can't copy it if you can't even play it!) This is just one of the things you have to vet when you buy. Look up the manual. Pester customer service and sales. Be quick to return stuff if something was dishonest. If they're altering the signal from one of the HDMI inputs... consider that a style of copy protection and avoid using it. (Or consider it a limited alternate input.)

If the digital data stream passes from point A to B, you're golden. And Oppo is the more 'happiness and light' company that doesn't engage in the 'gone wild' stuff to my understanding. Sony are about as dishonest as they come. They are the root of the 'copy protection gone wild' stuff. They do sell stuff with features disabled intentionally. Careful with them and don't be afraid to be a ruthless customer from hell!

As for the DA converter shootout...
If your HDMI receiver has better DA converters, go digital to the receiver and use those.
If the input device has better DA converters, you'd need to connect the analog outputs to the receiver's direct analog inputs to take advantage of that. Use care with the analog connections to make sure you don't lose that quality to dodgy connections! It's very easy to compromise analog connections and cause more loss there than the 2nd choice converters would have given you!
 
As for the DA converter shootout...
If your HDMI receiver has better DA converters, go digital to the receiver and use those.
If the input device has better DA converters, you'd need to connect the analog outputs to the receiver's direct analog inputs to take advantage of that. Use care with the analog connections to make sure you don't lose that quality to dodgy connections! It's very easy to compromise analog connections and cause more loss there than the 2nd choice converters would have given you!

This is 100% SPOT on.
 
Could you provide more detail as to how analog connections can be compromised?

An obvious example is a poor quality connection. You may have experienced gross cases. Ever have something seem curiously too low in volume (for where you're used to setting the knob) but still passing sound? Then you boost the volume up for a moment and it kind of 'forces' the signal through and the full volume comes back and sticks?

Now, that's an example of something clearly wrong, of course! But the point is the part where the thing was still passing sound, just at lower volume. If you were to analyze the signal when that's going on, you'd see degradation. Loss of highs and transient peaks, and some distortion. The root cause is a poor connection acting like a resistor. There's chemistry going on, with oxidation growing in the gaps of that poor connection.

Now think of the cases that aren't gross but have that kind of thing going on at 10% or 20%.
The system fidelity takes a little hit there. It's insidiously hard to catch unless you get an opportunity to A/B something and hear it, or it gets worse enough to reveal itself. Again, the gross case can be filed under "broken". But the 10% case might be a cheap unbalanced audio cable.
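To put rough numbers on the "connection acting like a resistor" idea: the oxidized contact forms a voltage divider with the downstream input's load impedance. This is a simplified, purely resistive sketch (the loss of highs additionally involves cable capacitance forming an RC low-pass with that series resistance); the 47 kΩ load is an assumed typical line-input value:

```python
# Rough model of a dodgy contact: series resistance forming a voltage
# divider with the input's load impedance. Values are illustrative.
import math

def contact_loss_db(r_contact_ohms: float, r_load_ohms: float) -> float:
    """Level drop (dB) from series contact resistance into a given load."""
    return 20 * math.log10(r_load_ohms / (r_load_ohms + r_contact_ohms))

R_LOAD = 47_000  # assumed typical line-level input impedance

for r_c in (0, 1_000, 10_000, 100_000):
    print(f"{r_c:>7} ohm contact: {contact_loss_db(r_c, R_LOAD):6.2f} dB")
```

A clean contact (near 0 Ω) costs nothing; a badly oxidized one of several kilohms already shaves off a fraction of a dB to a couple of dB, the insidious "10%" case described above.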

I said the word "unbalanced"...
Here's another opportunity for RF noise to get in. Unbalanced connections aren't bad per se. Keep things short. Avoid ground loops. Load the output properly. But a balanced signal cable can run 100' and cross power lines and you'll be none the wiser. Another thing to fuss over in the analog domain, at any rate.
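Why the balanced run shrugs off induced noise: the interference couples equally into both legs (common mode), and the differential receiver subtracts one leg from the other, cancelling the noise while the signal, carried as a difference, survives. A toy numeric sketch with made-up samples:

```python
# Toy model of balanced-line common-mode rejection: noise lands equally
# on both legs; the receiver's subtraction cancels it out.

def differential_receive(hot, cold):
    """Receiver output: hot minus cold, sample by sample."""
    return [h - c for h, c in zip(hot, cold)]

signal = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]
hum    = [0.3] * len(signal)  # interference coupled into both legs

hot  = [s / 2 + n for s, n in zip(signal, hum)]   # +signal/2 + noise
cold = [-s / 2 + n for s, n in zip(signal, hum)]  # -signal/2 + noise

print(differential_receive(hot, cold))  # recovers the signal; hum is gone
```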

Analog is intuitive and immediate. Put power in a wire and it's just right there. There's no cryptic voodoo and computerized sub-components running behind the scenes. It's just intuitive. But get a dodgy connection somewhere and... have fun with that!
Digital, on the other hand, has a couple of interesting charms. All the expense and difficult signal handling sits at the front and back ends, with the AD and DA converters. Those become over-the-top critical! But everything in between is just shuttling ones and zeros around. No 0.0000000000000000000000001-millivolt value to preserve anywhere. Just simple ones and zeros. You still DO have to mind that data to make sure it isn't getting altered or corrupted. But the nature of the encoded signal means you usually get glaring artifacts when the one/zero pipeline breaks. You can do stuff like subtract one file from another. If they have the exact same set of ones and zeros, they will subtract perfectly to zero. That gives you an absolute way to critique an audio file against the master, one that might otherwise have differences below your perception.
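The subtraction check described above is the classic "null test". A minimal sketch with made-up sample values:

```python
# "Null test": two bit-identical sample streams subtract to exact
# silence; any residual pinpoints where (and how much) they differ.

def null_test(a, b):
    """Return the peak absolute difference between two sample lists."""
    return max(abs(x - y) for x, y in zip(a, b))

master = [0, 120, -340, 2047, -2048, 15]
copy_exact = list(master)
copy_lossy = [0, 119, -340, 2046, -2048, 15]  # two samples off by one

print(null_test(master, copy_exact))  # 0 -> perfect null, bit-identical
print(null_test(master, copy_lossy))  # 1 -> the copies are not the same
```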

I tend not to trust digital devices that can't connect to a computer and be controlled as I please. That led me to the computer as media player and FLAC files. But I digress.

That help?
 