96kHz vs 192kHz

QuadraphonicQuad

It is incredibly unlikely that clock jitter will be audible. For example, on a 1 Gbit Ethernet link we would be looking for an RMS clock jitter figure of around sub-1 ps, i.e. a 1 ps spread/smear in the clock edge. Bad clock jitter would create bit errors, but you'd be looking at a jitter figure which is a large % of the clock period. Clock jitter at the ADC or DAC can look like a decrease in the effective signal-to-noise ratio. So if jitter/phase noise is an issue, the equipment is junk.
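A quick sketch of the "jitter looks like reduced signal-to-noise" point, using the standard SNR = -20·log10(2πf·tj) relation for sampling a full-scale sine (the function name and numbers are illustrative, not from any particular datasheet):

```python
import math

def jitter_snr_limit_db(freq_hz: float, rms_jitter_s: float) -> float:
    """Best-case SNR (dB) when sampling a full-scale sine at freq_hz
    with a clock whose RMS jitter is rms_jitter_s, per the standard
    relation SNR = -20*log10(2*pi*f*tj)."""
    return -20.0 * math.log10(2.0 * math.pi * freq_hz * rms_jitter_s)

# A sub-1 ps clock sampling a 20 kHz tone:
print(round(jitter_snr_limit_db(20e3, 1e-12), 1))  # ~138 dB, far beyond 16-bit's ~96 dB
```

Even a whole nanosecond of jitter at 20 kHz still leaves roughly 78 dB of SNR, which is why clock jitter only becomes a real problem in badly broken equipment.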

However, on a high-speed serial link like HDMI or USB 3, a termination impedance mismatch will cause reflections, and these are most often the cause of bit errors. The severity of the errors depends on how bad the mismatch is, the cable length, the data rate, etc. Again, if this occurs then the equipment has been badly designed and should go in the garbage bin.

Bit errors are inevitable and will occur no matter what, but 99.99% of the time you won't notice them, partly due to error correction techniques, but usually because they are insignificant. If they become audible then the equipment is failing or something like disc rot is occurring. I have had it occur when the laser module on an old CD player started to fail, and more annoyingly due to CD/BDA disc problems, neither of which the player could correct as there were too many errors.
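As a toy illustration of why most bit errors never reach your ears, here is a minimal Hamming(7,4) encoder/decoder that corrects any single flipped bit. (CDs actually use the far stronger cross-interleaved Reed-Solomon code, CIRC; this just shows the principle in miniature.)

```python
def hamming74_encode(nibble: int) -> list:
    """Encode 4 data bits into a 7-bit Hamming codeword."""
    d = [(nibble >> i) & 1 for i in range(4)]
    p1 = d[0] ^ d[1] ^ d[3]          # parity over positions 1,3,5,7
    p2 = d[0] ^ d[2] ^ d[3]          # parity over positions 2,3,6,7
    p3 = d[1] ^ d[2] ^ d[3]          # parity over positions 4,5,6,7
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def hamming74_decode(bits) -> int:
    """Decode a 7-bit codeword, correcting up to one flipped bit."""
    b = list(bits)
    s1 = b[0] ^ b[2] ^ b[4] ^ b[6]
    s2 = b[1] ^ b[2] ^ b[5] ^ b[6]
    s3 = b[3] ^ b[4] ^ b[5] ^ b[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the bad bit
    if syndrome:
        b[syndrome - 1] ^= 1         # flip it back
    return b[2] | (b[4] << 1) | (b[5] << 2) | (b[6] << 3)

code = hamming74_encode(0b1011)
code[4] ^= 1                             # flip one bit "on the disc"
print(hamming74_decode(code) == 0b1011)  # True: the error is silently corrected
```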
 

Attachments

  • Textronix-TheMeaningOfTotalJitter.pdf
Talk of jitter makes me wonder if anyone worried about jitter has ever proven they heard it.

Audible jitter is one of the great bogeymen of audiophile lore.

Compare an original CD commercially-manufactured from a glass master, and one made in the last 20 years or so (to avoid possible audio pre-emphasis gremlins). Now rip the same CD to a lossless format that's faithful to the disc (44.1kHz 16bit PCM), and burn it back to CDR at the highest possible burn speed.

While it's burning, consider how much error is introduced by vibrations in the high speed spin, along with the deviations from exactly and evenly spaced marks on the dye layer compared with where they might land if the burn was in real-time (1x), AND the fact that the power of the laser is now spread very, very thinly to produce those marks in the CDR dye.

When you play that high-speed burn copy back, your player will likely be struggling to keep up with the massive amount of errors you've introduced compared with the original.

The hardness you hear is jitter, AKA the plaintive cry of a DAC trying to keep up. No bogeys.
 
Nope, that's error correction artifacts. Jitter refers to the sample clock intervals being inconsistent; that timing error smears the audio. They're small intervals, so... jitter. Not related to a CD wobbling and leading to read errors.

There were still sample rate clocks with audible jitter in the late 1980s and early '90s. You could upgrade the sound of the converters in early ADAT machines with an external clock. Maybe up to the late '90s you could still upgrade early audio interfaces with an external clock. Nowadays this is a moot point even in the lowliest consumer AVR. Maybe if you try running the clock at 192k in a really cheap unit, though. Which is kind of funny.

A CD rattling around getting read errors is a different matter. Let's hear it for the built-in checksum in FLAC files! In the rare-ish case of a bad download, you spot it when the checksum fails and try again. I miss album covers, but there are things to like about this stuff. Like perfect HD surround files to listen to.
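(FLAC stores an MD5 of the decoded audio in its stream header; a simpler stand-in for the "bad download" check is hashing the whole file and comparing it against a published checksum. The helper name and the filename in the comment below are my own, for illustration.)

```python
import hashlib

def file_md5(path: str, chunk_size: int = 1 << 16) -> str:
    """MD5 of a file's raw bytes, read in chunks so large rips
    don't have to fit in memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# e.g. if file_md5("album.flac") != published_checksum: re-download
```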
 

Nope. If the digital representations on the disc are deviated by rapid writing, guess what? It equates to higher inconsistency in interval spacing. So my example describes both error artifacts and jitter.
 
The sample rate clock is the thing prone to jitter, not data retrieval from the disc. The data comes off the storage medium with errors or not. The D/A converter whose output you're listening to is where you would hear the effects of jitter.

At least this is the "jitter" being talked about regarding sample rate clocks. If this is miscommunication with the term being applied to stability of a spinning disc, then here we are.
 
This goes to the basic question of, what creates “better“ in audio? And is the new different thing necessarily better?

In many cases 192K sounds different than 96K, but is it necessarily better? Too many people in audio, I think, find something that sounds different from what they had before and automatically assume it is “better”. I’ve been a violinist for over 50 years and played in live orchestras for about that long, and I can guarantee you that sitting in a live orchestra the bass is nowhere near as big as when people add a subwoofer, but they swear that a subwoofer is “better”. Or that they must have a perfectly flat 20 Hz to 20,000 Hz system and room response, when the real world is nothing like that. I have never heard a recording reproduced on any system truly sound like a live orchestra that you’re sitting in the middle of, or even sitting in “Row 16 center”. Sound reproduction, no matter how good the system, is never the real thing; to me it feels more like something an artist painted that is kind of like the real thing, so that may be about the best you can do. Who doesn’t like a beautiful painting?
And of course the other side of that “better” issue is the status of having “better” gear. Don’t even get me started with that. I DIY power cords and interconnects that are light years beyond some of the expensive stuff you can get. If you have a little imagination and don’t mind rolling your sleeves up, you can have a great result for relatively very little.
Bottom line: if you like it, great. If you can, be happy with that. But if it isn’t quite the real thing, there’s always something that could sound better, and you could be in a continual loop of dissatisfaction with the sound, tilting at audio-demon windmills trying to find the ultimate sound, which does not exist in reproduced audio once you’ve heard live music or even played it. If you can resist being a junkie audiophile needing the next better “hit”, be happy with what you have.
You might even try the “audio think method“ and imagine your system is the greatest thing ever and could never be better than that. Yeah, right. Ha ha

Truth be told, a musician in an orchestra rarely gets caught up in the music, and is mostly thinking about how to play better. Sounds familiar, doesn’t it?


Check out the music starting at about 18:00.

Dragnet The LSD Story Blueboy (Episode 1)
 
The sample rate clock is the thing prone to jitter, not data retrieval from the disc. The data comes off the storage medium with errors or not. The D/A converter whose output you're listening to is where you would hear the effects of jitter.

At least this is the "jitter" being talked about regarding sample rate clocks. If this is miscommunication with the term being applied to stability of a spinning disc, then here we are.
Yes, it is this sample rate clock jitter which is being referred to, and there is a wide range of excellent techniques/circuits to minimise jitter in clocks/clock recovery. Clock jitter is inaudible.

Data on a CD is Eight-to-Fourteen Modulated, so every 8 bits of data results in 17 bits on the disc (DVD & SACD use EFM+). The clock is recovered from the data stream and used, then the error correction is applied (both block and word error correction methods are used). Using EFM, every pit and land is guaranteed to be at least 3 clocks long (and at most 11), which eases the optical system requirements.
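A quick sketch of that run-length constraint: viewed as the pattern of pits and lands on the disc, every run must span between 3 and 11 channel clocks. The function names here are mine, for illustration only:

```python
def run_lengths(channel_bits: str):
    """Lengths of consecutive identical symbols in a pit/land
    pattern, e.g. '111000011100000000000' -> [3, 4, 3, 11]."""
    runs, count = [], 1
    for prev, cur in zip(channel_bits, channel_bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(count)
            count = 1
    runs.append(count)
    return runs

def satisfies_efm(channel_bits: str) -> bool:
    """EFM constraint: every pit and land is 3 to 11 channel clocks long."""
    return all(3 <= r <= 11 for r in run_lengths(channel_bits))

print(satisfies_efm("111000011100000000000"))  # True: runs of 3, 4, 3, 11
print(satisfies_efm("110011"))                 # False: runs of 2 violate the minimum
```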

There will be jitter seen in an eye diagram when looking at the data; this is due to variations in the reading of the pits & lands. Again, with a good design of both the mechanical and electronic elements this will be small. In writeable media, high-speed write errors are often caused by the laser not being on long enough to create the change in the dye which needs to be seen when reading back.

Read/Write errors due to duff/badly designed equipment are another thing.
 
Compare an original CD commercially-manufactured from a glass master, and one made in the last 20 years or so (to avoid possible audio pre-emphasis gremlins). Now rip the same CD to a lossless format that's faithful to the disc (44.1kHz 16bit PCM), and burn it back to CDR at the highest possible burn speed.

While it's burning, consider how much error is introduced by vibrations in the high speed spin, along with the deviations from exactly and evenly spaced marks on the dye layer compared with where they might land if the burn was in real-time (1x), AND the fact that the power of the laser is now spread very, very thinly to produce those marks in the CDR dye.

When you play that high-speed burn copy back, your player will likely be struggling to keep up with the massive amount of errors you've introduced compared with the original.

The hardness you hear is jitter, AKA the plaintive cry of a DAC trying to keep up. No bogeys.

Aside from the fact that this is incorrect... even if it were true that 'vibration' error correction = jitter (it's not), demonstrate that you actually hear a difference first.
 
At least this is the "jitter" being talked about regarding sample rate clocks. If this is miscommunication with the term being applied to stability of a spinning disc, then here we are.

In the days before EAC, CDRWin used the word "jitter" to describe a CD reader doing...uhhh...something bad. I think it referred to the laser stopping to let the computer catch up (?) and then not starting again precisely where it left off.

We're talking about 20+ years ago, so my memory is imperfect at best, but I remember having to tick a box for some kind of jitter avoidance because if I didn't I'd get weird things like briefly-repeated passages.
 

Here, let me quote the most informative text from that typically test-evidence-free AVSF thread from 20 years ago.

It's by Eric Benjamin of Dolby Labs, who actually studied jitter audibility and published on it in 1998:

It's all of that that Ben Gannon and I were worried about, and what we investigated to death in "Theoretical and Audible Effects of Jitter on Digital Audio Quality", AES Preprint 4826. The bottom line turned out to be that it takes a tremendous amount of jitter to become audible - unlike what had mostly been published before.
[...]
Our estimate of the audibility of jitter, based on actual listening tests, is about 3 orders of magnitude less sensitive than most published estimates. But listening tests have only been done twice. Once in about 1974 at the BBC, and once by me. My work and the BBC work agree.

And again, zero to do with the subject of this thread: audibility of 192 vs 96 kHz sample rates
 

Jitter causes noise, distortion and other artifacts. The pros all use highly stable clocks. Sometimes external.

If you are an audiophile you should be concerned about jitter, too...
 

No audibility test evidence on offer, again. No measurements of consumer gear showing levels of jitter that are likely to be audible. Meanwhile, if you like, I can point you to Benjamin & Gannon's actual audibility tests from 1998, which showed that nanosecond levels of jitter are necessary for it to be audible using *pure sine wave* signals in the most sensitive audible range (around 8 kHz).

Think about what that means for audibility in modern gear (where jitter is typically measured in picoseconds -- true for decades), using complex (i.e., music) signals.
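To put rough numbers on that: the worst-case error a fixed timing offset produces on a full-scale sine is bounded by the signal's slope, giving a relative error of about 20·log10(2πf·Δt) dB. This is a back-of-the-envelope model, not Benjamin & Gannon's exact metric:

```python
import math

def jitter_error_db(freq_hz: float, jitter_s: float) -> float:
    """Worst-case jitter-induced error of a full-scale sine,
    relative to the signal, in dB: 20*log10(2*pi*f*tj)."""
    return 20.0 * math.log10(2.0 * math.pi * freq_hz * jitter_s)

# 8 kHz sine, the most jitter-sensitive kind of test signal:
print(round(jitter_error_db(8e3, 10e-9)))   # ns-scale jitter: error around -66 dB
print(round(jitter_error_db(8e3, 100e-12))) # ps-scale modern gear: around -106 dB
```

Picosecond-class gear puts the worst-case error some 40 dB below even the nanosecond levels that were needed for audibility with pure tones.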

tl;dr: like so many 'audiophile' OCD-like fixations, jitter is not something for rational listeners at home to worry about in 2021, if it ever was.
 
I bow to the wisdom of Eric Benjamin. Thanks for sharing.
 