This is too funny. Believe me, I'm digital's biggest fan. I'm just trying to make clear that digital can never be more accurate than the original analog signal.
So now we're talking about 'accuracy'? That word's new to the thread.
Accuracy is measurable. No analog-to-analog or analog-to-digital copy can be perfectly accurate (much less 'more accurate'). A digital-to-digital copy can be perfectly accurate.
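Here's a minimal sketch of what I mean by "perfectly accurate" for a digital-to-digital copy. The file is just stand-in random bytes (a hypothetical example, not a real recording); the point is that the copy can be verified bit-for-bit, which no analog duplication step can offer:

```python
# Minimal sketch: a digital-to-digital copy can be verified bit-for-bit.
import hashlib
import os
import shutil
import tempfile

def sha256_of(path):
    """Hash a file's exact byte content."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Create a stand-in "master" file (any bytes will do for the demonstration).
workdir = tempfile.mkdtemp()
master = os.path.join(workdir, "master.wav")
copy = os.path.join(workdir, "copy.wav")
with open(master, "wb") as f:
    f.write(os.urandom(1_000_000))

shutil.copyfile(master, copy)

# Identical hashes mean the copy is an exact duplicate of the source.
assert sha256_of(master) == sha256_of(copy)
print("digital copy is bit-identical to the source")
```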
For, say, a recording of a string quartet, the 'original' acoustic signal is sound waves in air. That's not an analog of anything (though by convention we call continuous phenomena 'analog' and sampled phenomena 'digital'). The first analog of those sound waves is created at the microphone stage: an electrical analog. Then an analog to that is made when voltages are printed to magnetic tape. Then another as tape is transcribed to vinyl. There are accuracy losses from step to step here, with each step serving as the 'original signal' for the next, and the losses accumulating relative to the original sound waves. Those losses can be reduced if digital stages are introduced after the microphone stage. But you get another whack against accuracy at the loudspeaker stage. Such is life.
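If it helps, here's a toy model of that chain. The stage count and noise figures are arbitrary assumptions, not measurements; the point is only that every analog transfer adds some noise and gain error, while a digital-to-digital transfer adds nothing:

```python
# Toy model of cumulative generation loss. Each analog transfer adds a bit of
# noise and a small random gain error; each digital-to-digital transfer copies
# the samples exactly. Numbers below are illustrative, not measured.
import numpy as np

rng = np.random.default_rng(0)
fs = 48_000
t = np.arange(fs) / fs
source = np.sin(2 * np.pi * 440 * t)  # stand-in for the acoustic 'original'

def analog_stage(signal, noise_floor_db=-70.0, gain_error=0.01):
    """One analog transfer (mic, tape, cutting lathe, ...): adds noise and gain error."""
    noise = rng.normal(0, 10 ** (noise_floor_db / 20), signal.shape)
    return signal * (1 + rng.normal(0, gain_error)) + noise

def digital_stage(signal):
    """One digital-to-digital transfer: the samples are copied exactly."""
    return signal.copy()

def error_db(reference, candidate):
    """RMS error relative to the reference, in dB (more negative = more accurate)."""
    return 20 * np.log10(np.sqrt(np.mean((candidate - reference) ** 2)))

# All-analog chain: mic -> tape -> cutting -> pressing
analog = source
for _ in range(4):
    analog = analog_stage(analog)

# Hybrid chain: one analog capture at the mic, then digital copies thereafter
hybrid = analog_stage(source)
for _ in range(3):
    hybrid = digital_stage(hybrid)

print(f"all-analog chain error: {error_db(source, analog):6.1f} dB")
print(f"hybrid chain error:     {error_db(source, hybrid):6.1f} dB")
```

Run it a few times with different seeds and the pattern holds: the errors never go away, but the chain with fewer analog transfers stays closer to the source.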
So why, again, is digital being singled out for 'inaccuracy' here?