Hi John:

I think Don was saying the same thing I have been told by more than one learned EE specializing in 
digital recording and playback, plus what Mike Gray was saying (as I interpreted it) -- you canNOT 
fix jitter that is introduced in the A-D stage. You CAN strip out jitter in the D-A stage by 
re-clocking the data stream (which is what Benchmark and others do). Preferred USB interfaces are 
asynchronous, which (I think) means they do not rely on the unreliable computer clock but instead 
extract the clock information from the incoming data stream and re-clock it. I think this involves 
caching a certain amount of data, then applying a new internal clock and sending the result to the 
DAC, which is locked to that internal clock, thus removing jitter from the incoming data stream.
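For what it's worth, here's a toy sketch of that buffer-and-re-clock idea (my own illustration, not any vendor's actual design -- the function names and numbers are made up): samples arrive with irregular timing, get parked in a FIFO buffer, and are read back out against a clean local clock. The data values are untouched; only the timing is replaced.

```python
import random
from collections import deque

def jittered_arrivals(samples, period=1.0, jitter=0.05):
    """Yield (arrival_time, sample) pairs with irregular timing."""
    t = 0.0
    for s in samples:
        t += period + random.uniform(-jitter, jitter)
        yield t, s

def reclock(samples, period=1.0):
    fifo = deque()
    # The buffer absorbs the arrival-time variation entirely.
    for _, s in jittered_arrivals(samples, period):
        fifo.append(s)
    # Read out against the clean internal clock: sample n plays at n*period.
    return [(n * period, fifo.popleft()) for n in range(len(samples))]

data = [10, 20, 30, 40]
out = reclock(data)
# Values survive intact; output timing is exactly uniform.
assert [s for _, s in out] == data
assert [t for t, _ in out] == [0.0, 1.0, 2.0, 3.0]
```

The key point the sketch shows: transmission jitter never touches the sample *values*, so it can be discarded at playback by re-timing. That's why this trick works for D-A but not A-D.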

Konrad, please specify which part of Mike Gray's posting you are calling a myth. Are you saying that 
jitter can't be introduced in the A-D stage?
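And here's a quick numerical sketch (mine, not Mike's or Don's -- the signal and jitter figures are arbitrary) of why A-D jitter would be baked in: if the converter samples at the wrong instant, it stores the value from that wrong instant under the label of the right instant, and nothing downstream can tell which samples were early or late.

```python
import math
import random

fs = 48_000.0     # sample rate, Hz (assumed for illustration)
f = 1_000.0       # test-tone frequency, Hz
jitter_s = 1e-6   # +/- 1 microsecond of sample-clock jitter
random.seed(0)

ideal, jittered = [], []
for n in range(1000):
    t = n / fs
    ideal.append(math.sin(2 * math.pi * f * t))
    # The jittered converter samples at a slightly wrong time...
    t_wrong = t + random.uniform(-jitter_s, jitter_s)
    # ...but files the value as if it were taken at time t.
    jittered.append(math.sin(2 * math.pi * f * t_wrong))

worst = max(abs(a - b) for a, b in zip(ideal, jittered))
# The error is bounded by the signal's slew rate times the timing error
# (2*pi*f*jitter_s), and it is indistinguishable from real signal.
assert 0 < worst <= 2 * math.pi * f * jitter_s + 1e-12
```

The recorded stream is a perfectly valid sequence of numbers -- just the wrong ones -- which is exactly why re-clocking can't repair it the way it repairs D-A timing.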

By the way, this is a really interesting discussion. I'm waiting for Goran and perhaps others to 
weigh in with some error-correction!

-- Tom Fine

----- Original Message ----- 
From: "John Haley" <[log in to unmask]>
To: <[log in to unmask]>
Sent: Monday, February 11, 2013 12:13 PM
Subject: Re: [ARSCLIST] Jitter (was Re: [ARSCLIST] Audibility of 44/16 ?)

> Don,
> One can see lots of devices that claim to "correct" jitter problems and you
> see them use verbs like "reclocking" to correct jitter errors.  Are you
> saying this is a myth, and that all that can be done is for a device to
> prevent jitter at the outset, not fix it?  But if devices can fix an
> incoming stream of data to correct jitter errors, it would seem that a
> recording containing a data stream with jitter errors could just as easily
> be corrected.  A data stream is a data stream.  I don't have any answers,
> just questions ...
> Thanks,
> John Haley
> On Mon, Feb 11, 2013 at 10:11 AM, Don Cox <[log in to unmask]> wrote:
>> On 11/02/2013, Tom Fine wrote:
>> > Can jitter be introduced on the A-D stage? As I understood Mike Gray's
>> > posting, he was saying jitter can be induced from the get-go, in the
>> > A-D process. Konrad, do you know that to be untrue?
>> >
>> A->D involves sampling the analog voltage at regular intervals. If the
>> intervals are not exactly regular (i.e., jitter), the digital record
>> cannot be accurate.
>> I can't think of any way of correcting such recordings.
>> > Also, I've been told by one of Sony's senior EE guys that it can be
>> > baked into a glass master. As I understand it, jitter can be induced
>> > any time the bits are clock-aligned for whatever reason. I'm not sure
>> > why that occurs in making a glass master, but a lot of research was
>> > done on this back in the 80s and 90s, at least that's my understanding
>> > from what the Sony guy told me.
>> >
>> > So, I think (but may have learned this wrong, I'm not an EE) that bits
>> > is bits only when the bits are kept absolutely intact and the
>> > timing-transmission is rock solid.
>> >
>> An example of where jitter is not important is downloading a file from
>> the web. The timing of arrival of the bits (which will be in packets) is
>> not important, so long as they end up in the correct order in the file.
>> Regards
>> --
>> Don Cox
>> [log in to unmask]