One can see lots of devices that claim to "correct" jitter problems, and
their makers use verbs like "reclocking" to describe fixing jitter errors.
Are you saying this is a myth, and that all a device can do is prevent
jitter at the outset, not fix it? But if a device can correct jitter errors
in an incoming stream of data, it would seem that a recording containing a
data stream with jitter errors could just as easily be corrected. A data
stream is a data stream. I don't have any answers, just questions ...
On Mon, Feb 11, 2013 at 10:11 AM, Don Cox <[log in to unmask]> wrote:
> On 11/02/2013, Tom Fine wrote:
> > Can jitter be introduced on the A-D stage? As I understood Mike Gray's
> > posting, he was saying jitter can be induced from the get-go, in the
> > A-D process. Konrad, do you know that to be untrue?
> A->D involves sampling the analog voltage at regular intervals. If the
> intervals are not exactly regular (i.e. jitter), the digital record cannot
> accurately represent the original signal.
> I can't think of any way of correcting such recordings.
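A minimal sketch (not from the thread, just an illustration) of Don's point: if the A-D converter's sample clock jitters, the stored sample values are the true signal values at the *wrong* times. The error is baked into the recorded numbers themselves, so nothing downstream of the converter can undo it. The sample rate, tone frequency, and jitter figure below are made-up illustrative values.

```python
# Simulate sampling-clock jitter at the A-D stage.
import math
import random

random.seed(0)
FS = 48_000.0        # nominal sample rate, Hz (illustrative)
F = 1_000.0          # test tone, Hz (illustrative)
JITTER_RMS = 5e-9    # 5 ns RMS clock jitter (illustrative)

def tone(t):
    """The analog signal being sampled: a pure sine wave."""
    return math.sin(2 * math.pi * F * t)

ideal = []
jittered = []
for n in range(64):
    t_nominal = n / FS                                  # where the sample should fall
    t_actual = t_nominal + random.gauss(0.0, JITTER_RMS)  # where it actually falls
    ideal.append(tone(t_nominal))
    jittered.append(tone(t_actual))  # what the jittery A-D stores

# The error already lives in the stored samples; no reclocking of the
# data stream afterwards can tell these values apart from a clean capture
# of a slightly different waveform.
max_err = max(abs(a - b) for a, b in zip(ideal, jittered))
print(f"max sample error: {max_err:.2e}")
```

This is why transmission jitter (bits arriving at irregular times, fixable by buffering and reclocking) is a different animal from conversion jitter (wrong values recorded in the first place).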
> > Also, I've been told by one of Sony's senior EE guys that it can be
> > baked into a glass master. As I understand it, jitter can be induced
> > any time the bits are clock-aligned for whatever reason. I'm not sure
> > why that occurs in making a glass master, but a lot of research was
> > done on this back in the 80s and 90s, at least that's my understanding
> > from what the Sony guy told me.
> > So, I think (but may have learned this wrong, I'm not an EE) that bits
> > is bits only when the bits are kept absolutely intact and the
> > timing-transmission is rock solid.
> An example of where jitter is not important is downloading a file from
> the web. The timing of arrival of the bits (which will be in packets) is
> not important, so long as they end up in the correct order in the file.
> Don Cox
> [log in to unmask]
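Don's file-download example above can be sketched in a few lines (an illustration I've added, not from the thread): packets carry sequence numbers, and the receiver reassembles by number, so arbitrarily irregular arrival timing leaves the file bit-for-bit intact.

```python
# Why arrival timing doesn't matter for a file download:
# the receiver orders packets by sequence number, not by arrival time.
import random

data = b"bits is bits, as long as order is restored"

# Sender: split the file into numbered packets of 8 bytes.
packets = [(i, data[i:i + 8]) for i in range(0, len(data), 8)]

# Network: deliver the packets in a scrambled order, at whatever times.
random.seed(1)
random.shuffle(packets)

# Receiver: reassemble by sequence number, ignoring when each arrived.
received = b"".join(chunk for _, chunk in sorted(packets))
assert received == data
print("file reassembled intact")
```

The same trick does not help an audio recording made with a jittery A-D clock, because there the damage is in the sample values, not in their ordering.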