On 11/02/2013, Tom Fine wrote:

> Can jitter be introduced on the A-D stage? As I understood Mike Gray's
> posting, he was saying jitter can be induced from the get-go, in the
> A-D process. Konrad, do you know that to be untrue?
> 
A->D involves sampling the analog voltage at regular intervals. If the
intervals are not exactly regular (i.e. if there is jitter), the digital
record cannot be accurate.

I can't think of any way of correcting such recordings.
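To make that concrete, here is a small sketch in Python (my own illustration,
not anything from a particular converter): the same sine wave is "sampled"
once with a perfectly regular clock and once with sample instants perturbed
by a few nanoseconds of random jitter. The stored numbers differ, and since
the file does not record which instants were actually used, the error cannot
be undone later.

    import numpy as np

    fs = 48_000          # nominal sample rate, Hz
    f = 1_000            # test tone, Hz
    n = np.arange(4800)  # 0.1 s worth of samples

    ideal_times = n / fs                  # perfectly regular sample instants
    jitter_rms = 5e-9                     # assumed 5 ns RMS clock error
    jittered_times = ideal_times + np.random.normal(0, jitter_rms, n.size)

    clean = np.sin(2 * np.pi * f * ideal_times)        # what should be recorded
    jittered = np.sin(2 * np.pi * f * jittered_times)  # what a jittery clock records

    # The two records differ sample by sample; the error grows with
    # signal frequency and with the amount of jitter.
    error = jittered - clean
    print(f"peak error: {np.max(np.abs(error)):.2e}")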

> Also, I've been told by one of Sony's senior EE guys that it can be
> baked into a glass master. As I understand it, jitter can be induced
> any time the bits are clock-aligned for whatever reason. I'm not sure
> why that occurs in making a glass master, but a lot of research was
> done on this back in the 80s and 90s, at least that's my understanding
> from what the Sony guy told me.
> 
> So, I think (but may have learned this wrong, I'm not an EE) that bits
> is bits only when the bits are kept absolutely intact and the
> timing-transmission is rock solid.
> 
An example of where jitter is not important is downloading a file from
the web. The timing of arrival of the bits (which will be in packets) is
not important, so long as they end up in the correct order in the file.
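A throwaway sketch of that point (again Python; the explicit sequence numbers
here just stand in for what the transport protocol does for you): the file
comes out identical no matter in what order, or at what times, the packets
arrive.

    import random

    packets = [(i, bytes([i])) for i in range(10)]  # (sequence number, payload)
    random.shuffle(packets)                         # arrival order/timing varies

    received = dict(packets)
    file_bytes = b"".join(received[i] for i in range(10))  # reassemble by sequence

    assert file_bytes == bytes(range(10))  # identical regardless of timing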



Regards
-- 
Don Cox
[log in to unmask]