Oh man this is so simple.
The clock has to be stable and the pulse train evenly spaced, whether it's magnetic-domain playback or digital word/pulsed/filtered.
Analog, digital, whatever.
Low-frequency-rate analog jitter, i.e. flutter, causes blurring and distortion in a fairly direct manner, shifting the signal in the frequency domain. Average HF amplitude response is slightly higher, obviously more coherence. As transients smear, everything sounds duller.
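A toy sketch (mine, not from this thread) of flutter as a slow, periodic time-base error: a fluttered test tone is frequency-modulated, so its spectrum grows sidebands around the carrier at multiples of the flutter rate — which is the "shifting in the frequency domain" described above. The flutter rate and depth are illustrative assumptions, not measurements of any particular transport:

```python
import numpy as np

fs = 44100               # sample rate, Hz
t = np.arange(fs) / fs   # one second of time stamps
f_tone = 1000.0          # test tone, Hz

# Flutter modeled as a slow sinusoidal time-base error: the tape
# effectively plays back at time t + dt(t). 0.1% peak speed
# deviation at 6 Hz is an assumed, plausible-looking value.
flutter_rate = 6.0
flutter_depth = 0.001
dt = (flutter_depth / (2 * np.pi * flutter_rate)) * np.sin(2 * np.pi * flutter_rate * t)

fluttered = np.sin(2 * np.pi * f_tone * (t + dt))

# Windowed spectrum: the carrier at 1000 Hz now has FM sidebands
# at 1000 +/- 6 Hz, 1000 +/- 12 Hz, etc.
spec = np.abs(np.fft.rfft(fluttered * np.hanning(fs)))
```

With a 1 Hz bin spacing, `spec[1006]` (the first upper sideband) stands well above the noise floor while staying below the carrier at `spec[1000]` — energy has been smeared away from the original tone, exactly the blurring effect described.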
Digital jitter creates crazy situations where adjacent positive-going samples can actually develop negative hitches and overshoots, because the number going in is right, but early or late... so the output waveform is misshapen.
This is simply a loss of synchronicity with the original waveform. Picture a series of 44.1k samples for ten perfect seconds. Then one of them arrives next week, followed by ten more an hour later, then back to 44.1k... That will be a very misshapen output wave, with a stepped response that the filter won't smooth. An extreme case, but the mechanism is identical.
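The "right value, wrong time" point can be sketched numerically (my illustration, with an exaggerated jitter figure chosen to make the effect obvious — real converter clocks are far better than this): each sample's value is computed correctly, but at a slightly wrong instant, so the stored waveform differs from the true one.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 44100
n = fs // 10                 # 100 ms of samples
f_tone = 10000.0             # HF content shows timing damage most

ideal_times = np.arange(n) / fs
# Random per-sample clock error; 1 microsecond RMS is deliberately
# huge (an assumption for visibility, not a real DAC spec).
jitter = rng.normal(0.0, 1e-6, n)

clean = np.sin(2 * np.pi * f_tone * ideal_times)
# Each number is "right" for the instant it was taken, but that
# instant is early or late -- the sample series is misshapen:
jittered = np.sin(2 * np.pi * f_tone * (ideal_times + jitter))

err_rms = np.sqrt(np.mean((jittered - clean) ** 2))
# Error scales with signal slope: roughly 2*pi*f_tone*sigma_t,
# a few percent of full scale at these assumed numbers.
```

The error grows with frequency, which fits the perceptual description above: jitter hits high-frequency detail and transients hardest.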
Early ladder DACs also had long settling times and capacitive voltage retention that could create further errors; they were, after all, sample-and-holds. So the errors compounded.
All of this manifests as graininess and image collapse. So, by the way, do fast flutter and scrape flutter.
If the error correction kicks in on a CD, it will likely get the data back. If it's copied to a hard drive as data, the drive will either reject the disc as unreadable or the data will be stored perfectly.
Asymmetrical data delivery, as contrasted with waveform reconstruction, is immaterial, or every piece of downloaded software would crash. Bits is bits, whether it takes a microsecond to transmit and store or a week.
But to reconstruct the wave, the buffered data has to come out perfectly stepped in time. Which these days is easy. An iPhone 4S playing a .wav sounds amazing: Wolfson DAC, low jitter.
That's all it takes. Better analog circuits? Sure. But timing is more important.
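A minimal sketch of that buffer-then-reclock idea (a toy model of my own, not any real device's API or firmware): samples may arrive with wildly uneven timing, but once buffered, they are read out on a stable local clock, so the output spacing depends only on that clock, not on the delivery schedule.

```python
from collections import deque

def reclock(arrivals, fs_out=44100, start=0.0):
    """Toy asynchronous-reclocking front end.

    arrivals: list of (arrival_time, sample_value) pairs, in any
    timing.  Returns (output_time, sample_value) pairs stepped on
    a perfectly uniform output clock.
    """
    # Buffer the values in arrival order; delivery times are discarded.
    buf = deque(value for _, value in sorted(arrivals))
    period = 1.0 / fs_out
    return [(start + i * period, buf.popleft()) for i in range(len(buf))]

# Samples delivered with absurdly uneven timing (the "one arrives
# next week" thought experiment, scaled down)...
bursty = [(0.000, 0.1), (0.900, 0.2), (0.901, 0.3), (5.000, 0.4)]
out = reclock(bursty)
# ...come out perfectly evenly spaced: the data, not its delivery
# schedule, defines the reconstructed wave.
```

The real-world constraint this toy model ignores is buffer depth: a playback device can only reclock what has already arrived, so it trades latency for timing stability.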
What we do is similar, but with tape. Using a 150 kHz click from the original recorder, we recover something much closer to the original wave shape before the mechanical jitter. Again, different spectra but basically identical: the transport distorts the wave shape on recording and further on playback. The perceptual effects are similar when the time-base is restored.
Please pardon the misspellings and occasional insane word substitutions; I'm on an iPhone.
On Feb 11, 2013, at 12:33 PM, Tom Fine <[log in to unmask]> wrote:
> Hi John:
>
> I think Don was saying the same thing I have been told by more than one learned EE specializing in digital recording and playback, plus what Mike Gray was saying (as I interpreted it) -- you can fix jitter that is introduced in the A-D stage. You CAN strip out jitter from the D-A stage by re-clocking the data stream (which is what Benchmark and others do). Preferred USB interfaces are asynchronous, which (I think) means they do not rely on the unreliable computer clock but rather strip out clock information from the incoming data stream and re-clock it. I think this involves cache-ing a certain amount of data, then applying the new internal clock, sending that to the DAC, which is locked to the internal clock, and thus removing jitter from the incoming data stream.
>
> Konrad, please specify what of Mike Gray's posting you are calling myth. Are you saying that jitter can't be introduced in the A-D stage?
>
> By the way, this is a really interesting discussion. I'm waiting for Goran and perhaps others to weigh in with some error-correction!
>
> -- Tom Fine
>
> ----- Original Message ----- From: "John Haley" <[log in to unmask]>
> To: <[log in to unmask]>
> Sent: Monday, February 11, 2013 12:13 PM
> Subject: Re: [ARSCLIST] Jitter (was Re: [ARSCLIST] Audibility of 44/16 ?)
>
>
>> Don,
>>
>> One can see lots of devices that claim to "correct" jitter problems and you
>> see them use verbs like "reclocking" to correct jitter errors. Are you
>> saying this is a myth, and that all that can be done is for a device to
>> prevent jitter at the outset, not fix it? But if devices can fix an
>> incoming stream of data to correct jitter errors, it would seem that a
>> recording containing a data stream with jitter errors could just as easily
>> be corrected. A data stream is a data stream. I don't have any answers,
>> just questions ...
>>
>> Thanks,
>> John Haley
>>
>>
>>
>> On Mon, Feb 11, 2013 at 10:11 AM, Don Cox <[log in to unmask]> wrote:
>>
>>> On 11/02/2013, Tom Fine wrote:
>>>
>>> > Can jitter be introduced on the A-D stage? As I understood Mike Gray's
>>> > posting, he was saying jitter can be induced from the get-go, in the
>>> > A-D process. Konrad, do you know that to be untrue?
>>> >
>>> A->D involves sampling the analog voltage at regular intervals. If the
>>> intervals are not exactly regular (i e jitter), the digital record cannot
>>> be
>>> accurate.
>>>
>>> I can't think of any way of correcting such recordings.
>>>
>>> > Also, I've been told by one of Sony's senior EE guys that it can be
>>> > baked into a glass master. As I understand it, jitter can be induced
>>> > any time the bits are clock-aligned for whatever reason. I'm not sure
>>> > why that occurs in making a glass master, but a lot of research was
>>> > done on this back in the 80s and 90s, at least that's my understanding
>>> > from what the Sony guy told me.
>>> >
>>> > So, I think (but may have learned this wrong, I'm not an EE) that bits
>>> > is bits only when the bits are kept absolutely intact and the
>>> > timing-transmission is rock solid.
>>> >
>>> An example of where jitter is not important is downloading a file from
>>> the web. The timing of arrival of the bits (which will be in packets) is
>>> not important, so long as they end up in the correct order in the file.
>>>
>>>
>>>
>>> Regards
>>> --
>>> Don Cox
>>> [log in to unmask]
>>>