IIRC, the Dolby B reference level was 160 nanowebers/meter. If the tapes
were recorded at that level, and if you have an external decoder and a test
tape at that level (or a known level for which you can correct), then
simply play the test tape and adjust the meter on the Dolby B decoder to
the Dolby mark. Then remove the test tape and play the Dolby B tape
that has no test tone. You should be fairly close to correct decoding.
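The "known level for which you can correct" step is just dB arithmetic
against the 160 nWb/m reference. A minimal sketch (the function name and
the 200 nWb/m example are mine, not from any Dolby spec):

```python
import math

DOLBY_B_REF_NWB_M = 160.0  # Dolby B reference fluxivity, per the text above

def level_offset_db(test_tape_nwb_m: float) -> float:
    """dB offset to apply when the test tape's flux level differs
    from the Dolby B reference level."""
    return 20.0 * math.log10(test_tape_nwb_m / DOLBY_B_REF_NWB_M)

# e.g. a 200 nWb/m test tape reads about +1.94 dB above Dolby reference,
# so the decoder meter should be set that much above the Dolby mark
print(round(level_offset_db(200.0), 2))
```

In other words, set the meter to the Dolby mark plus this offset, then
proceed as described above.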

I recorded thousands of open-reel tapes with a Dolby B encoder. The encoder
was made by Dolby for radio stations on the A301 frame. I used an Ampex
AG350-2. I couldn't afford a Dolby A decoder for home use, and I wanted to
copy many Dolby A tapes for myself.

I have seen thousands of Dolby B 10.5-inch 2-track reel-to-reel tapes. They
used Dolby B because the Dolby A encoder was very expensive.

BTW Aaron, I have a couple of Dolby B decoders, including the Advent. It
and the better of the two Teac Dolby Bs were the best of the home units.
You are welcome to borrow one.

Kevin Mostyn

In article <[log in to unmask]>, you wrote:
>Has anybody ever written the Dolby encode/decode algorithm in Windows-based
>software?
>
>The question arose when a friend of mine told me about some orphaned Dolby
>B-encoded tapes. These apparently are reel-to-reel and without calibration
>tones. Granted, without the calibration tone, decoding of the tapes would be
>pure guesswork. However, a computer-based program would at least allow
>non-real-time experimentation so that a plausible reference level could be
>established.
>
>Aaron Z Snyder
>

--


-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
Kevin P. Mostyn