"Brian-Gaff" wrote in message
news

At the very start of digital, some recordings looked fantastic on specs but
sounded crap, seeming like they had some kind of noise gate on them. Then
came dither, which made recordings sound right again. I'll just leave this
hanging there....
Brian
Yes. Dither is required when converting from
32, 24 or 20 bits to 16 bits. It is done in the
mastering process (not mixdown), and without it,
mastering software simply truncates every signal
to which wordlength reduction is applied.
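To make that concrete, here is a minimal Python
sketch (assuming NumPy; the function name and
framing are mine, not any mastering package's API)
of 24-bit to 16-bit wordlength reduction, with
plain truncation versus TPDF dither added before
the quantiser:

import numpy as np

def reduce_to_16_bits(samples_24bit, dither=True):
    # One 16-bit LSB expressed in 24-bit units.
    lsb = 2 ** 8
    x = samples_24bit.astype(np.float64)
    if dither:
        # TPDF dither: the sum of two independent uniform
        # variables, each +/-0.5 LSB, giving a triangular
        # PDF of +/-1 LSB at the 16-bit level.
        x += (np.random.uniform(-0.5, 0.5, x.shape)
              + np.random.uniform(-0.5, 0.5, x.shape)) * lsb
        # Round to the nearest 16-bit step after dithering.
        out = np.round(x / lsb)
    else:
        # Plain truncation: the bottom 8 bits are simply
        # discarded, so the error is correlated with the
        # signal (the "noise gate" effect on quiet material).
        out = np.floor(x / lsb)
    return np.clip(out, -32768, 32767).astype(np.int16)

Without the dither branch, low-level signals step
abruptly in and out of silence; with it, they ride
on a low, constant noise floor instead.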
My first experiences of digital recording
were back in the mid/late 1970s when
I heard some recordings made on the
Denon DN34. The machine's predecessor
had apparently been demonstrated as early
as 1972. I was very impressed!
Shortly after that, Decca, the record
company where I worked in the UK,
designed and built their own digital
recording/editing system for in-house use.
http://www.kolumbus.fi/iain.churches...alRecorder.jpg
Details from a Decca promo leaflet:
** The top rack housed an 18-bit D/A converter
for SPDIF/AES output. The second rack housed a
20-bit A/D which converted the studio line-level
signal to a digital data stream in the Decca
format, at a sample rate of 48 kHz. A signal
processing unit (Eurocard frame) takes the digital
data stream and converts it into a signal that a
modified video transport can record (adding the
various video sync pulses, among other things).
It also generates timecode and error-correction
data. Working simultaneously in record and
playback, it also displays the off-tape record
level via a peak-hold PPM, reads the off-tape
timecode, and monitors the overall quality of the
recording (by counting lost samples). It also has
a limited amount of system self-diagnostics.**
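The lost-sample count is just a tally of off-tape
blocks that fail their error check. Purely as an
illustration (the framing, names and CRC choice
here are hypothetical, nothing to do with Decca's
actual format):

import zlib

def count_lost_blocks(blocks):
    # 'blocks' is an iterable of (payload_bytes, stored_crc)
    # pairs; a block whose payload no longer matches its
    # stored CRC32 is counted as lost, i.e. uncorrectable.
    lost = 0
    for payload, stored_crc in blocks:
        if zlib.crc32(payload) != stored_crc:
            lost += 1
    return lost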
Iain