Jim Lesurf, posted to uk.rec.audio, December 1st 09, 03:51 PM
Subject: Low capacitance audio coax

In article , Dave Plowman (News)
wrote:


>> Err.. no. The DIN system worked with high output impedances and low
>> input impedances. There was, of course, significant signal attenuation
>> in so doing with consequent S/N ratio implications, but it did reduce
>> the HF loss due to cable capacitance.


> You've got me confused there. Thought low out high in was the rule.


Yes. That's what has been generally adopted for tasks like domestic audio
where the idea is that it is the voltage pattern at the input terminals of
the 'load' (destination) that defines the waveform.

The idea behind the DIN 'electrical' standard was the obverse of the above.
The approach was to define the signal waveform in terms of the *current*
pattern entering the destination. Thus it reversed the approach people are
familiar with and had low input impedances combined with high source
impedances.

For short cables in both cases the cable capacitance combines with the
source and load impedances in parallel. So the outcome is similar in terms
of the primary RC low-pass effect. But in DIN 'electrical' terms you can
think of this as being a consequence of the low load resistance meaning you
don't need to significantly change the voltage on the cable. In effect, the
load resistance is so small that you aren't having to change the charge on
the cable capacitance very much, so most of the current the source injects
ends up going through the load. :-)

However that meant that all you'd really done was turn around the
requirement, so you now needed a low load resistance rather than a low
source impedance if you wanted to maximise the bandwidth once the
cable capacitance was taken into account.
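
To put some rough numbers on that (the impedances and the capacitance per
metre below are figures I've simply assumed for illustration, not anything
from the DIN paperwork), the first-order corner works out along these lines
in Python:

import math

def corner_freq_hz(r_source_ohm, r_load_ohm, c_cable_f):
    """First-order -3 dB corner set by the cable capacitance loaded by
    the source and load resistances in parallel."""
    r_parallel = (r_source_ohm * r_load_ohm) / (r_source_ohm + r_load_ohm)
    return 1.0 / (2.0 * math.pi * r_parallel * c_cable_f)

# 5 m of cable at an assumed 100 pF per metre
c_cable = 5 * 100e-12

# 'Voltage transfer' habit: low source, high load (illustrative values)
print(corner_freq_hz(1e3, 47e3, c_cable))    # ~325 kHz, set mainly by the source

# DIN 'electrical' approach: high source, low load (illustrative values)
print(corner_freq_hz(100e3, 1e3, c_cable))   # ~321 kHz, set mainly by the load

Either way the corner ends up dominated by whichever resistance is the low
one, which is the point above.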

So in the end if anyone had been really worried by that it would have made
more sense to use a system that was closer to being matched rather than
idealise one of the mismatch extremes like voltage or current transfer!
Given that they were making up a new 'standard' I assume they could have
done that, but it would have meant defining a cable standard as well
as ones for source and load. Hardly rocket science, though!...
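
For what it's worth, a matched scheme would hang off the cable's
characteristic impedance rather than a lumped RC. A back-of-envelope sketch,
with per-metre inductance and capacitance I've simply assumed as typical for
coax rather than taken from any standard:

import math

# Assumed per-metre figures for a typical audio coax (illustrative only)
l_per_m = 250e-9    # inductance, henries per metre
c_per_m = 100e-12   # capacitance, farads per metre

z0 = math.sqrt(l_per_m / c_per_m)   # characteristic impedance
print(z0)                           # ~50 ohms for these assumed figures

Terminate both ends in something near that and the cable stops looking like a
simple shunt capacitance. But, as said, it would have meant specifying the
cable as well as the source and load.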


> I had a Quad 3 series that used DIN connectors throughout - but that was
> all low(ish) out high(ish) in.


Many people (including Armstrong) adopted the DIN plugs because they were
compact for stereo and we assumed they'd become the standard. But despite
adopting the physical plugs and sockets, we stayed with the tradition of
using the voltage transfer pattern. So we used low source impedances and
high load impedances for optimum voltage transfer.

That said, the Armstrong 600s did have (without mentioning it in the
handbooks) a second 'tape out' with a high impedance to drive any recorders
made to the DIN electrical standard. Maybe Quad had a keymatic board for that,
but I can't recall off-hand. The usual trick was just to shove in large
series resistors at source to get to the defined current level.
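
As a sketch of that trick, with a made-up output voltage and 'defined
current' standing in for the real DIN figures (which I'd have to look up):

# Faking a current source from a low-impedance voltage output with a
# large series resistor. All figures below are illustrative assumptions,
# not the real DIN numbers.
v_out = 0.5        # volts available from the line-level output stage
i_defined = 1e-6   # amps the (hypothetical) DIN-style input expects
r_load = 1e3       # ohms, the low-impedance destination input

r_series = v_out / i_defined - r_load
print(r_series)    # ~499 kohm; the load resistance barely matters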

Quite why DIN decided to adopt that approach I can't recall. Whatever their
theory, people ended up ignoring them. :-)

Slainte,

Jim

--
Please use the address on the audiomisc page if you wish to email me.
Electronics http://www.st-and.ac.uk/~www_pa/Scot...o/electron.htm
Armstrong Audio http://www.audiomisc.co.uk/Armstrong/armstrong.html
Audio Misc http://www.audiomisc.co.uk/index.html