January 15th 2011, 12:35 PM, posted to uk.rec.audio
Don Pearce[_3_]
ASA and Russ Andrews again;!...

On Sat, 15 Jan 2011 11:41:02 +0000 (GMT), Jim Lesurf
wrote:

In article , Don Pearce
wrote:
On Fri, 14 Jan 2011 17:13:42 +0000 (GMT), Jim Lesurf
wrote:


In article , Don Pearce
wrote:
On Fri, 14 Jan 2011 09:57:44 -0500, "Arny Krueger" wrote:


No, this simply isn't so. Matching a cable properly results in a flat
frequency response and a flat group delay. This reshapes the signal
perfectly.
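
A back-of-envelope sketch of that claim (an illustration, not something
from the thread; the 75 ohm impedance, 2 m length, propagation velocity
and termination values are all assumed). With both ends matched the
magnitude is flat and the phase is a straight line, i.e. a flat group
delay; with a mismatch at both ends the response ripples once the cable
is electrically long:

import numpy as np

Z0 = 75.0      # assumed characteristic impedance, ohms
length = 2.0   # assumed cable length, metres
v = 2.0e8      # assumed propagation velocity, m/s (roughly 0.67c)

def response(freq_hz, Zs, ZL):
    """Load voltage / source EMF for a lossless line, via its ABCD matrix."""
    beta_l = 2 * np.pi * freq_hz * length / v
    A = np.cos(beta_l)
    B = 1j * Z0 * np.sin(beta_l)
    C = 1j * np.sin(beta_l) / Z0
    D = np.cos(beta_l)
    return ZL / (A * ZL + B + Zs * (C * ZL + D))

f = np.linspace(1e3, 100e6, 2001)
matched    = response(f, Zs=75.0, ZL=75.0)   # both ends matched
mismatched = response(f, Zs=10.0, ZL=10e3)   # low-Z drive into a hi-Z input

# Matched: magnitude sits at exactly 0.5 of the EMF and the phase is
# -beta*l, a constant group delay.  Mismatched: the magnitude ripples
# strongly once the cable is an appreciable fraction of a wavelength.
print(np.ptp(np.abs(matched)))     # ~0, flat response
print(np.ptp(np.abs(mismatched)))  # large ripple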

Not if the cable loss changes with frequency. You can optimise by
playing with the matching, but not necessarily get a perfect output.

Over the kinds of frequency we are dealing with in audio, cables are
sensibly flat as regards loss.


However I was picking up your unqualified statement that matching would
give a 'perfect output'. I agree the departure from perfection should not
normally be an *audible* problem. Indeed, if people want to worry about
it, they should worry about the LF cable impedance departing from the
nominal value at high frequencies. :-)


I've always found it to be the other way round. Cable impedance is
pretty stable at high frequencies - it is only at low (kHz)
frequencies that the series and parallel resistance terms start
driving the impedance upwards. And of course at those low frequencies
it doesn't matter, because the electrical length is so short.
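
To put rough numbers on that, here is a sketch using
Z0 = sqrt((R + jwL)/(G + jwC)) with ballpark per-metre values for a
75 ohm coax (the R, L, G, C figures are illustrative assumptions, not
measurements of any real cable, and skin effect is ignored):

import numpy as np

# Ballpark per-metre values for a 75 ohm coax - illustrative only.
R = 0.04      # ohm/m, DC series resistance (skin effect ignored)
L = 377e-9    # H/m
G = 1e-9      # S/m, assumed tiny dielectric leakage
C = 67e-12    # F/m

def z0(freq_hz):
    """Characteristic impedance Z0 = sqrt((R + jwL)/(G + jwC))."""
    w = 2 * np.pi * freq_hz
    return np.sqrt((R + 1j * w * L) / (G + 1j * w * C))

for f in (1e3, 10e3, 100e3, 1e6, 10e6):
    print(f"{f/1e3:8.0f} kHz   |Z0| = {abs(z0(f)):6.1f} ohm")
# At 1 kHz the series resistance dominates and |Z0| is a few hundred
# ohms; by about 1 MHz it has settled to roughly the nominal 75 ohms.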


Sure there will still be errors, but of minuscule magnitude. I don't
think I've ever seen a cable that wasn't good for hundreds of MHz if
used properly.


What spdif cable bandwidth is required for, say, 100 ps of jitter with the
J-test? I'm curious about this as I'm wondering about transferring
192k/24bit as well as ye olde 44.1k/16bit.


Not sure - length is just as important as impedance, of course. It
would take a pretty horrible piece of cable to have a bandwidth low
enough to induce jitter at that sort of level. Of course what
matters is not the cable, but the entire system of which it is a part.
You can point the finger almost anywhere when it comes to timing
errors.
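
One crude way to put numbers on the 100 ps question (an assumed
single-pole model, not anything stated in the thread; the cable
bandwidths and the 5% level-error figure are purely illustrative):

def spdif_numbers(fs_hz, cable_bw_hz, level_error=0.05):
    cell_rate = 128 * fs_hz               # biphase-mark cells per second
    shortest_pulse = 1.0 / cell_rate      # seconds
    rise_time = 0.35 / cable_bw_hz        # classic single-pole estimate
    # crude conversion of a level error (fraction of the 10-90 swing)
    # into a timing error via the edge slope
    jitter = level_error * rise_time / 0.8
    return shortest_pulse, rise_time, jitter

for fs in (44.1e3, 192e3):
    for bw in (50e6, 100e6, 300e6):
        pulse, tr, tj = spdif_numbers(fs, bw)
        print(f"fs {fs/1e3:5.1f} kHz, cable BW {bw/1e6:3.0f} MHz: "
              f"shortest pulse {pulse*1e9:5.1f} ns, rise {tr*1e9:4.1f} ns, "
              f"jitter ~{tj*1e12:3.0f} ps")

On that crude model a path good to a few hundred MHz keeps the cable's
own contribution comfortably under 100 ps, which is consistent with the
point that only a dreadful cable would be the limiting factor.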

As I'm currently looking at some DACs, I've noticed statements that optical
spdif is limited to 48k. That isn't coax of course, but it made me wonder
where this stated limit is coming from, or even if it is true. That in turn
makes me wonder about jitter and transfer bandwidth/dispersion when people
are using higher sample rates, etc., than the now-traditional CD standard.
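
Some quick arithmetic on where the quoted 48k optical limit probably
comes from (S/PDIF carries 128 biphase-mark cells per sample frame; the
roughly 6 Mb/s rating attributed below to early TOSLINK parts is an
assumption here, not a checked datasheet figure):

for fs_khz in (44.1, 48, 96, 192):
    cell_rate = 128 * fs_khz * 1e3   # raw rate of biphase cells on the fibre
    print(f"{fs_khz:5.1f} kHz sample rate -> {cell_rate/1e6:6.3f} M cells/s")
# 48 k already needs about 6.1 M cells/s, roughly what the early optical
# parts were rated for; 96 k and 192 k need two and four times that,
# which is the likely origin of the quoted 48 k limit (newer optical
# parts are rated faster).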


I have a newly designed piece of kit testing right now, a Ka band
(30GHz) transceiver linking up and down from a satellite. The system
contains about a kilometre of assorted cables, twenty miles of fibre,
many, many filters, mixers, vaguely linear amplifiers and microwave
dishes being blown around by the bad weather. I don't know what the
bit error rate is yet, because it has only been running since just
before Christmas and there hasn't yet been an error. What I'm saying
is that I don't understand how any designer could ever have a problem
with a few feet of cable. I'm astounded.

d