
David Looser January 15th 11 12:59 PM

ASA and Russ Andrews again;!...
 
"Don Pearce" wrote

What I'm saying
is that I don't understand how any designer could ever have a problem
with a few feet of cable. I'm astounded.


Hear, hear! The idea that a little bit of jitter on a link between a digital
audio source and sink should cause measurable or audible degradation of the
analogue output is amazing. Do the designers of digital audio devices not
know how to design a decent clock recovery circuit?

David.



Jim Lesurf[_2_] January 15th 11 03:22 PM

ASA and Russ Andrews again;!...
 
In article , David Looser
wrote:
"Don Pearce" wrote


What I'm saying is that I don't understand how any designer could ever
have a problem with a few feet of cable. I'm astounded.


Hear, hear! The idea that a little bit of jitter on a link between a
digital audio source and sink should cause measurable or audible
degradation of the analogue output is amazing.


You may think so. But in practice PM and others (starting with Julian Dunn
IIRC - after whom the 'J test' waveform is named) have been measuring the
effect on the output on DACs for many years to get the data induced jitter
values they publish.

How audible it is in most cases is another kettle of worms. :-)
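
For anyone wanting to reproduce this, a minimal sketch of the J-test
stimulus as it is usually described (a tone at Fs/4 plus an LSB square
wave at Fs/192); the amplitudes and bit depth here are my assumptions
from the common published descriptions, not taken from PM's measurements:

import numpy as np

def j_test(fs=48000, seconds=1.0):
    # Tone at fs/4: the samples cycle exactly through 0, +A, 0, -A,
    # so the data itself carries no quantisation error.
    n = np.arange(int(fs * seconds))
    tone = 0.5 * np.sin(2 * np.pi * (fs / 4) * n / fs)
    # Square wave at fs/192 (250 Hz when fs = 48 kHz) toggling the
    # LSB of a 24-bit word, to provoke data-induced jitter.
    lsb = 1.0 / (2 ** 23)
    square = np.where(np.sin(2 * np.pi * (fs / 192) * n / fs) >= 0, lsb, -lsb)
    return tone + square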

Do the designers of digital audio devices not know how to design a
decent clock recovery circuit?


Lock-loops can reduce phase noise, but not to zero. And if they don't track
slow variations in the input (multiplexed) clock then a buffer that begins
'half full' may eventually reach one of its end-stops. So the results
can be measurable. What is less clear is what designers do in each case.
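
As a toy illustration of that first point, an idealised first-order
lock-loop passes jitter inside its bandwidth almost untouched and rolls
off jitter above it at 6 dB/octave; reduced, never zero:

import math

def pll_jitter_attenuation_db(f_jitter_hz, loop_bw_hz):
    # Jitter transfer magnitude of a first-order loop: a simple
    # low-pass in the phase domain.
    h = 1.0 / math.sqrt(1.0 + (f_jitter_hz / loop_bw_hz) ** 2)
    return 20 * math.log10(h)

# With a 1 kHz loop: 100 Hz wander passes (~0 dB), 20 kHz jitter
# is knocked down by about 26 dB.
for f in (100, 1000, 20000):
    print(f, round(pll_jitter_attenuation_db(f, 1000), 1), "dB")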

That said, I tend to think that the jitter is low enough not to be an
audible problem (to me, at least!) with the DACs I've tended to use. So I'm
really discussing this on the basis that it is measurable even if in my
general experience 'harmless'.

FWIW if you look in *last* month's HFN you can see measurements by PM of
how changing the *USB* cable feeding a DACMagic also changed the jitter
spectrum for a J test. You may not be surprised, though, to know that the
cable that is longer and cheaper gave lower jitter. ;-)

Slainte,

Jim

--
Please use the address on the audiomisc page if you wish to email me.
Electronics http://www.st-and.ac.uk/~www_pa/Scot...o/electron.htm
Armstrong Audio http://www.audiomisc.co.uk/Armstrong/armstrong.html
Audio Misc http://www.audiomisc.co.uk/index.html


Jim Lesurf[_2_] January 15th 11 03:36 PM

ASA and Russ Andrews again;!...
 
In article , David Looser
wrote:
"Jim Lesurf" wrote

What spdif cable bandwidth is required for, say, 100 ps of jitter with
the J-test? I'm curious about this as I'm wondering about transferring
192k/24bit as well as ye olde 44.1k/16bit.


The bit depth should make no difference, as SPDIF transmits 32 bits (24
of which are available for audio data) per sample regardless of the bit
depth of the transmitted audio. Any unused bits are simply set to zero.
OTOH the bit rate of the SPDIF link will scale with the audio sample
rate.


Thanks. Yes, that prompted me to look this up in Watkinson (Art of
Digital Audio, pg 450 in my copy).

So 32 bits per sample (i.e. per subframe), and biphase modulation by being
XORed with the clock in quadrature. In effect a nominal suppressed-carrier
frequency of 32 cycles per subframe.

So for 192k samples per second that comes out as
192,000 x 2 x 32 = 12.288 MHz

Have I made an error in the above, or does that seem correct?
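
As a cross-check on that arithmetic, a sketch (assuming the usual
32-bit subframes, two subframes per stereo frame, and biphase coding
at two cells per data bit, i.e. a cell rate of 128 x Fs):

def spdif_rates(fs):
    bit_rate = fs * 2 * 32      # raw subframe bits per second
    cell_rate = bit_rate * 2    # biphase cells actually on the wire
    return bit_rate, cell_rate

for fs in (44100, 48000, 96000, 192000):
    bits, cells = spdif_rates(fs)
    print(fs, bits / 1e6, "Mb/s data,", cells / 1e6, "Mbaud")

On those numbers 192k gives 12.288 Mb/s of data bits, as above, and
24.576 Mbaud on the wire. Read against the 13 Mb/s TOSLINK parts
mentioned below, 96k (12.288 Mbaud) just fits while 192k does not.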


As I'm currently looking at some DACs I've noticed statements that
optical spdif is limited to 48k.


The toslink transmitters and receivers I've bought recently claim to be
good to 13Mb/s, which should allow a 96kHz sample rate without
problems. 192kHz would be pushing it.


Curious that some makers' documents and other things I've read claim it is
limited to 48k/24bit. Maybe this is a limit of the actual optical TX and RX
they use, and they then say it is inherent to the system.

Slainte,

Jim



Jim Lesurf[_2_] January 15th 11 03:41 PM

ASA and Russ Andrews again;!...
 
In article , Don Pearce
wrote:
On Sat, 15 Jan 2011 11:41:02 +0000 (GMT), Jim Lesurf
wrote:



However I was picking up your unqualified statement that matching would
give a 'perfect output'. I agree the departure from perfection should
not normally be an *audible* problem. Indeed, if people want to worry
about it they should worry about the cable impedance at LF departing
from its nominal (high-frequency) value. :-)


I've always found it to be the other way round. Cable impedance is
pretty stable at high frequencies - it is only at low (kHz) frequencies
that the series and parallel resistance terms start driving the
impedance upwards.


Hence my comment about "LF cable impedance" above. :-)

What I'm saying is that I don't understand how any designer could ever
have a problem with a few feet of cable. I'm astounded.


What I'm not clear about is in what cases it is actually a problem. :-)
That the effect is measurable doesn't necessarily mean that. However I've
seen measured values ranging from well over 1,000 ps down to around 100 ps.
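
To put that range in context, a quick sketch using the standard
small-angle phase-modulation result for sinusoidal jitter on a sine
wave (the 10 kHz test-tone frequency is my illustrative choice):

import math

def jitter_sideband_dbc(f0_hz, jitter_s_peak):
    # Each sideband sits at beta/2 relative to the carrier, where
    # beta = 2*pi*f0*dt is the peak phase deviation in radians.
    beta = 2 * math.pi * f0_hz * jitter_s_peak
    return 20 * math.log10(beta / 2)

for ps in (100, 1000):
    print(ps, "ps ->", round(jitter_sideband_dbc(10e3, ps * 1e-12)), "dBc")

That puts 100 ps at about -110 dBc and 1,000 ps at about -90 dBc on a
10 kHz tone; the former is comfortably below a 16-bit noise floor, the
latter much less so, which is roughly where the measurable-versus-audible
question lives.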

Slainte,

Jim



Arny Krueger January 15th 11 04:27 PM

ASA and Russ Andrews again;!...
 
"Bob Latham" wrote in message

In article ,
David Looser wrote:
"Don Pearce" wrote


What I'm saying
is that I don't understand how any designer could ever
have a problem with a few feet of cable. I'm astounded.



Hear, hear! The idea that a little bit of jitter on a
link between a digital audio source and sink should
cause measurable or audible degradation of the analogue
output is amazing. Do the designers of digital audio
devices not know how to design a decent clock recovery
circuit?


What constitutes decent? What's audible (an easy standard), or what's
measurable (a very tough standard)?

One of the more humorous running jokes in audio has been the
obsession with low FM distortion (IOW, jitter) in digital gear, and the
deification of analog gear with humongous amounts of FM distortion.

About 3 years ago, I was reading the Arcam Forum on AV
Forums with interest in their new preamp-controller. At
that point people were waiting for their HDMI hi-def
audio. They were quite late to market and IIRC Arcam
engineers claimed this was due to serious issues with
jitter. Apparently it was much harder to get right than
with spdif due to the considerably higher bit rate. Again
IIRC there was also a claim that many other (HDMI) amps
were pretty grim in their jitter handling at that time.



Part of the problem is a lack of understanding and/or agreement about how
much jitter is too much.

In Arcam's market, it might be reasonable for them to adopt the way
that Stereophile measures and evaluates jitter as their standard.

AFAIK, Stereophile has been very aggressive about calling modest amounts of
jitter that have failed to be heard in carefully-done listening tests "too
much".



Don Pearce[_3_] January 15th 11 04:31 PM

ASA and Russ Andrews again;!...
 
On Sat, 15 Jan 2011 16:40:28 +0000 (GMT), Bob Latham
wrote:

In article ,
David Looser wrote:
"Don Pearce" wrote


What I'm saying
is that I don't understand how any designer could ever have a problem
with a few feet of cable. I'm astounded.



Hear, hear! The idea that a little bit of jitter on a link between a
digital audio source and sink should cause measurable or audible
degradation of the analogue output is amazing. Do the designers of
digital audio devices not know how to design a decent clock recovery
circuit?


About 3 years ago, I was reading the Arcam Forum on AV Forums with
interest in their new preamp-controller. At that point people were waiting
for their HDMI hi-def audio. They were quite late to market and IIRC Arcam
engineers claimed this was due to serious issues with jitter. Apparently
it was much harder to get right than with spdif due to the considerably
higher bit rate. Again IIRC there was also a claim that many other (HDMI)
amps were pretty grim in their jitter handling at that time.

Bob.


My satellite link is running at about 100 Mbits/sec.

d

Don Pearce[_3_] January 15th 11 04:38 PM

ASA and Russ Andrews again;!...
 
On Sat, 15 Jan 2011 12:27:02 -0500, "Arny Krueger"
wrote:

Part of the problem is a lack of understanding and/or agreement about how
much jitter is too much.


Well, that should be easy enough. Too much is where there is
bit-ambiguity at the clocking point. To get that, you need jitter of
the order of half a bit. A stereo stream of 192kS/sec and 24 bits has
a half bit duration of 54 nanoseconds, so peak jitter should
preferably be comfortably less than that.

For a CD the figure is more like a third of a microsecond. Utterly
trivial.
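
The arithmetic behind those two figures, as a sketch; note it uses the
raw audio bit rate, ignoring the subframe and biphase overhead discussed
earlier, which would shrink the margins by a factor of a few:

def half_bit_ns(fs, channels, bits):
    # Half of one data-bit period for a raw PCM stream, in ns.
    bit_rate = fs * channels * bits
    return 0.5 / bit_rate * 1e9

print(half_bit_ns(192000, 2, 24))   # ~54 ns
print(half_bit_ns(44100, 2, 16))    # ~354 ns, a third of a microsecond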

d

David Looser January 15th 11 06:39 PM

ASA and Russ Andrews again;!...
 
"Don Pearce" wrote in message
...
On Sat, 15 Jan 2011 16:40:28 +0000 (GMT), Bob Latham
wrote:

In article ,
David Looser wrote:
"Don Pearce" wrote


What I'm saying
is that I don't understand how any designer could ever have a problem
with a few feet of cable. I'm astounded.



Hear, hear! The idea that a little bit of jitter on a link between a
digital audio source and sink should cause measurable or audible
degradation of the analogue output is amazing. Do the designers of
digital audio devices not know how to design a decent clock recovery
circuit?


About 3 years ago, I was reading the Arcam Forum on AV Forums with
interest in their new preamp-controller. At that point people were waiting
for their HDMI hi-def audio. They were quite late to market and IIRC Arcam
engineers claimed this was due to serious issues with jitter. Apparently
it was much harder to get right than with spdif due to the considerably
higher bit rate. Again IIRC there was also a claim that many other (HDMI)
amps were pretty grim in their jitter handling at that time.

Bob.


My satellite link is running at about 100 Mbits/sec.


And HDMI can run a fair bit faster than that, depending on the resolution of
the video. HDMI is unusual in that it does have a separate clock line; one
of the four pairs within the HDMI cable carries clock (so in fed-up lurker's
terminology it is "clocked"). However in the context of audio over HDMI any
jitter on the audio is likely to be due to the probable non-integer
relationship between the link clock and the audio clock.

The only HDMI receiver chip I know anything about is the Sil 9135 and that
includes a crystal controlled audio clock generator. Clearly the crystal
frequency must be pulled slightly to synchronise it with the clock rate of
the audio source, but it seems to work well enough. I have many gripes about
HDMI (particularly the crap plug & socket), but jitter on the recovered
audio clock isn't one of them.
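
For reference, the mechanism HDMI provides for exactly this is Audio
Clock Regeneration: the source sends a pair of integers N and CTS and
the sink rebuilds the audio master clock as 128 x Fs = f_TMDS x N / CTS.
A sketch using the commonly published example values:

def regenerated_fs(f_tmds_hz, n, cts):
    # Sink-side reconstruction of the audio sample clock from the
    # link (TMDS) clock and the transmitted N/CTS pair.
    return f_tmds_hz * n / (cts * 128)

# A 74.25 MHz video clock with the recommended N = 6144 for 48 kHz
# audio needs CTS = 74250:
print(regenerated_fs(74.25e6, 6144, 74250))   # 48000.0

When the link clock and 128 x Fs are not in a ratio that N/CTS can
express exactly, CTS has to alternate between values over time, which
is one route by which jitter gets onto the recovered clock.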

David.





Don Pearce[_3_] January 15th 11 07:46 PM

ASA and Russ Andrews again;!...
 
On Sat, 15 Jan 2011 19:39:28 -0000, "David Looser"
wrote:

"Don Pearce" wrote in message
...
On Sat, 15 Jan 2011 16:40:28 +0000 (GMT), Bob Latham
wrote:

In article ,
David Looser wrote:
"Don Pearce" wrote

What I'm saying
is that I don't understand how any designer could ever have a problem
with a few feet of cable. I'm astounded.


Hear, hear! The idea that a little bit of jitter on a link between a
digital audio source and sink should cause measurable or audible
degradation of the analogue output is amazing. Do the designers of
digital audio devices not know how to design a decent clock recovery
circuit?

About 3 years ago, I was reading the Arcam Forum on AV Forums with
interest in their new preamp-controller. At that point people were waiting
for their HDMI hi-def audio. They were quite late to market and IIRC Arcam
engineers claimed this was due to serious issues with jitter. Apparently
it was much harder to get right than with spdif due to the considerably
higher bit rate. Again IIRC there was also a claim that many other (HDMI)
amps were pretty grim in their jitter handling at that time.

Bob.


My satellite link is running at about 100 Mbits/sec.


And HDMI can run a fair bit faster than that, depending on the resolution of
the video. HDMI is unusual in that it does have a separate clock line; one
of the four pairs within the HDMI cable carries clock (so in fed-up lurker's
terminology it is "clocked"). However in the context of audio over HDMI any
jitter on the audio is likely to be due to the probable non-integer
relationship between the link clock and the audio clock.

The only HDMI receiver chip I know anything about is the Sil 9135 and that
includes a crystal controlled audio clock generator. Clearly the crystal
frequency must be pulled slightly to synchronise it with the clock rate of
the audio source, but it seems to work well enough. I have many gripes about
HDMI (particularly the crap plug & socket), but jitter on the recovered
audio clock isn't one of them.

David.




Separate clock only really works for short connections. Anything
longer and it is likely to lose phase with the signal - bad news.
Clock recovery from a data stream is so easy, and guaranteed to be
timed right, that I'm surprised that HDMI uses a separate clock line.

d

David Looser January 15th 11 10:13 PM

ASA and Russ Andrews again;!...
 
"Don Pearce" wrote

Separate clock only really works for short connections. Anything
longer and it is likely to lose phase with the signal - bad news.
Clock recovery from a data stream is so easy, and guaranteed to be
timed right, that I'm surprised that HDMI uses a separate clock line.


HDMI is intended for short distances, such as from a set-top box or BD player
to a TV, and it was developed from the DVI system intended to connect a
monitor to a computer, so possibly the developers felt that using a separate
clock line made sense.

Unfortunately HDMI has ended up with no fewer than six separate pairs in the
cable: three data pairs, one clock pair, the EDID pair, which allows the
exchange of information on the capabilities of the sink, and the CEC pair for
optional control signals. Oh, and there's a 5V power supply pair as well.
The result is the need for a 19-pin plug with "early" and "late" mating
contacts. A complicated thing like that is either going to be expensive, or
unreliable. The HDMI developers went for unreliable. And because it's so
small it's near-enough impossible to replace in the field. So you have to
junk your expensive HD TV when the HDMI socket fails. :-(

David.



