April 24th 06, 04:42 PM, posted to alt.engineering.electrical,uk.rec.audio,rec.audio.tech
Floyd L. Davidson
10 metres audio cable going into PC = too long?

Jim Lesurf wrote:
In article , Roy L. Fuchs
wrote:
On Sun, 23 Apr 2006 12:24:51 GMT, (Don Pearce) Gave
us:


No - the wanted stuff is the signal - the rest is interference. Ever
heard of signal to noise ratio? You would call it signal to signal
ratio. Now that makes much more sense, doesn't it?


Even with s/n ratio, in an engineering analysis BOTH the signal AND
the noise are signals.


As with various of the other statements I have seen in this thread on
various sub-topics, the above seems to me to be an over-simplification.
Interesting to speculate if in this case it is the above statement that is
ambiguous, or the ways in which the terms are actually used by engineers
are ambiguous... Perhaps this supports the argument that people become
engineers because they can't communicate very well... :-)


Actually the language is probably a bit *too* precise for
non-engineers... and it gets worse too, because nobody had
mentioned "distortion" until your article.

First, here are correct technical definitions, from Federal Standard
1037C, for signal, noise, and distortion. (Just be aware that they
don't necessarily mean what one might think!)

signal:

1. Detectable transmitted energy that can be used to
carry information.

2. A time-dependent variation of a characteristic of
a physical phenomenon, used to convey information.

3. As applied to electronics, any transmitted
electrical impulse.

4. Operationally, a type of message, the text of which
consists of one or more letters, words, characters,
signal flags, visual displays, or special sounds,
with prearranged meaning and which is conveyed or
transmitted by visual, acoustical, or electrical means.

Note that it is something that "can be used to carry information",
but there is no requirement that "information" either be present or
be useful.

The energy used for AC power *is* a signal. In this thread
*all* references to hum (which clearly *does* carry information,
otherwise we would not be able to hear it and distinguish
it as unique) and to "power line" or "AC" are always correctly
referred to as a "signal", and may or may not be a "noise"
depending on the circumstance.

noise:

1. An undesired disturbance within the frequency
band of interest; the summation of unwanted or
disturbing energy introduced into a communications
system from man-made and natural sources.

2. A disturbance that affects a signal and that may
distort the information carried by the signal.

3. Random variations of one or more characteristics
of any entity such as voltage, current, or data.

4. A random signal of known statistical properties of
amplitude, distribution, and spectral density.

5. Loosely, any disturbance tending to interfere with
the normal operation of a device or system.

Each of those definitions carries some baggage, which usually
goes unnoticed until someone gets pedantic about technical terms.

Definition 1, the most precise and restrictive definition,
requires that the disturbance be "introduced", which implies
that it originate external to the circuit itself. That is the
difference between "noise" and "distortion", when the two are
differentiated. Generally though, a distortion is a noise, but
a noise is not necessarily a distortion. (Much as a signal
might be noise, but noise is not necessarily a signal.)

Definition 2 includes the term "distort". Definitions 3 and 4
use the term "random". And definition 5 is the more commonly
used catch-all term.

distortion:

1. In a system or device, any departure of the
output signal waveform from that which should
result from the input signal waveform's being
operated on by the system's specified, i.e.,
ideal, transfer function.

Note: Distortion may result from many mechanisms.
Examples include nonlinearities in the transfer
function of an active device, such as a vacuum
tube, transistor, or operational amplifier.
Distortion may also be caused by a passive
component such as a coaxial cable or optical fiber,
or by inhomogeneities, reflections, etc., in the
propagation path.

2. In start-stop teletypewriter signaling, the shifting
of the significant instants of the signal pulses
from their proper positions relative to the beginning
of the start pulse.

Note: The magnitude of the distortion is expressed in
percent of an ideal unit pulse length.

The significance of the distinction between noise and distortion
might be lost on anyone but a design engineer, or perhaps a
theoretical physicist. At a maintenance and operations level,
it makes no difference.
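
To make definition 1 concrete, here is a minimal Python sketch
(my own toy amplifier, not part of FS-1037C; the gain and clip
level are invented).  The specified transfer function is a plain
linear gain, the real device clips, and the distortion is simply
the departure of the actual output from the ideal output:

    # Toy example (mine, not from FS-1037C): distortion as the departure
    # of the output from what the specified (ideal) transfer function
    # would have produced.
    IDEAL_GAIN = 3.0     # specified transfer function: output = 3 * input
    CLIP_LEVEL = 1.0     # hypothetical supply-rail limit of the real device

    def actual_output(x):
        """Linear gain that clips at the rails."""
        y = IDEAL_GAIN * x
        return max(-CLIP_LEVEL, min(CLIP_LEVEL, y))

    for x in (0.1, 0.3, 0.5):
        ideal = IDEAL_GAIN * x
        actual = actual_output(x)
        print(f"in {x:3.1f}: ideal out {ideal:4.2f}, actual out {actual:4.2f}, "
              f"distortion {actual - ideal:+5.2f}")

Below the clip level the departure is zero; once the ideal
output exceeds the rail, the difference between the two
waveforms is the distortion.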

If you go back to some of the early sources [e.g. 1] then you can find some


Ahem, Shannon is an "early source"???? Telecommunications as we
know it today was a hundred years old by the time Shannon began
publishing! And that has only been ~60 years now. I spent many
years working on equipment that was designed before Shannon...

that describe what is observed by the receiver/destination as something
like a 'received signal' which may include some 'noise' (and some
distortion or other systematic alterations).[2]

However the sources also routinely refer to 'signal to noise' ratio.

Shannon seems to resolve this by distinguishing between the 'signal' (i.e.
what the source transmitted) and the 'received signal' (i.e. what the
destination actually observed to arrive).


Shannon does not exclude noise from being a signal. He merely
uses the proper terms to distinguish between different signals,
with the realization that we have no interest in the information
carried by some signals... :-)

So if we were to use a term like 'received signal' in the above statement
it would essentially become either a tautology or self-referential as the
signal includes the noise.


What is commonly called "Signal to Noise Ratio" is more
correctly called "Signal + Noise to Noise Ratio". In
circumstances where the ratios are greater than, say, 15-20 dB
or so, it is of little importance. Hence in typical
telecommunications voice channels it is rarely considered. On
the other hand in some data circuits and when applied to noise
figures for microwave radio receivers, where the ratios are much
smaller, the fact that the signal is actually Signal + Noise is
important.
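
As a rough illustration (my own arithmetic, not from anyone's
post above), here is a small Python sketch comparing the true
S/N with the (S+N)/N that a meter actually reads.  Above
15-20 dB the difference all but vanishes:

    import math

    def db(power_ratio):
        """Convert a power ratio to decibels."""
        return 10 * math.log10(power_ratio)

    for snr_db in (40, 20, 10, 3):
        s_over_n = 10 ** (snr_db / 10)   # true signal-to-noise power ratio
        measured = s_over_n + 1          # (S+N)/N, what is actually measured
        print(f"S/N = {snr_db:4.1f} dB   (S+N)/N = {db(measured):5.2f} dB   "
              f"difference = {db(measured) - snr_db:4.2f} dB")

At 40 dB the difference is a few hundredths of a dB; at 3 dB the
measured figure overstates the true S/N by nearly 2 dB.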

Thus the problem with the statement is that it
is unclear due to the ambiguous use of 'signal'. Hence, as often is the
case with such ambiguous statements, people start arguing about the meaning
when they are simply using different definitions which the ambiguity
allows. :-)


Ah, but ignorance on the part of some is not the fault of those
who actually *are* using the term without ambiguity. Some
posters, Don Pearce being the most obvious, have not understood
the term and have been confused, and made efforts at confusing
others.

But that doesn't mean the terms are actually ambiguous.

FWIW for the above reason, when teaching Information Theory/ Comms/
Instrumentation I tended to use another approach which is common in the
area. This is to say that a 'signal' means that the pattern (or part of the
pattern) *is used to convey information content*.


Note the difference between something that "can" and something
that "is". Also, "information" seems to be misunderstood in
that definition... if you are suggesting that "hum" is a noise
that does not contain information, that is not the case. :-)

Thus in the context of communications a 'signal' means that the sender and
destination have to have pre-agreed the coding/modulation system to be
employed, and the meanings of the code symbols or distinguishable patterns.


That would not fit the typical way the term is used in practice by
people who work in the telecommunications field.

In the context of a physical scientist making observations - e.g. an
astronomer observing what can be received from a distant radio galaxy - the
'signal' means that the observed pattern will be used to obtain information
about the distant source.


Again, "can" is appropriate, but "will be" is going to cause a
misunderstanding.

The status of 'signal' then stems from the deliberation or requirement that
it conveys information on a defined basis.


That is too restrictive.

In both contexts what distinguishes 'signal' from 'noise' is the
information conveyance the 'signal' provides, and that 'noise' tends to
obscure, or limit, or make uncertain, the information recovery. This then


And it might well be the information carried by the noise signal that
makes the information from the desired signal uncertain...

helps make clear the actual meaning in practice of terms like 'signal to
noise ratio'. (Although there may then be hours of fun for all the family
as they argue about the distinction in this phrase between assuming
'signal' means either the intended/transmitted or the 'received' signal.
:-) )

Slainte,

Jim

[1] e.g. Shannon
http://cm.bell-labs.com/cm/ms/what/s...day/paper.html

Everyone who has any interest in effective communications should
study what Claude Shannon summarized. It is absolutely
fascinating to read.

[2] Probably best at this point not to start worrying about distortion as
being 'signal' or not... ;-)


Can it contain information?

Distortion can *always* be counteracted by the introduction of
an "error signal" which is opposite to the distortion.
Therefore it would seem that distortion is necessarily a signal
in all cases.
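
A trivial numeric sketch of that idea (my own toy square-law
error, not from any post here): the distortion component is the
difference between the actual and the ideal output, and adding
its opposite restores the ideal waveform:

    # Sketch of the "error signal" idea (toy nonlinearity invented for
    # illustration): the distortion is the difference between the actual
    # and the ideal output, and adding its opposite restores the ideal.
    def ideal(x):
        return 2.0 * x                   # ideal transfer function

    def distorted(x):
        return 2.0 * x + 0.1 * x * x     # same device with a square-law error

    for x in (0.2, 0.6, 1.0):
        error = distorted(x) - ideal(x)  # the distortion, expressed as a signal
        corrected = distorted(x) - error # apply the opposite "error signal"
        print(f"x={x:3.1f}: distorted={distorted(x):5.3f}  corrected={corrected:5.3f}")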

--
Floyd L. Davidson http://www.apaflo.com/floyd_davidson
Ukpeagvik (Barrow, Alaska)