On Mon, 20 Mar 2006 18:08:54 GMT, "harrogate2" wrote:
> Could I deign to suggest that if there is really any difference when
> the filter plug is in place, it is because it is reducing line noise
> which had been getting through the (usually) poor PSU of the CD/DVD
> player and upsetting its decoding.
This is British mains we are talking about here, so no, you can't. We
are talking about someone with too much time on his hands and a
desperate need for self-validation.
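
For what it is worth, here is the order of magnitude of what such a
plug-in filter can actually do. A minimal sketch, treating it as one
ideal LC low-pass section; the 1 mH / 100 nF values and the 40
dB/decade approximation are assumptions typical of a commodity filter,
not measurements of any product:

import math

L, C = 1e-3, 100e-9                 # assumed filter inductor and cap
f0 = 1 / (2 * math.pi * math.sqrt(L * C))
print(f"corner frequency: {f0 / 1e3:.0f} kHz")

for f in (50.0, 10e3, 1e6):
    # ideal second-order roll-off above the corner, flat below it
    atten_db = 40 * math.log10(f / f0) if f > f0 else 0.0
    print(f"{f:>9.0f} Hz: ~{atten_db:.0f} dB attenuation")

Note where the action is: nothing at all at 50 Hz, and serious
attenuation only up at radio frequencies - so even granting the
mechanism, such a filter could only be removing RF hash, not
reshaping anything audible.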
> I think I am right in saying it has been found that if the decoding
> error rate of a CD player is reduced by whatever means - be that a
> new, clean disc and/or a clean lens - then, because it is not
> struggling so hard to error-correct the decoded music, the end
> result is a subjective improvement?
If errors are correctable, there is *no* difference. If errors are not
correctable, you get clicks and dropouts. There is no suggestion of
these. The claims suggest that digital bits are being changed in a
systematic way to affect dynamic range and other aspects of the
decoded signal. It all just beggars belief.
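
A minimal sketch of why "correctable" means bit-identical. A CD player
uses CIRC (cross-interleaved Reed-Solomon), not this toy Hamming(7,4)
code, but the all-or-nothing property is the same: within the code's
capacity the decoder returns exactly the original bits, not an
approximation of them.

def encode(d):
    # d is four data bits; returns a 7-bit Hamming(7,4) codeword
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4           # parity over positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4           # parity over positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4           # parity over positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c):
    # recompute the parities; the syndrome is the 1-based error position
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:                # non-zero: flip the corrupted bit back
        c = c[:]
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
word = encode(data)
word[4] ^= 1                    # corrupt one bit "on the disc"
assert decode(word) == data     # recovered bits are identical

Flip two bits and this toy code fails outright; in a CD player an
uncorrectable error is concealed by interpolation or muting - a click
or a dropout, never a quiet systematic shift in dynamics.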
> One thing is sure. Most power supplies have some decent-sized
> electrolytics in them, but few have any smaller-value caps in
> parallel to handle the higher frequencies. The late great John
> Linsley Hood (and I seem to think Doug Self also) produced much on
> this topic decades ago.
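
Numerically, here is what that bypass-cap argument amounts to. A rough
sketch using the series R-L-C impedance model; the capacitor values
and the ESR/ESL parasitics are assumed typical-datasheet figures, not
measurements:

import math

def z_cap(f, c, esr, esl):
    # |Z| of a real capacitor modelled as ESR + ESL + C in series
    w = 2 * math.pi * f
    return math.sqrt(esr ** 2 + (w * esl - 1 / (w * c)) ** 2)

for f in (100.0, 100e3, 10e6):
    big = z_cap(f, 4700e-6, esr=0.05, esl=30e-9)   # 4700 uF electrolytic
    small = z_cap(f, 100e-9, esr=0.02, esl=2e-9)   # 100 nF ceramic bypass
    print(f"{f:>10.0f} Hz: electrolytic {big:7.3f} ohm,"
          f" ceramic {small:9.3f} ohm")

With those numbers the electrolytic stays well under an ohm through
the whole audio band and beyond; the small bypass only starts to earn
its keep above the electrolytic's self-resonance, up in the MHz
region.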
Power supplies are just fine, and the common-mode rejection ratio of
balanced amplifier stages is better still. If any such problem were
possible with a balanced solid-state amp, then a single-ended valve
amp plugged into the same supply would be driven to limiting by
mains-borne noise.
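
For scale, a back-of-envelope sketch of that CMRR point. The 1 V of
common-mode mains hash and the 90 dB rejection figure are assumptions
for illustration, not measurements of any particular amp:

import math

v_cm = 1.0                      # assumed common-mode mains hash, volts
cmrr_db = 90.0                  # assumed CMRR of a decent balanced stage
v_residual = v_cm / 10 ** (cmrr_db / 20)
v_signal = 2.0                  # roughly full-scale line level, volts
print(f"residual noise: {v_residual * 1e6:.1f} uV")
print(f"relative to signal:"
      f" {20 * math.log10(v_residual / v_signal):.0f} dB")

That residual comes out around -96 dB relative to full scale - at the
dither floor of the 16-bit CD format itself.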
d
Pearce Consulting
http://www.pearce.uk.com