Cables - The Antepenultimate Answer.
Menno wrote:
Eiron wrote:
Shock result - interconnects sound the same!
As an interconnect test I soldered up a short adapter (4cm) with a
male phono, two female phonos and a jack plug.
This allowed me to take a mono source and feed it directly to the
left channel of a sound card, while also sending it through the
interconnect under test to the right channel of the sound card.
The source was Donna Summer's 'Bad Reputation', which has a lot of
treble.
I would have used Abba's 'The Day Before You Came' but I get a bit
emotional listening to that which might spoil the results.
Using Goldwave and comparing the source directly with the source
passed through the oldest, cheapest interconnect in my junkbox
revealed that the difference peaked at -57.9dB. After taking the
difference and maximising it, the residual sounded like the original,
only noisier and with a grainy quality to the treble.
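The null test described above can be sketched in a few lines of NumPy. This is my own illustration of the method, not Eiron's actual Goldwave steps: subtract the cable-path channel from the direct channel and express the peak of the residual relative to the peak of the source, in dB.

```python
import numpy as np

def null_test_db(direct, through_cable):
    """Peak of (direct - through_cable), in dB relative to the peak of direct."""
    direct = np.asarray(direct, dtype=float)
    through_cable = np.asarray(through_cable, dtype=float)
    diff = direct - through_cable
    return 20.0 * np.log10(np.max(np.abs(diff)) / np.max(np.abs(direct)))

# Toy example: a 1 kHz sine, with the "cable" channel 0.1% lower in level.
t = np.linspace(0, 1, 48000, endpoint=False)
src = np.sin(2 * np.pi * 1000 * t)
print(null_test_db(src, 0.999 * src))  # about -60 dB
```

Note that a pure 0.1% level mismatch alone puts the residual near -60dB, the same ballpark as the measured -57.9dB, which is why balancing the channels before subtracting matters.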
If anyone can hear anything at 58dB below the music level he must have
better ears than me.
I haven't yet calibrated the sound card, i.e. balanced the two
channels, but these preliminary results suggest that the squirrel is nuts.
What differences did you see? Did you also look at phase plots or just
amplitude? My experience is that the audible difference between interlinks
lies more in the spatial information than in the attenuation of high
frequencies. So with better interlinks you get more of the stereo image
embedded in the recording: more depth and better precision in where
sounds come from.
Why don't you try it yourself and see?
On a second look at the results, the signal through the crappy cable was
down by 0.1%, or about 0.01dB. So by using Goldwave to boost its level to
match, the difference dropped to -65dB, with noise at about -70dB and the
voice barely distinguishable.
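The level-matching step can be sketched the same way. Again this is my own illustration, not the exact Goldwave procedure: fit a least-squares gain that maps the cable channel onto the direct one before subtracting, so a pure level difference no longer dominates the residual and only genuine distortion (here, an assumed tiny high-frequency artefact) remains.

```python
import numpy as np

def matched_null_db(direct, through_cable):
    """Residual after best-fit gain matching, in dB relative to the peak of direct."""
    direct = np.asarray(direct, dtype=float)
    through_cable = np.asarray(through_cable, dtype=float)
    # Least-squares gain that best maps the cable channel onto the direct one.
    g = np.dot(direct, through_cable) / np.dot(through_cable, through_cable)
    diff = direct - g * through_cable
    return 20.0 * np.log10(np.max(np.abs(diff)) / np.max(np.abs(direct)))

t = np.linspace(0, 1, 48000, endpoint=False)
src = np.sin(2 * np.pi * 1000 * t)
# Hypothetical cable channel: a 0.1% level drop (which the gain fit removes
# entirely) plus a small 15 kHz component standing in for real distortion.
cable = 0.999 * src + 1e-4 * np.sin(2 * np.pi * 15000 * t)
print(matched_null_db(src, cable))  # about -80 dB
```

With the level error fitted out, the residual drops to the floor set by the distortion term alone, which matches the post's experience that balancing the channels pushed the difference from -57.9dB down towards the noise.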
Which still shows that a crappy 1 metre interconnect makes no audible difference.
--
Eiron
There's something scary about stupidity made coherent - Tom Stoppard.