I often see texts that imply or state outright that ringing in time-domain
audio waveforms, usually associated with time-domain discontinuities,
is to be avoided. I wonder what the basis for this view is.
For example, if you take a perfect square wave with all of its harmonics
and mathematically remove all of those above a certain frequency, you will
see a waveform with ringing. I believe this is referred to as the Gibbs
phenomenon.
However, far from being a problem, assuming you do have to
band-limit a signal, the presence of ringing (in this case, anyway)
seems to show:
- perfect removal of all frequencies above a certain point; and
- perfectly preserved amplitude and phase relationships amongst the
remaining frequency components.
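To make this concrete, here is a minimal Python/NumPy sketch (my own
illustration, not taken from any particular text) that builds a
band-limited square wave by summing its Fourier series up to a cutoff
and measures the resulting overshoot; the fundamental, sample rate, and
cutoff values are arbitrary choices:

    import numpy as np

    # Build a band-limited square wave by summing its Fourier series
    # up to a cutoff frequency, then measure the Gibbs overshoot.
    f0 = 100.0       # fundamental frequency, Hz (arbitrary choice)
    fs = 48000.0     # sample rate, Hz (arbitrary choice)
    cutoff = 2000.0  # remove every harmonic above this frequency

    t = np.arange(0.0, 2.0 / f0, 1.0 / fs)  # two fundamental periods

    # Fourier series of a +/-1 square wave: odd harmonics n only,
    # each with amplitude 4/(pi*n).
    x = np.zeros_like(t)
    n = 1
    while n * f0 <= cutoff:
        x += (4.0 / (np.pi * n)) * np.sin(2.0 * np.pi * n * f0 * t)
        n += 2

    # The ideal square wave peaks at 1.0; the truncated series
    # overshoots to roughly 1.18 (about 9% of the full jump) however
    # many harmonics are kept -- the Gibbs phenomenon.
    print(f"harmonics kept: {(n - 1) // 2}, peak: {x.max():.4f}")

Running this with different cutoffs shows the overshoot staying at
roughly the same height while the ripples compress towards the
transitions, which is exactly what I mean: the ringing is a consequence
of the truncation itself, not of any defect in how it was done.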
As long as a filter preserves all human-audible frequencies I cannot see
an objection (to this form of ringing, at least). The only argument
for objecting that I can think of is the possibility that the non-linear
behaviour of the ear may produce audible artefacts that fully linear
hearing would not. Indeed, I do see articles that cast doubt on the
audibility of ringing, for example
http://www.stereophile.com/reference/106ringing/.
So is ringing a bad thing per se? Or are there specific forms that
cause problems?
--
John Phillips