In article , David Pitt wrote:
Jim Lesurf wrote:
done, and the user knows the snags, etc. Yet with a computer it tends to
be done without the user knowing, or being told, how the processing is
being done.
In a similar vein, this about Mac OS makes a good read; the author seems
to know what he is talking about.
http://scopeboy.com/scopeblog/?tag=resampling
The article is interesting. But it also indicates the problems.
Firstly, the basic error of having a system which imposes resampling when
it may simply not be necessary. If you output to a DAC that can cope with a
range of sample rates, then the default should be to send it the samples at
the source rate, not impose a needless change with the risk that it will
degrade the info.
Secondly, the above gives a link to what is described as if it were a
program, but which is actually just a set of coefficient values for use in
a resampling process. No real details of how the process is done.
The problem is then as follows. Is it really the case that every new
resampled value is computed using around 3000 coefficient x input sample
multiply-and-adds? If so, that seems an insanely over-demanding method.
That kind of thing is OK for dedicated hardware, but sheer blind brute
force and ignorance for a CPU in a general computer *when you could have
left the data alone*.
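To make the cost concrete, here is a minimal sketch (in Python, purely illustrative) of what such brute-force windowed-sinc resampling amounts to. Every output sample costs one multiply-add per filter tap, so a ~3000-tap filter means ~3000 multiply-adds per output sample, i.e. of order 290 million multiply-adds a second at a 96k output rate. The tap count, kernel, and window below are assumed examples, not the actual filter the page links to:

```python
import math

def resample_point(x, t, taps=64):
    """Brute-force windowed-sinc interpolation: estimate the signal at
    fractional position t (in input-sample units) as a weighted sum of
    the `taps` nearest input samples. Cost: `taps` multiply-adds per
    output sample -- with taps ~3000 that is the workload questioned in
    the text above. (taps=64 here is just an illustrative choice.)"""
    half = taps // 2
    n0 = int(math.floor(t))
    acc = 0.0
    for k in range(n0 - half + 1, n0 + half + 1):
        if not (0 <= k < len(x)):
            continue
        d = t - k                      # offset of this tap from t
        sinc = 1.0 if d == 0.0 else math.sin(math.pi * d) / (math.pi * d)
        # Hann window to taper the truncated sinc kernel
        win = 0.5 + 0.5 * math.cos(math.pi * d / half) if abs(d) < half else 0.0
        acc += x[k] * sinc * win       # one multiply-add per tap
    return acc

# Sanity check: interpolating a constant signal midway between two
# input samples should give back (approximately) that constant.
x = [1.0] * 200
y = resample_point(x, 100.5)
```

The point stands either way: if input and output rates match, none of this work is needed at all.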
And what level of accuracy will you then get when using single precision?
How accurate are the coefficients, given they aren't in floating-point
format, nor binary, etc?
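On the single-precision point, a quick way to see the scale of the issue is to force a long multiply-accumulate through IEEE-754 single precision and compare with the double-precision result. This is only a sketch with made-up random coefficients, not the actual values the page links to:

```python
import random
import struct

def f32(x):
    """Round a Python float (double) to IEEE-754 single precision."""
    return struct.unpack('f', struct.pack('f', x))[0]

random.seed(42)
coeffs  = [random.uniform(-1e-3, 1e-3) for _ in range(3000)]  # made-up values
samples = [random.uniform(-1.0, 1.0)   for _ in range(3000)]

# Reference: the whole sum carried in double precision.
exact = sum(c * s for c, s in zip(coeffs, samples))

# The same sum with every product and every partial sum rounded to
# single precision, as a float32-only implementation would do it.
acc = 0.0
for c, s in zip(coeffs, samples):
    acc = f32(acc + f32(f32(c) * f32(s)))

err = abs(acc - exact)   # rounding error accumulated over ~3000 multiply-adds
```

With ~3000 terms the accumulated rounding error grows well beyond the 2^-24 single-precision epsilon of a single operation, which is presumably why one would want double precision (or error-compensated accumulation) for a filter this long near 24-bit audio resolution.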
So it does look to me like another example of where those working on
general computing just don't understand the problem, and apply irrelevant,
needless, or less than optimum processes.
Slainte,
Jim
--
Please use the address on the audiomisc page if you wish to email me.
Electronics
http://www.st-and.ac.uk/~www_pa/Scot...o/electron.htm
Armstrong Audio
http://www.audiomisc.co.uk/Armstrong/armstrong.html
Audio Misc
http://www.audiomisc.co.uk/index.html