In article , Ian Molton wrote:
> On Sat, 24 Jan 2004 09:42:02 +0000 (GMT) Jim Lesurf
> wrote:
>> *IF* the receiver uses the timing of the 'edges' of the waveform as its
>> clock indication, the result is that the apparent clock will then
>> jitter with the data pattern *even if the source had a perfect clock*.
> Ugh. Surely no-one tries to use the raw waveform as the clock?
Probably not, in simple terms. However, the problem is that the source clock
is 'buried' in the S/PDIF stream, so that stream is all you're given, and
you then have to devise a suitable way to extract the clock. Hence you can
apply careful design, but the problem has to be dealt with whenever you
receive S/PDIF.
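As an aside, a toy illustration of the kind of clock extraction involved: a minimal first-order software PLL tracking jittered edge times. The unit-period clock and the +/-0.05 jitter figure are made-up assumptions of mine, not anything from the thread - real S/PDIF receivers are rather more involved.

```python
import random

def recover_clock(edge_times, nominal_period, loop_gain=0.01):
    """Track edge times with a first-order PLL.

    A small loop_gain averages over many edges, so data-dependent
    edge jitter is smoothed out; a large gain would follow it.
    """
    phase = edge_times[0]            # predicted time of next edge
    recovered = []
    for t in edge_times[1:]:
        # advance the prediction by whole periods up to this edge
        n = max(1, round((t - phase) / nominal_period))
        phase += n * nominal_period
        error = t - phase            # timing error vs the prediction
        phase += loop_gain * error   # nudge the recovered clock
        recovered.append(phase)
    return recovered

# Perfect 1.0-unit clock with +/-0.05 pattern-dependent edge jitter
random.seed(0)
edges = [i + 0.05 * random.choice([-1, 1]) for i in range(500)]
rec = recover_clock(edges, 1.0)
residual = max(abs(r - round(r)) for r in rec[300:])
print(residual)  # well below the 0.05 raw edge jitter
```

The point of the narrow loop bandwidth is exactly the one above: with a small gain the recovered clock averages over many edges, so the data-pattern jitter on individual edges largely cancels out.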
> Surely if you design a good atomic clock you have a stable frequency
> reference, and so all you need to do is build two clocks and adjust the
> phase?
The snag with atomic clocks is that you are using line frequencies that
were chosen precisely on the basis that nothing much affects the frequency
you get. Hence tweaking their output can be awkward. This is a consequence
of something Stewart has pointed out. VCOs tend to have more phase noise
than 'fixed' oscillators. The tuning mechanism will have noise, and this
then ends up noise-modulating the frequency.
Hence you're more likely to build two clocks, get them stable, and then
calibrate their phase/frequency differences by comparing their outputs over
a long enough period. In general you can count cycles as well, so you can
break the comparison for a while and come back later.
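To make the cycle-counting idea concrete, here's a toy sketch with numbers of my own choosing (not from the thread): two clocks compared purely by their accumulated cycle counts over a day, with neither oscillator being steered.

```python
def frequency_offset(f_nominal, cycles_a, cycles_b, interval):
    """Fractional frequency difference (B relative to A) inferred
    from cycle counts accumulated over `interval` seconds."""
    return (cycles_b - cycles_a) / (f_nominal * interval)

f0 = 10e6                      # 10 MHz nominal output
dt = 86400.0                   # compare over one day
true_offset = 3e-13           # clock B runs fast by 3 parts in 1e13
cycles_a = f0 * dt             # reference clock's count
cycles_b = f0 * (1 + true_offset) * dt

est = frequency_offset(f0, cycles_a, cycles_b, dt)
print(est)  # recovers ~3e-13
```

Note that a day's integration resolves an offset that would be invisible over a few seconds - which is why the comparison can be interrupted and resumed later, so long as no cycles are miscounted in between.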
> What strategies do they use to synch atomic clocks?
See above. It tends to be more a process of comparisons and
cross-calibrations. I'm not up on this, but I think the best modern clocks
use things like ion traps with laser 'cooling' of the atomic population.
However this isn't my field, so I'm just trying to recall tea-time
conversations I've had with people over the years! :-)
> Or, more to the point, what's the highest frequency they can work with
> now - I assume the degree of synchronisation defines the effective
> precision of the baseline, and thus the minimum wavelength they can
> reliably use for interferometry?
Don't know the current state of the art. However I think people now do a
few hundred GHz with arrays over the order of a km, and lower frequencies
over much wider baselines. One of the problems here, though, is data
bandwidth, not the center frequency. Hence it isn't so much the center
frequency you're using that may be the limit, but the tendency for
astronomers to want GHz's of data bandwidth. If you drop the bandwidth you
can often go to a longer baseline, but you then end up with less
information from what you are trying to look at. Pays yer money and takes
yer choice... :-)
There the baselines are typically a few thousand km.
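The bandwidth-versus-data-rate trade is easy to put rough numbers on: the recorded rate per antenna scales linearly with observing bandwidth. A back-of-envelope sketch, assuming Nyquist sampling, 2 bits per sample, and two polarisations - all my assumptions, not figures from the post:

```python
def data_rate_bps(bandwidth_hz, bits_per_sample=2, n_pol=2):
    """Nyquist-sampled recording rate: 2 samples/s per Hz of
    bandwidth, times sample depth, times polarisations."""
    return 2 * bandwidth_hz * bits_per_sample * n_pol

for bw in (32e6, 512e6, 1e9):
    print(f"{bw/1e6:6.0f} MHz -> {data_rate_bps(bw)/1e9:.2f} Gbit/s")
```

So a GHz of bandwidth means several Gbit/s per antenna that has to be shipped or recorded, which is precisely what strains the longer baselines.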
> I had heard GPS could be used as a frequency reference for this - but
> have no idea how good it is, beyond knowing it's good enough to have to
> care about relativistic effects.
I suspect that GPS would have too poor a short-term phase stability for
'direct' use, due to atmospheric propagation errors. However they may use
it for medium- and long-term reference checks on the 'local' clock used at
any antenna station. Countries like the USA, UK, etc. transmit clock
signals by radio as well for these purposes. Rugby MSF is the one most
people know about, and the long-term accuracy of this should be good. There
are loads of these all around the world, though, often at a few MHz IIRC.
However I'm not in this area, so you'd need an up-to-date radioastronomer
to say what methods they use in detail these days.
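For what it's worth, the usual way to quantify "poor short-term but good long-term" stability is the Allan deviation. A minimal sketch on synthetic data - white phase noise on an otherwise perfect clock, entirely my own illustration:

```python
import random

def allan_dev(phase, tau0, m):
    """Overlapping Allan deviation from phase samples (in seconds)
    spaced tau0 apart, at averaging factor m (tau = m * tau0)."""
    tau = m * tau0
    diffs = [phase[i + 2*m] - 2*phase[i + m] + phase[i]
             for i in range(len(phase) - 2*m)]
    avar = sum(d * d for d in diffs) / (2 * tau**2 * len(diffs))
    return avar ** 0.5

# Synthetic reference: 1 ns of white phase noise, one sample per second
random.seed(1)
x = [1e-9 * random.gauss(0, 1) for _ in range(10000)]

short = allan_dev(x, 1.0, 1)    # stability at tau = 1 s
long_ = allan_dev(x, 1.0, 100)  # stability at tau = 100 s
print(short, long_)  # long_ is far smaller: averaging time helps
```

For this noise type the deviation falls roughly as 1/tau, which is the shape of argument behind using GPS (or MSF) only for medium- and long-term checks while trusting the local clock over short intervals.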
Slainte,
Jim
--
Electronics
http://www.st-and.ac.uk/~www_pa/Scot...o/electron.htm
Audio Misc
http://www.st-and.demon.co.uk/AudioMisc/index.html
Armstrong Audio
http://www.st-and.demon.co.uk/Audio/armstrong.html
Barbirolli Soc.
http://www.st-and.demon.co.uk/JBSoc/JBSoc.html