
February 7th 12, 02:28 PM
posted to uk.rec.audio,uk.tech.broadcast
Audio Precision System One Dual Domain Measurement Systems
Dave Plowman (News) wrote:
In article ,
Bill Wright wrote:
I used to have a 12V DC kettle and it managed to boil two cups of water
quite quickly. It pulled about 30A!
With only 360 watts, 'quite quickly' wouldn't be how I'd describe it. A 3kW
kettle (rare these days) does that quickly.
I don't remember how long it took. But I only used to put enough water
in it for one or two cups (depending on whether I had company) and I used to
run the engine while it boiled. It wasn't all that long, but obviously
longer than a mains kettle.
My expectations were of course limited.
Bill
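
As a rough sanity check on those figures, here's a back-of-envelope
calculation (the assumptions are mine, not Bill's: about half a litre for
two cups, a 15 degC start, and no heat losses):

# Rough boil-time estimate: a 12 V / 30 A kettle vs a 3 kW mains kettle.
# Assumed figures (not from the thread): 0.5 litre of water, starting at
# 15 degC, with every watt going into the water (no losses).

volume_l = 0.5                # about two small cups; 1 litre of water ~= 1 kg
specific_heat = 4186          # J per kg per degC for water
delta_t = 100 - 15            # temperature rise to boiling, degC
energy_j = volume_l * specific_heat * delta_t   # roughly 178 kJ

for name, power_w in [("12 V x 30 A", 12 * 30), ("3 kW mains", 3000)]:
    minutes = energy_j / power_w / 60
    print(f"{name}: {power_w} W, about {minutes:.1f} minutes to boil")

# Comes out near 8 minutes for the 360 W kettle and under a minute for
# 3 kW; real-world losses stretch the 12 V figure further still.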

February 7th 12, 02:52 PM
posted to sci.electronics.repair,uk.rec.audio,uk.tech.broadcast
Audio Precision System One Dual Domain Measurement Systems
"Arny Krueger" wrote in message
...
"William Sommerwerck" wrote in message
...
BTW AM sound was always used with +ve vision modulation. I'm not sure that
there was a killer reason why FM could not have been used with +ve vision
modulation, but intercarrier reception (the cheap'n'easy way to receive FM
sound with TV) wouldn't work with +ve vision modulation unless there was
significant carrier amplitude remaining at the sync tips. Normally with +ve
vision modulation the carrier amplitude at sync tips was nominally zero.
Early US TV sets used separate video and audio IFs -- intercarrier had not
been thought of at that point.
My understanding is that "inverted" polarity was used to minimize the
effects of noise bursts on the sync pulses.
That's a good part of it. The net purpose of inverted polarity was to
improve subjective dynamic range. White flecks on a grey background are
far less obvious than black ones.
Umm..No. You've both got it the wrong way round. With -ve polarity sync
pulses are more affected by noise bursts than with +ve polarity. And white
flecks are far more obvious than black. Part of the reason is that impulse
interference could greatly exceed the 100% vision carrier level, saturating
the video amplifier and, with +ve modulation, the CRT.
This was why US TVs, where -ve modulation was used from the beginning,
employed flywheel sync very early on, whilst UK TVs didn't. On the other
hand UK TVs needed peak-white limiters to prevent the CRT defocusing on to
the "whiter-than-white" interference specs.
The real benefit of -ve modulation was AGC. With -ve modulation sync tips
correspond to 100% modulation and make an easy source for the AGC bias. With
+ve modulation sync tips are at zero carrier which obviously is useless for
AGC. Instead the back-porch has to be used and many different weird and
wonderful circuits were devised to "gate out" the signal voltage during the
back porch. Due to the need to keep costs down manufacturers increasingly
turned to "mean-level AGC" in which the video signal itself was simply
low-pass filtered to form the AGC bias. This led to receiver gain being
varied by the video content, so the black on low-key scenes was boosted
whilst the whites in high-key scenes were reduced leading to a general
greyness to everything. To me it looked awful but as the Great British
Public kept buying these sets (and they were cheaper to build) mean-level
AGC became the norm for B&W UK domestic TV receivers. One great advantage of
colour was that mean-level AGC could not be used: to give correct colour
values, colour sets *had* to display a picture with a stable black-level.
David.
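
A toy numerical sketch of that point (my own construction, not a description
of any real receiver; the levels are only roughly System I proportions) shows
why a sync-tip peak detector gives a stable AGC bias while a mean-level
detector wanders with picture content:

# Toy comparison of sync-tip AGC vs mean-level AGC with -ve modulation.
# Arbitrary units: sync tip = 1.0 (100 % carrier), black = 0.76,
# peak white = 0.20 (assumed figures, roughly System I proportions).

import numpy as np

def make_line(picture_level):
    """One crude 100-sample video line: 10 samples of sync tip, 10 of back
    porch (black), then 80 samples of flat picture at the given level."""
    sync, black = 1.0, 0.76
    return np.concatenate([np.full(10, sync),
                           np.full(10, black),
                           np.full(80, picture_level)])

dark_scene = make_line(0.70)     # low-key picture, sits close to black
bright_scene = make_line(0.25)   # high-key picture, close to peak white

for name, line in [("dark scene", dark_scene), ("bright scene", bright_scene)]:
    sync_tip_agc = line.max()     # a peak detector riding the sync tips
    mean_level_agc = line.mean()  # a simple low-pass of the whole signal
    print(f"{name}: sync-tip AGC = {sync_tip_agc:.2f}, "
          f"mean-level AGC = {mean_level_agc:.2f}")

# Sync-tip AGC reads 1.00 for both scenes, so receiver gain stays put.
# Mean-level AGC reads about 0.74 on the dark scene and 0.38 on the bright
# one, so the gain chases the picture content: the "general greyness"
# described above.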

February 7th 12, 03:03 PM
posted to uk.rec.audio,uk.tech.broadcast
Audio Precision System One Dual Domain Measurement Systems
charles wrote:
In article , David Looser
wrote:
[Snip]
The real benefit of -ve modulation was AGC. With -ve modulation sync tips
correspond to 100% modulation and make an easy source for the AGC bias.
With +ve modulation sync tips are at zero carrier which obviously is
useless for AGC. Instead the back-porch has to be used and many
different weird and wonderful circuits were devised to "gate out" the
signal voltage during the back porch. Due to the need to keep costs down
manufacturers increasingly turned to "mean-level AGC" in which the video
signal itself was simply low-pass filtered to form the AGC bias. This
led to receiver gain being varied by the video content, so the black on
low-key scenes was boosted whilst the whites in high-key scenes were
reduced leading to a general greyness to everything. To me it looked
awful but as the Great British Public kept buying these sets (and they
were cheaper to build) mean-level AGC became the norm for B&W UK
domestic TV receivers.
I remember hunting at the "Radio Show" in 1964 for a dual standard set that
had a proper black level clamp. I succeeded, but with difficulty.
I seem to remember there was an upmarket dual-standard Murphy with the
channel buttons on the top surface that had it. Those sets were
available in a variety of cabinet colours, pink, mauve, orange, etc,
such was the gaiety of the times.
When you changed channel on those sets there was a black screen
initially, then the picture sort of faded in.
The sets had better than average sound as well.
Bill

February 7th 12, 04:15 PM
posted to sci.electronics.repair,uk.rec.audio,uk.tech.broadcast
Audio Precision System One Dual Domain Measurement Systems
In message , David Looser
writes
"Arny Krueger" wrote in message
m...
"William Sommerwerck" wrote in message
...
BTW AM sound was always used with +ve vision modulation. I'm not sure
that
there was a killer reason why FM could not have been used with +ve
vision
modulation, but intercarrier reception (the cheap'n'easy way to receive
FM
sound with TV) wouldn't work with +ve vision modulation unless there
was
significant carrier amplitude remaining at the sync tips. Normally with
+ve
vision modulation the carrier amplitude at sync tips was nominally
zero.
Early US TV sets used separate video and audio IFs -- intercarrier had
not
been thought of at that point.
My understanding is that "inverted" polarity was used to minimize the
effects of noise bursts on the sync pulses.
That's a good part of it. The net purpose of inverted polarity was to
improve subjective dynamic range. White flecks on a grey background are
far less obvious than black ones.
Umm..No. You've both got it the wrong way round. With -ve polarity sync
pulses are more affected by noise bursts than with +ve polarity. And white
flecks are far more obvious than black. Part of the reason is that impulse
interference could greatly exceed the 100% vision carrier level, saturating
the video amplifier and, with +ve modulation, the CRT.
This was why US TVs, where -ve modulation was used from the beginning,
employed flywheel sync very early on, whilst UK TVs didn't. On the other
hand UK TVs needed peak-white limiters to prevent the CRT defocusing on to
the "whiter-than-white" interference specs.
The real benefit of -ve modulation was AGC. With -ve modulation sync tips
correspond to 100% modulation and make an easy source for the AGC bias. With
+ve modulation sync tips are at zero carrier which obviously is useless for
AGC. Instead the back-porch has to be used and many different weird and
wonderful circuits were devised to "gate out" the signal voltage during the
back porch. Due to the need to keep costs down manufacturers increasingly
turned to "mean-level AGC" in which the video signal itself was simply
low-pass filtered to form the AGC bias. This led to receiver gain being
varied by the video content, so the black on low-key scenes was boosted
whilst the whites in high-key scenes were reduced leading to a general
greyness to everything. To me it looked awful but as the Great British
Public kept buying these sets (and they were cheaper to build) mean-level
AGC became the norm for B&W UK domestic TV receivers. One great advantage of
colour was that mean-level AGC could not be used: to give correct colour
values, colour sets *had* to display a picture with a stable black-level.
Even with negative video modulation, it didn't seem to take the
Americans long to realise that they could cut costs by using AC coupling
in the video amplifier between the video detector and the CRT. [I've got
some old US monochrome TV circuits which definitely show AC coupling.]
As a result, the benefits of having an AGC line which didn't vary (much)
with video content would be essentially lost.
Regarding using the back porch as the signal reference, and deriving the
AGC from it, I recall a Wireless World article in around 1967,
describing a simple add-on circuit (which I made) which partly did this.
It worked both on 405 and 625-line signals. It wasn't intended to
improve the horrible mean-level AGC but, at the start of each video
line, it did clamp the video drive (to the cathode of the CRT) to the
black reference of the back porch. As a result, you still got the
contrast varying with video content (maybe not so much on 625), but at
least the black stayed (more-or-less) black.
--
Ian
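
For anyone curious, the idea behind that add-on can be sketched numerically
like this (a toy model only, with made-up levels and line timing, and my own
choice of black reference; it is not the Wireless World circuit itself):

# Line-by-line back-porch clamping (DC restoration) on AC-coupled video.
# Assumed sample layout per line: 10 samples sync, 10 samples back porch,
# 80 samples picture; levels are arbitrary illustrative units.

import numpy as np

BLACK_REF = 0.3   # the black level we want to pin the CRT drive to

def clamp_line(line, porch_start=10, porch_end=20, black_ref=BLACK_REF):
    """Shift the whole line so its back-porch average sits at black_ref."""
    porch_level = line[porch_start:porch_end].mean()
    return line + (black_ref - porch_level)

# An AC-coupled line arrives with an arbitrary DC offset that depends on
# recent picture content; the clamp removes that offset once per line.
rng = np.random.default_rng(0)
for drift in (-0.2, 0.0, 0.15):
    line = np.concatenate([np.full(10, 0.0),    # sync
                           np.full(10, 0.3),    # back porch (true black)
                           rng.uniform(0.3, 1.0, 80)]) + drift
    clamped = clamp_line(line)
    print(f"drift {drift:+.2f}: porch before = {line[10:20].mean():.2f}, "
          f"after = {clamped[10:20].mean():.2f}")

# Whatever DC shift the AC coupling introduces, the back porch (and hence
# black) ends up at 0.30 on every line; contrast still tracks the AGC, but
# black stays black, which is the improvement described above.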

February 7th 12, 04:17 PM
posted to uk.rec.audio,uk.tech.broadcast
Audio Precision System One Dual Domain Measurement Systems
In message , Bill Wright
writes
I seem to remember there was an upmarket dual-standard Murphy with the
channel buttons on the top surface that had it. Those sets were
available in a variety of cabinet colours, pink, mauve, orange, etc,
such was the gaiety of the times.
I remember them; I'd never seen anything like them before or since.
--
Clive

February 7th 12, 04:17 PM
posted to uk.rec.audio,uk.tech.broadcast
Audio Precision System One Dual Domain Measurement Systems
In article , Bill Wright
wrote:
charles wrote:
In article , David Looser
wrote:
[Snip]
I remember hunting at the "Radio Show" in 1964 for a dual standard set
that had a proper black level clamp. I succeeded, but with difficulty.
I seem to remember there was an upmarket dual-standard Murphy with the
channel buttons on the top surface that had it. Those sets were
available in a variety of cabinet colours, pink, mauve, orange, etc,
such was the gaiety of the times.
When you changed channel on those sets there was a black screen
initially, then the picture sort of faded in.
The sets had better than average sound as well.
I think that I bought an Ekco: 4 preset buttons for VHF and another 4 for
UHF, with a change-over switch between them.
--
From KT24
Using a RISC OS computer running v5.16

February 7th 12, 04:36 PM
posted to sci.electronics.repair,uk.rec.audio,uk.tech.broadcast
Audio Precision System One Dual Domain Measurement Systems
In message ,
Terry Casey writes
In article ,
says...
On 2/6/12 1:16 PM, David Looser wrote:
The original plan, drawn up in the early '60s, was to re-engineer Bands 1
and 3 for 625-line operation once the 405-line service was switched off; but
it never happened. I guess that the powers that be thought that the spectrum
could be more usefully used for other purposes.
Of course it could, but harmonizing spectrum with the continent might
have been beneficial as well. Have these plans been published?
I don't see how we could harmonize system I channels with the French 819
line channels!
Other western European countries[1] used system B in a 7MHz channel
width and system G in an 8MHz channel at UHF.
To use the same channels we would have needed to devise a system X with
a truncated vestigial side-band to fit our 6MHz sound-vision spacing
into 7MHz - in reality, I don't think it would have fitted!
Of course, both the British and the Irish could have simply adopted the
European systems B and G (5.5MHz sound-vision - plus the horrendous
group delay pre-correction curve). If I remember correctly, the only
difference between systems B and G is the 7 vs 8 MHz channel spacing.
Even the VSBs are the same (0.75MHz).
In practice, if we had decided to carry on using VHF for 625 line
broadcasting, I think we would have harmonised with the Irish 8MHz
channel plan - not least because of the proximity of NI transmitters to
those in the republic.
Again, IIRC, the RoI VHF 625-line channels were the same frequencies as
the 'lettered' 625-line channels already used on many VHF cable TV
systems.
[1] Belgium also had its own variant of the French 819 line system
crammed into a standard 7MHz channel - it must have looked truly
appalling in comparison to 625!
I think that these had gone well before I got involved!
--
Ian
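
The arithmetic behind Terry's "I don't think it would have fitted" can be
laid out quickly. These are the nominal figures as I understand them, plus a
guard allowance of my own choosing, so treat it as a sanity check rather
than a planning document:

# Why System I's 6 MHz sound-vision spacing won't squeeze into 7 MHz.

systems = {
    # name: (vestigial sideband MHz, sound-vision spacing MHz)
    "B/G (continental 625)": (0.75, 5.5),
    "I (UK/Ireland 625)":    (1.25, 6.0),
}

GUARD = 0.5   # rough allowance above the sound carrier for the FM sound
              # signal and the gap to the next channel (my assumption)

for name, (vsb, spacing) in systems.items():
    needed = vsb + spacing + GUARD
    print(f"System {name}: needs roughly {needed:.2f} MHz")

# System B/G comes out around 6.75 MHz, so it fits a 7 MHz VHF channel;
# System I needs roughly 7.75 MHz, which is why it lives in 8 MHz channels
# and why a hypothetical "System X" squeezed into 7 MHz would have meant
# truncating the vestigial sideband, as Terry says.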

February 7th 12, 04:40 PM
posted to uk.rec.audio,uk.tech.broadcast
Audio Precision System One Dual Domain Measurement Systems
In message , Arny Krueger
writes
"Ian Jackson" wrote in message
news
In message , Arny Krueger
writes
"Dave Plowman (News)" wrote in message
...
In article ,
Bill Wright wrote:
I used to have a 12V DC kettle and it managed to boil two cups of water
quite quickly. It pulled about 30A!
With only 360 watts, 'quite quickly' wouldn't be how I'd describe it. A 3kW
kettle (rare these days) does that quickly.
How long before electric kettles have switchmode power supplies?
Lots of options including charging up a local cap or battery and using it to
speed the boiling process well beyond what can be done with just line power.
Just think of the amount of RF hash that could be created by a cheap,
badly-designed 3kW switchmode power supply!
OTOH, well-designed ones are becoming very common.
The existing rubbish generated by PLT, plasma TVs, arcing thermostats etc
would pale into insignificance.
I have two power amps with ca. 2 kW switchmode power supplies. They
manage -100 dB total spurious responses, with their switchmode parts only
inches from the input terminals.
But I bet they cost a bit more than your average £15/£25 domestic
kettle!
--
Ian
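
Out of curiosity, the size of "local cap" Arny's idea implies is easy to
estimate. All the figures below are my own assumptions for illustration:
half a litre of water, a 15 degC start, and a 400 V storage bus:

# How much capacitance would store the energy for one quick boil?

volume_l = 0.5                       # half a litre of water
energy_j = volume_l * 4186 * 85      # ~178 kJ, 15 degC to boiling, no losses

bus_voltage = 400.0                  # an assumed DC-link sort of voltage
# E = 0.5 * C * V^2, so C = 2E / V^2
capacitance_f = 2 * energy_j / bus_voltage**2

print(f"Energy for one boil: {energy_j / 1000:.0f} kJ")
print(f"Capacitance needed at {bus_voltage:.0f} V: {capacitance_f:.1f} F")

# Roughly 2 F at 400 V: supercapacitor territory rather than anything you'd
# find inside a kettle, which is why a battery (or simply waiting) looks the
# more plausible way to get beyond what a 13 A socket can supply.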