by EWBrown » Sun Feb 01, 2009 3:11 pm
Those "oddball" 15.6 and 11.2 ohm resistors in the original Dynacos were chosen because the correct bias setting would produce a reading of 1.56 volts, which was exactly what a fresh carbon-zinc D cell would produce.
In both the ST70 and the MKIII, the output tubes were connected in pairs to a shared cathode measuring resistor.
So in the ST70, with its EL34s biased at 50 mA each, the pair drew 100 mA total, which produced a 1.56 VDC reading across the 15.6 ohm resistor. This mandated a matched pair, which was supplied with the kit.
Similarly, the MKIII had a single 11.2 ohm resistor; once again the bias was set for a 1.56 V reading, which equated to about 140 mA total, or 70 mA per output tube.
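The arithmetic behind both resistor choices is just Ohm's law (V = I × R); here's a quick sketch using the resistor and current values from the post above:

```python
# Ohm's law check of the original Dynaco sense-resistor values.
# The bias point is correct when the DVM (or D-cell-calibrated meter)
# reads about 1.56 V across the shared cathode resistor.

def sense_voltage(total_current_a, resistor_ohms):
    """Voltage across the shared cathode measuring resistor."""
    return total_current_a * resistor_ohms

# ST70: two EL34s at 50 mA each through the 15.6 ohm resistor.
st70 = sense_voltage(2 * 0.050, 15.6)   # exactly 1.56 V

# MKIII: two output tubes at ~70 mA each through the 11.2 ohm resistor.
# (1.56 V / 11.2 ohms is actually ~69.6 mA per tube, so "70 mA" is rounded.)
mkiii = sense_voltage(2 * 0.070, 11.2)  # ~1.57 V

print(round(st70, 2), round(mkiii, 2))
```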
This eliminated the need for an expensive precision voltmeter: just measure the battery, make a temporary mark on the meter, and then set the bias to match the mark.
This was about 2-3 decades before digital voltmeters were available (except as very expensive laboratory equipment).
Today, when decent DVMs are common and cheap, it is easier to replace these "weird" resistors with 10 ohm, 1% resistors and read the idle current directly: 100 mA reads as 1.00 VDC, and 140 mA reads as 1.40 VDC.
/ed B
Real Radios Glow in the Dark