Seems like a simple and reliable way to measure it. Wonder why most amplifier companies don't do this.

The amp design uses 1 ohm high-precision resistors in series between the power tube cathodes and ground. By Ohm's Law, E (voltage) = I (current) * R (resistance); therefore, thirty milliamps flowing through a 1 ohm resistor produces a 30 millivolt DC voltage drop, so the millivolt reading is the cathode current in milliamps. It's also safer than measuring current directly, since that would mean breaking the circuit and inserting the meter in series between the cathodes and ground.
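If it helps to see that arithmetic spelled out, here's a minimal sketch of the conversion; the function name and resistor constant are just illustrative, not from any amp's service docs:

```python
# Sketch of the cathode-resistor bias math described above.

SENSE_RESISTOR_OHMS = 1.0  # precision resistor between cathode and ground

def bias_current_ma(measured_mv: float, r_ohms: float = SENSE_RESISTOR_OHMS) -> float:
    """Ohm's Law: I = E / R. With a 1 ohm sense resistor, millivolts
    across it read directly as milliamps of cathode current."""
    return measured_mv / r_ohms

# Example: a 30 mV drop across the 1 ohm resistor means 30 mA of bias current.
print(bias_current_ma(30.0))  # -> 30.0
```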
For the most part, I could EQ/adjust the amp to compensate for anything I didn't like in a preamp tube, so I stuck with what sounded and felt best to me.
Exactly. The clincher for me was bumping the bias up to 35 mA. It gave the amp a better feel and more thickness.

If it sounds and feels good, what's not to like?