I am in the process of repairing a very early HP 3456A meter, and I have one last issue before I can call it more or less completely fixed. (Well OK, maybe I still need to clean the front-panel switches and lights as well.)
It gives a reading, but with an offset that after some time settles at around 7 uV. Needless to say, it also fails self-test 4 because of this. I did troubleshoot a little, and using a working 3456A at the local hackerspace I verified that the offset is already present at TP104. That point is the output of the front-end gate-bias stage, which means the A/D itself is in good shape. The problem may therefore be the gate-bias circuit itself, but I rather suspect something broken in the auto-zero circuit or in the other front-end switching/relays. If I manually enable auto-zero after a cold start, the reading jumps around erratically every few seconds before stabilizing at the 7 uV offset.
First, a summary of the two things I have already fixed:
- The old MOSTEK mask ROMs had developed bit-rot. This was fixed by writing the latest firmware version to a set of new EPROMs, using three MCM68766s to keep things looking tidy.
- A broken LM311 comparator in the overload-detect circuit.
The problem is that I'm having trouble deducing exactly which part of the front end is to blame. I suspect something is only marginally working, which makes it hard to debug further. Not having a clear picture of how the auto-zero is supposed to work (I'm more into digital than analog) doesn't help either.
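For what it's worth, my current mental model of auto-zero is roughly the following. This is purely an illustrative Python sketch of the general technique, not the 3456A's actual firmware or hardware, and the helper names (select_input, adc_read, measure_with_autozero) are all made up for the example:

```python
# Conceptual sketch of auto-zero: between signal conversions the
# front end is switched to an internal zero (input shorted), the
# residual offset is digitized, and that value is subtracted from
# the next signal conversion. All names/values here are invented.

OFFSET_UV = 7.0       # pretend the front end has a 7 uV offset
_selected = 'signal'  # which path the (imaginary) FET switches select

def select_input(which):
    """Switch the front end between 'signal' and the internal 'zero'."""
    global _selected
    _selected = which

def adc_read(signal_uv):
    """Digitize: the front-end offset rides on top of whatever is selected."""
    return OFFSET_UV if _selected == 'zero' else signal_uv + OFFSET_UV

def measure_with_autozero(signal_uv):
    select_input('zero')
    offset = adc_read(signal_uv)   # reads only the offset
    select_input('signal')
    raw = adc_read(signal_uv)      # reads signal + the same offset
    return raw - offset            # offset cancels if switching works

print(measure_with_autozero(100.0))  # -> 100.0, offset cancelled
```

If that model is right, then a stuck switch or leaky relay in the zero path would make the subtracted offset wrong or noisy, which would show up as exactly the kind of drifting, jumpy residual I'm seeing. Corrections welcome if I've misunderstood how the 3456A actually does it.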
So far I've gotten a suggestion to probe around with freeze spray, but are there any basic measurements I can do before resorting to that?