I have a problem with the voltage and current display board in a Circuit Specialists switch-mode bench power supply, model CSI3003SM. I cannot get the zero offset of the voltage readout set correctly, and I think the problem is a bug in the firmware of the PIC16F676 that appears to run the display. I know this sounds pretty obscure. Has anyone else seen the same problem? Or even spent any time adjusting the meter readouts and figured out any more than I have about the operation of the pushbutton on the back of the meter board?
First, some background: The CSI3003SM is an inexpensive Chinese-made 30 V 3 A switching power supply. Based on a label on one of the circuit boards, it appears to actually be made by a company named QJE. Judging by outward appearance, the same supply also seems to be sold as the QJE PS3003, the MPJA 9616PS, and under other names.
I bought it used. Sometimes it seemed to work OK, but sometimes there was a problem with setting the maximum current, and sometimes there was no output at all (and the voltage- and current-setting pots had no effect). After some exploration, I discovered a cold solder joint on one end of a wire jumper on the potentiometer board, in the current-setting circuit. Since I fixed that, the voltage- and current-setting pots have operated normally.
Then there was a problem with the voltage/current display: the current display read 0.38 A when there was no current output, and the voltmeter read too low, particularly at low voltage. So I tried to recalibrate the meters. I expected to see one offset adjustment pot (to set zero) and one gain adjustment pot for each of the two channels (voltage and current). Instead, there were two pots and a pushbutton as the only apparent inputs on the meter board. (I have attached a photo of the rear of the display board).
A bit of experimentation showed that the two pots are analog gain controls for the voltage and current readouts. RP1, on the right in the picture, adjusts the current measurement scale factor. RP2, on the left, adjusts the voltage measurement scale factor. But how do I calibrate the zero points of the readouts? The pushbutton is the only remaining control. (One of the four legs of the pushbutton had not been soldered; I fixed that.)
I believe that the meter board is basically software driven. The 14-pin chip at the left side of the display board is a PIC16F676 microcontroller. The 16F676 contains a multichannel 10-bit A/D converter, which has more than enough resolution to drive the voltage and current displays, which only need to read 0-300 counts or a bit more (0-500 in the 5 A model). If you're already using a PIC to read the voltages, it makes sense to remove any DC offset by digitally subtracting offsets measured in-circuit by the PIC itself, rather than adding two more pots and several more resistors to null the offset in the analog circuitry.
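If it works that way, the measurement path would only need something like this (a rough sketch in plain C, runnable on a PC; the names and numbers are my own guesses, not anything recovered from the actual firmware):

    #include <stdio.h>
    #include <stdint.h>

    /* Display counts = raw 10-bit ADC counts minus a stored zero offset,
       clamped so the display never goes negative. */
    static uint16_t apply_offset(uint16_t raw_counts, uint16_t offset)
    {
        return (raw_counts > offset) ? (uint16_t)(raw_counts - offset) : 0;
    }

    int main(void)
    {
        uint16_t volt_offset = 12;   /* imagined offset loaded from EEPROM */
        uint16_t raw_counts = 312;   /* imagined raw ADC reading           */

        printf("displayed counts: %u\n", apply_offset(raw_counts, volt_offset));
        return 0;
    }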
So I guessed that you would set the power supply for zero voltage and current output, press the button, and that would tell the PIC to read the two channels (perhaps reading several times and averaging). Then it would store the two offset values in the on-board EEPROM. From then on, every time voltage and current were measured, the two digital offsets would be subtracted, thus setting the zero points of the meters correctly.
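If that guess is right, the button handler would be doing something like this (again only a sketch in plain C with the hardware access faked; average_samples() and the sample values are mine, not from the real firmware):

    #include <stdio.h>
    #include <stdint.h>

    /* Average a few ADC readings to get a stable zero offset. */
    static uint16_t average_samples(const uint16_t *s, int n)
    {
        uint32_t sum = 0;
        for (int i = 0; i < n; i++)
            sum += s[i];
        return (uint16_t)(sum / n);
    }

    int main(void)
    {
        /* Pretend these are readings taken with the output set to zero. */
        uint16_t volt_samples[4] = { 0, 1, 0, 1 };
        uint16_t amp_samples[4]  = { 38, 37, 38, 39 };

        uint16_t volt_offset = average_samples(volt_samples, 4);
        uint16_t amp_offset  = average_samples(amp_samples, 4);

        /* The real firmware would presumably write these to its EEPROM here,
           then subtract them from every subsequent reading. */
        printf("stored offsets: V=%u counts, A=%u counts\n", volt_offset, amp_offset);
        return 0;
    }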
I tried it. A quick press on the button did nothing. Holding down the button for several seconds did produce a result - the current display instantly changed to 0.00. And ever since then, the current zero point seems to be more or less correct.
But voltage is a different story. The first time I tried this, I hadn't thought about the need for the quantities being measured to actually be zero, so the output voltage was about 1.2 V at the time I pressed the button. It suddenly changed to 00.0, so it started subtracting 1.2 V from all of the voltage readings before display. And using the pushbutton, with the output voltage set to zero, does not change the 1.2 V offset. On the other hand, I can still increase the offset! If I adjust the supply output to read 0.5 V on the meter, with the 1.2 V offset still active, the actual output voltage is 1.7 V on an external meter. Using the zero-set pushbutton returns the voltage display to zero, but from this point on the offset is 1.7 V. Now the meter on the power supply reads 1.7 V below the actual voltage.
I think this must be a firmware bug. The current zero correction can apparently be set on demand, but the voltage correction can only ever be increased, not decreased. This is the sort of thing that might never be noticed in production if the meter zero is set only once during manufacturing: the saved offsets in EEPROM are presumably zero when the PIC is first programmed, and the first "zero set" operation sets them to the correct values. It is quite possible that nobody except me has ever pressed the button a second time with the output voltage above zero, so nobody else has ever hit the problem. But now I have a voltmeter that reads about 2 V too low on a power supply that otherwise works fine. Which annoys me.
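Pure speculation, but the behavior I'm seeing is exactly what you would get if the voltage branch of the zero-set routine added the currently displayed value onto the stored offset instead of replacing the offset with the raw reading. Here is a little replay of my button presses in the same plain-C sketch style as above (all names and numbers are mine):

    #include <stdio.h>
    #include <stdint.h>

    /* Displayed counts = raw counts minus stored offset, clamped at zero. */
    static uint16_t displayed(uint16_t raw, uint16_t offset)
    {
        return (raw > offset) ? (uint16_t)(raw - offset) : 0;
    }

    int main(void)
    {
        uint16_t offset = 0;

        /* First press: output actually at 1.2 V, i.e. about 12 raw counts. */
        uint16_t raw = 12;
        offset += displayed(raw, offset);   /* suspected bug: accumulate     */
        printf("offset after 1st press: %u\n", offset);   /* 12, display zeroed */

        /* Second press: meter shows 0.5 V (5 counts), so raw is really 17.  */
        raw = 17;
        offset += displayed(raw, offset);   /* grows to 17, never shrinks    */
        printf("offset after 2nd press: %u\n", offset);

        /* Pressing with the output truly at zero adds nothing, which matches
           what I see.  A correct routine would presumably just do
           offset = raw;  which is apparently what the current channel does. */
        return 0;
    }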
So: Has anyone else encountered this and figured out a workaround? Is there a magic pushbutton sequence which clears the offset values to let me start from scratch?
The only thing I can think of trying at this point involves the unused 5-pin connector in the lower-left corner of the board, which might be an in-circuit programming connector for the PIC. If so, I might be able to dig out my old PIC programmer and see whether I can read the flash (code) or EEPROM (data) memory on the PIC. Then I might be able to reset the offset values directly, or, if I can read the code memory, disassemble it and fix the bug. (All of which seems like a lot of work for what probably isn't a very wonderful power supply.)