Sorry for not posting in a while. Exams...
I sometimes solder in IC socket pins where I expect to have to change component values later.
I get them by chopping up machined-pin IC sockets, or they can be bought in SIL form.
I tried a couple of values and ended up using 320 kΩ, mainly because I didn't have an easy way (no more than two resistors) to make 400 kΩ.
The good SIL socket strips that are 0.1 inch on center have an insert that effectively grabs the component lead and makes a very good contact. They are made so you can easily snip off the number of pins you need.
I have used those before. They are really nice, although a bit more expensive than the standard ones (which I have a lot of), where you have to sacrifice one pin to cut them to length.
I have worked a bit more on the code and updated the calibration constants, so it is now reasonably accurate in both setting and measuring voltage and current. Everything works as expected and I find the interface easy to use. Constant current mode is indicated by all decimal points lighting up on the current display; I saw this on a commercial power supply posted somewhere on this forum. See the first two images.
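For anyone curious, the display side of the CC indicator is trivial. Something along these lines (the digit count, segment buffer layout and DP bit are just assumptions for illustration, not my exact code):

```c
#include <stdbool.h>
#include <stdint.h>

#define CURR_DIGITS 4        /* assumed: 4-digit current display */
#define DP_BIT      0x80u    /* assumed: bit 7 of a segment byte drives the decimal point */

static uint8_t curr_segments[CURR_DIGITS];   /* per-digit segment pattern, refreshed by the mux routine */

/* Light every decimal point on the current display while in CC mode,
 * otherwise only the one that belongs at the normal position. */
void current_display_update(bool cc_active, uint8_t normal_dp_digit)
{
    for (uint8_t i = 0; i < CURR_DIGITS; i++) {
        if (cc_active || i == normal_dp_digit)
            curr_segments[i] |= DP_BIT;
        else
            curr_segments[i] &= (uint8_t)~DP_BIT;
    }
}
```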
So here is the problem:
While I was working on the constant current mode detection, which is done with a 1/2 voltage divider on the CC op-amp's output, I noticed the decimal points flashing. Attached is a scope screenshot of the output of that divider going to the microcontroller. That is with no load at all and the max current set to 100 mA. The "oscillation" goes away when I set the max current to around 300 mA, and it starts again as I go back down to 30 mA or so. It seems to work fine when drawing some current through a resistor, but sometimes it oscillates then as well. I would expect this right on the edge of current limiting, but not when drawing 0-30 mA with the current limit set to 100 mA.
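For reference, the firmware side of the detection is nothing more than reading that divider with the ADC and comparing it against a threshold, roughly like this (placeholder function name and illustrative thresholds, not the exact code):

```c
#include <stdbool.h>
#include <stdint.h>

extern uint16_t adc_read_cc(void);   /* placeholder for the real ADC read of the CC divider, 0..1023 assumed */

#define CC_ON_THRESHOLD   600u       /* illustrative: divider output high enough -> op-amp is limiting */
#define CC_OFF_THRESHOLD  400u       /* lower release point gives some hysteresis against chatter */

bool cc_mode_active(void)
{
    static bool cc_active = false;
    uint16_t raw = adc_read_cc();

    if (raw > CC_ON_THRESHOLD)
        cc_active = true;
    else if (raw < CC_OFF_THRESHOLD)
        cc_active = false;           /* between the two thresholds: keep the previous state */

    return cc_active;
}
```

Hysteresis like this would only steady the indicator cosmetically; the divider output is genuinely oscillating on the scope, which is the part I don't understand.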
I went and measured the current sense amplifier's output under the same conditions as the first measurement; that is the second scope screenshot. Any ideas why this happens?
Juan.