I'm wondering if you can help me understand how lab power supplies treat sinking current.
Some background: I do a lot of lithium-ion battery testing. When I'm done with a battery, I want to set it to 3.6-3.7V for storage. As I'm sure you're aware, fully charged Li-ion batteries sit at 4.2V and fully discharged ones at 2.5-3V (they don't like being at either extreme for long, which is why I want to park them at ~3.6V). Sometimes I charge a battery before deciding I'm actually done with it, and sometimes I don't.
So I was trying to think of a simple way to do this, ideally with one device that goes both ways. My idea: get a few battery holders and wire each positive contact through a series power resistor (say around 1 ohm) to the positive of a power supply, wire the negative of the power supply to the negative of the battery, and set the power supply to 3.65V.
Then:
Scenario A) If the battery is discharged, it will sit at 2.6-3V while the supply is at 3.65V, so the supply will charge the battery. With the 1 ohm resistor inline, that means roughly 1A at first, dwindling as the battery voltage asymptotically approaches 3.65V. This part seems clear-cut to me and should work fine up to the max current of the supply (5A in this case). I'm sure there are more elegant ways to do it, but this should work.
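To sanity-check the Scenario A numbers, here's a quick back-of-the-envelope sketch (Python; just the Ohm's-law arithmetic from above, and the battery voltage points are illustrative):

```python
# Charging current into the battery through the 1-ohm series resistor:
# I = (V_supply - V_batt) / R. Current tapers off as V_batt rises toward 3.65 V.

V_SUPPLY = 3.65  # supply setpoint, volts
R_SERIES = 1.0   # series power resistor, ohms

for v_batt in (2.6, 3.0, 3.3, 3.5, 3.6):
    i = (V_SUPPLY - v_batt) / R_SERIES  # charging current, amps
    p_res = i ** 2 * R_SERIES           # heat dissipated in the resistor, watts
    print(f"V_batt = {v_batt:.2f} V -> I = {i:.2f} A, P_resistor = {p_res:.2f} W")
```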
Scenario B) If the battery is charged, it will be at 4.2V with the supply at 3.65V, so current flows the other way: the battery pushes roughly (4.2V - 3.65V)/1 ohm = 0.55A back into the supply's output. Is this a problem? Is there a limit to how much current the supply can sink, from the supply's perspective? Any input on what that limit probably is?
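Same arithmetic for Scenario B, just with the sign flipped (again only a sketch; whether the KD3005D can actually sink this current is exactly what I'm asking):

```python
# Reverse case: a full battery at 4.2 V drives current back into the 3.65 V
# supply through the same 1-ohm resistor. Negative sign = supply sinking.

V_SUPPLY = 3.65     # supply setpoint, volts
R_SERIES = 1.0      # series power resistor, ohms
V_BATT_FULL = 4.2   # fully charged Li-ion cell, volts

i = (V_SUPPLY - V_BATT_FULL) / R_SERIES  # -0.55 A: battery sources, supply sinks
print(f"Supply would have to sink about {abs(i):.2f} A at first")
```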
The power supply I'm using is this one:
http://www.sra-solder.com/korad-kd3005d-precision-variable-adjustable-30v-5a-dc-linear-power-supply-digital-regulated-lab-grade

Any thoughts would be appreciated.