Author Topic: Power supply regulation issue  (Read 1818 times)


Offline jp430bb

  • Contributor
  • Posts: 24
  • Country: us
Power supply regulation issue
« on: September 23, 2012, 11:08:36 pm »
For my little home lab, I picked up from eBay a used Kepco ABC series 10V/10A programmable power supply.  It had been behaving itself very well up to now, with its output voltage display agreeing closely with my Fluke 77-IV meter on low-power circuits. 

I recently picked up a pair of 2 ohm, 100W resistors, and I've been experimenting with the power supply under some real load.  Now, if I have the two resistors connected in series to the PSU's outputs, and I ask it to supply 5V, it reads 5.000V and 1.128A, but my meter while reading the PSU's outputs is only showing 4.706V.  The ABC series is specified for 0.01% load effect on voltage, so this is way out of spec. 

I have the power supply's sense inputs strapped to its outputs. 

The manual describes how to calibrate voltage and current: first the open-circuit voltage is calibrated, and then the current through a 0.1 ohm 0.04% 50W resistor is calibrated.  My unit's voltage readout is fine open-circuit, so I'm not sure this procedure would address the problem I'm having.  I also don't know where to get such a fancy resistor for the current calibration. 
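The arithmetic behind that calibration step is just Ohm's law across the shunt. A minimal sketch (the DMM reading below is a made-up example, and the variable names are illustrative, not from the Kepco manual):

```python
# Hypothetical sketch of the current-calibration arithmetic: the true
# current is inferred from the voltage measured across a precisely
# known shunt resistor.
R_SHUNT = 0.1            # ohms: the 0.1 ohm, 0.04 % shunt the manual calls for
v_across_shunt = 0.5021  # volts: example DMM reading (made up)

true_current = v_across_shunt / R_SHUNT        # Ohm's law: I = V / R
print(f"True current: {true_current:.4f} A")

# The shunt's tolerance bounds how well the readout can be calibrated:
tol = 0.0004  # 0.04 %
print(f"Uncertainty: +/- {true_current * tol * 1000:.2f} mA")
```

This is why the spec on the resistor matters: the supply's ammeter can't end up more accurate than the shunt it was calibrated against.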

Is my power supply bad? 

Thanks,
John
 

Offline jp430bb

  • Contributor
  • Posts: 24
  • Country: us
Re: Power supply regulation issue
« Reply #1 on: September 28, 2012, 01:35:56 am »
Here's a followup.  I don't think there's anything wrong with my power supply.  It has rear screw terminals attached directly to the PCB, and there are nearby spade connectors where ~35cm wires connect and lead to the front outputs and sense connectors.  There are straps connecting the sense inputs to the outputs on the rear screw terminals, and there are also straps connecting inputs and outputs on the front panel binding posts.  There must be enough resistance in the spade terminals that a load connected to the front binding posts and drawing some current makes an appreciable voltage drop that the rear sense inputs don't see. 
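The numbers from the first post are consistent with that explanation. A quick back-of-the-envelope check, using the figures already given:

```python
# The drop the meter saw, divided by the load current, gives the stray
# resistance between the rear sense point and the front binding posts.
v_set = 5.000      # volts: programmed output, as read at the rear sense point
v_at_front = 4.706 # volts: Fluke reading at the front binding posts
i_load = 1.128     # amps: load current reported by the supply

drop = v_set - v_at_front   # volts lost in the internal wiring and spades
r_stray = drop / i_load     # implied stray resistance, roughly 0.26 ohm
print(f"Drop: {drop:.3f} V, stray resistance: {r_stray:.3f} ohm")
```

A quarter of an ohm is a lot for internal wiring, but entirely plausible for a few aged spade connectors in series.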

I think it'll be fine if I strap the sense inputs only where the load is connected (front or rear, but not both). 
 

Offline HackedFridgeMagnet

  • Super Contributor
  • ***
  • Posts: 1974
  • Country: au
Re: Power supply regulation issue
« Reply #2 on: September 28, 2012, 03:19:19 am »
I think it is a significant problem.  A 6% error means that, at best, the calibration needs doing, and at worst there is a problem in the calibration circuit.

This may be why it was sold on eBay; just speculating.

Anyway just knowing about the problem is the main point, as you can always measure directly, but it would be nice to fix it.
I wouldn't personally worry too much about the resistor tolerance: find 0.1 ohm of wire that is capable of 50W for a short time, measure its exact resistance, and use that.
Just make sure you don't leave it on too long. That should give you, say, 1% accuracy, which would be useful for most purposes.
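To put rough numbers on that suggestion (the 1% figure for measuring a 0.1 ohm wire with an ordinary DMM is an assumption, and in practice lead resistance makes it optimistic without a 4-wire measurement):

```python
# Rough error budget for an improvised shunt measured with a bench DMM.
# All figures are illustrative assumptions, not measured values.
r_nominal = 0.1   # ohms: target shunt value
meas_err = 0.01   # assume ~1 % worst-case error measuring 0.1 ohm
i_test = 10.0     # amps: full-scale calibration current for a 10 A supply

# Dissipation during calibration -- why a briefly-rated part can cope:
power = i_test ** 2 * r_nominal   # P = I^2 * R
# Current error contributed by the resistance uncertainty:
i_err = i_test * meas_err         # the 1 % carries straight through to current
print(f"Dissipation: {power:.0f} W, current error: +/- {i_err:.2f} A")
```

At 10 A the improvised shunt only dissipates about 10 W, so short bursts are fine; the accuracy is limited almost entirely by how well you can measure its resistance.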
« Last Edit: September 28, 2012, 11:58:16 pm by HackedFridgeMagnet »
 

Online AndyC_772

  • Super Contributor
  • ***
  • Posts: 3631
  • Country: gb
  • Professional design engineer
    • Cawte Engineering | Reliable Electronics
Re: Power supply regulation issue
« Reply #3 on: September 28, 2012, 06:21:28 am »
It sounds to me as though there's nothing wrong with the PSU at all; the error is just down to wiring resistance.

Put the PSU under load and measure the voltage across the sense inputs. If it reads accurately, the PSU is doing its job and doesn't need any repair or calibration.

For heavy loads, remote sensing is necessary if you want an accurate output voltage. Can you run a wire from the sense terminals directly to the load? That should compensate for the voltage drop in the wires and give an accurate voltage at the load where it's needed.
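A toy model makes the remote-sensing point concrete. Assuming an ideal regulator that drives its output until the voltage at the *sense* terminals equals the setpoint (a simplification of any real supply):

```python
# Toy model of why remote sensing helps, assuming an ideal regulator
# that servos its output until the sense voltage equals the setpoint.
def load_voltage(v_set, r_wire, r_load, sense_at_load):
    """Return the voltage actually seen by the load."""
    if sense_at_load:
        # Sense wires at the load: the regulator raises its output to
        # v_set + I * r_wire, so the load sees v_set exactly (ideal case).
        return v_set
    # Local sensing: the wire drop goes uncorrected.
    i = v_set / (r_wire + r_load)
    return i * r_load

# 5 V setpoint, ~0.26 ohm of wiring, 4 ohm load (two 2 ohm in series):
print(load_voltage(5.0, 0.26, 4.0, sense_at_load=False))  # ~4.695 V
print(load_voltage(5.0, 0.26, 4.0, sense_at_load=True))
```

The local-sense case lands within a few millivolts of the 4.706 V John measured, which supports the wiring-drop diagnosis.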

Offline HackedFridgeMagnet

  • Super Contributor
  • ***
  • Posts: 1974
  • Country: au
Re: Power supply regulation issue
« Reply #4 on: September 29, 2012, 12:26:13 am »
I don't know why you would want to use the rear terminals; you probably need to use the front terminals.
In my opinion, to make it useful, you should calibrate it to give the correct voltage at the front terminals.

From your figures, if I understand this correctly:

The cable loss seems to be 0.294 volts,
the current is 1.128 amps, and
the cable length is 0.7 metres (35 cm each way).

Therefore R = 0.294 / 1.128 = 0.261 ohms, if it is just the cable giving the loss.

The resistance per metre of this cable is 0.261 / 0.7 = 0.372 ohms/metre.

The corresponding wire gauge is less than 0.05 mm^2, or about AWG 30 (assuming I haven't stuffed up the calcs).

Which is extremely unlikely for a 10A supply.
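The steps above can be reproduced from the copper resistivity figure (the resistivity constant is standard for copper at room temperature; everything else comes from the thread):

```python
# Reproducing the wire-size estimate from copper resistivity.
RHO_CU = 0.0172  # ohm * mm^2 / m: resistivity of copper at ~20 C

drop, current, length = 0.294, 1.128, 0.7   # V, A, m (round trip)

r = drop / current      # total implied resistance, ~0.261 ohm
r_per_m = r / length    # ~0.372 ohm per metre
area = RHO_CU / r_per_m # cross-section that would have that resistance
print(f"R = {r:.3f} ohm, {r_per_m:.3f} ohm/m, area = {area:.3f} mm^2")
# ~0.05 mm^2 corresponds to roughly AWG 30 -- far too thin for a 10 A
# supply, so the loss is almost certainly in the connectors, not the wire.
```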

I maintain there is something wrong and to make your supply more usable I think it would be worth investigating further.

As an afterthought, could it be that there is an output protection diode in the circuit?




 

