Hi all
Firstly, I have a habit of writing at great length, so if the following is far too wordy for anyone, here's a quick summary: I'm trying to run a basic voltmeter from a 52V DC supply; the voltmeter draws 16-17mA and needs 4V-30V to power it. I've managed to power it using 6 resistors (enough that none exceeds 250mW), as shown in the simulator screenshot/link at the end of this post, but I'd appreciate any advice on whether this is the best way, and what other solutions might be possible.
I'm slowly learning about electronics, having started to get interested in the last couple of months. One of my first projects has been to get myself a decent bench power supply. Or rather, because I can't really afford a decent one, I'm putting one together out of cheaper components. Specifically, I have a Chinese programmable DC-DC buck converter, with constant current and voltage, LCD display, multiple memories, and serial RS232/TTL connectivity. It's quite a nice unit, able to provide 0-60V at 0-15A as long as I can feed it with a higher voltage. Therefore my second component is
an AC-DC converter, also cheap Chinese, rated at 48V/10A/500W, and adjustable from 39V - 57V. I plan to use this at 52V, feeding the buck converter and the combination therefore providing a programmable CC/CV power supply of 0-50V and 0-15A, or 0-10A at higher voltages. (I'm aware it'll be rather noisy, but I think for my beginner purposes it should be pretty good, and usefully flexible.)
The first step of my project is to do some upgrades on the
48V AC-DC converter. It's one of those cheap Chinese "LED/CCTV driver" units that has the AC input and DC output both connected via a single screw terminal block, positioning the 240V AC input just millimeters away from the much lower-voltage DC output. So I've removed that, replacing it with an IEC socket for AC input on one side of the unit and 4mm banana sockets for DC output on the opposite side. I've also added a power switch, beefed up the heatsinks on the transistors so I can fit a quieter fan, and am replacing the tiny adjustment potentiometer with a larger one with a proper knob. That's all going fine.
For the final step of the upgrade I thought I'd throw in a cheap LED voltage display, so I can see what voltage it is putting out without always needing a multimeter. This leads me - belatedly! - to my question.
I have a small, very cheap,
three-wire LED voltmeter, with red, black and yellow wires. The display can measure 0-100V, but needs between 4V and 30V to power it. So where the source is >30V, the yellow wire goes to the output being measured, while the red and black need to go to a lower-voltage supply that shares a ground with the circuit under measurement. This is where I run into a problem, because my AC-DC converter's output will always be between 39V and 57V.
So my question is: what method would you recommend to drop 52V DC to a maximum of 30V to power this voltmeter, which I've measured to draw 15-17mA (seemingly irrespective of the voltage it's running at)?
My first thought had been that I might be able to simply avoid the problem, because the AC-DC unit has a 12V socket for the fan. There's no temperature measurement, the fan is just always on. So I thought I could take my red and black wire and run them from the 12V fan output, and then put the yellow voltage measuring wire to the positive DC output of the AC-DC supply. Sadly, while this powers the voltmeter fine, it doesn't give a useful reading. I believe this is because there's no shared ground between the 12V fan socket and the main DC output of the device, and so I always get a reading of around 1V.
Next I looked into linear voltage regulators, but all the chips I could easily find - eg the L7805, L7812, L7818 and L7824 - also have a maximum input voltage that's too low: typically 35V, or at most 40V (on the L7824). So that just redefines my problem with a slightly higher maximum voltage, rather than actually solving it. I also briefly investigated zener diodes for clamping the voltage to eg 12V, but that seemed to dissipate far too much power in the current-limiting resistor (and didn't seem to have much point given I could just use series resistance, as described below).
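For anyone wondering about the zener numbers, here's the back-of-envelope I did, as a quick Python snippet (the 12V zener and 5mA minimum bias current are assumed values for illustration, not anything I measured):

```python
# Rough power budget for a zener clamp off the 52V rail.
SUPPLY_V = 52.0      # AC-DC converter output
ZENER_V = 12.0       # hypothetical zener voltage (assumption)
LOAD_MA = 16.0       # roughly what the voltmeter draws
ZENER_BIAS_MA = 5.0  # assumed minimum current to keep the zener in regulation

total_ma = LOAD_MA + ZENER_BIAS_MA
r_series = (SUPPLY_V - ZENER_V) / (total_ma / 1000)    # ohms
p_resistor = (SUPPLY_V - ZENER_V) * (total_ma / 1000)  # watts burned in the resistor

print(f"Series resistor: {r_series:.0f} ohms, dissipating {p_resistor:.2f} W")
```

Close to a watt in a single resistor (plus the zener having to sink whatever the load doesn't take) felt like a lot of heat for no real benefit over plain series resistance.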
Then I looked at voltage dividers, and this is where I found a working solution,
first using a circuit simulator and then on the real device. It didn't actually end up being a voltage divider, precisely, because plain series resistance seemed to work better. My main issue was dropping the voltage without burning out my 1/4W resistors, so I worked out the following circuit, which provides enough current and voltage drop without any one resistor exceeding 250mW:
In this simulation I used an LED set to Vf=5V to simulate my real load. And the resistor values are chosen so as to match resistors I already owned without exceeding any power ratings.
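The sizing arithmetic, sketched in Python (the 5V load matches the LED in my sim, but the six-equal-resistors split is an assumption for illustration - my real string uses mixed values, so the per-resistor figure is only indicative):

```python
# How much series resistance is needed, and how the heat spreads across
# the string. Per-resistor power is I^2 * R_i, so the split below is only
# equal because the six resistors are assumed equal.
SUPPLY_V = 52.0
LOAD_V = 5.0      # the LED with Vf=5V standing in for the voltmeter
LOAD_A = 0.016    # ~16 mA draw
N_RESISTORS = 6

drop_v = SUPPLY_V - LOAD_V
r_total = drop_v / LOAD_A        # total series resistance needed
p_total = drop_v * LOAD_A        # total power burned across the string
p_each = p_total / N_RESISTORS   # per-resistor share (equal values assumed)

print(f"R total: {r_total:.0f} ohms ({r_total / N_RESISTORS:.0f} ohms each)")
print(f"Power: {p_total:.2f} W total, {p_each * 1000:.0f} mW per resistor")
```

With six roughly equal resistors, each one sits comfortably under the 250mW limit of a 1/4W part.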
Running this on a breadboard and measuring it, I get an RMS voltage of around 23V after the resistors, although my scope shows it fluctuating wildly between as low as 8V and as high as 37V. The frequency also varies, from around 500Hz to 1kHz.
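My working theory on the fluctuation: the display multiplexes its LED digits, so its current draw isn't constant, and with a couple of kilohms in series every milliamp of change moves the supply node by a couple of volts. A rough back-calculation from my scope readings (the 1.8k total series resistance is an assumed figure consistent with the ~23V RMS, not something I measured directly):

```python
# What load current each observed node voltage implies, given the
# assumed total series resistance.
SUPPLY_V = 52.0
R_SERIES = 1800.0  # assumed total series resistance, ohms

for v_node in (8.0, 23.0, 37.0):  # scope low, RMS, high
    i_ma = (SUPPLY_V - v_node) / R_SERIES * 1000
    print(f"node at {v_node:4.1f} V -> load drawing {i_ma:.1f} mA")
```

The implied swing (roughly 8-24mA) brackets the 15-17mA average I measured, which fits a multiplexed display turning segments on and off.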
It seemed to work fine, but I was a bit worried about those 37V peaks, so I then stuck an L7812 after the resistors. I don't get 12V out (because of those 8V lows, I guess), but I do get a reasonably stable output of 6V: with the L7812 in place, the scope shows a Vupper of 8.2V and a Vlower of 5.9V with an RMS of 6.1V. The frequency is exactly 2.7kHz - which I'd initially put down to the L7812 itself, but since the 78xx parts are linear regulators with no internal switching, I suppose it's more likely the voltmeter display's multiplexing. The voltmeter works fine and measures correctly, and although it seems a bit odd using a 12V regulator to get 6V, it works and I'm happier knowing I'm not exceeding the 30V limit at any point. (An L7809 and L7805 didn't output anything - again, I suppose, because the voltage dips as low as 8V.)
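That 6V also makes sense if you model the 7812 as an ideal linear regulator with a typical ~2V dropout (an assumption - check the datasheet for the real part): once the input dips below about 14V, the output just follows the input minus the dropout.

```python
# Idealized 7812 behaviour, assuming a ~2V dropout voltage.
DROPOUT_V = 2.0  # assumed typical 78xx dropout
REG_V = 12.0

def l7812_out(v_in):
    """Regulates to 12V when it can, otherwise tracks input minus dropout."""
    return min(REG_V, max(0.0, v_in - DROPOUT_V))

for v_in in (8.0, 10.0, 14.0, 37.0):
    print(f"in {v_in:4.1f} V -> out {l7812_out(v_in):4.1f} V")
```

That would also explain why the output band (5.9-8.2V) sits roughly 2V below the low end of the fluctuating input.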
The resistors get a little warm after a while running, but I think will be OK, and in the final circuit they'll be positioned reasonably close to a fan. I could always replace them with 0.5W resistors to be absolutely sure.
So I do now have a working solution. But is this the best way? Could anyone tell me how they would do it? Since I first tested it I have bought a bunch more resistors, and now I have many more 0.25W values, as well as a selection of 0.5W and 2W resistors. I did try breadboarding it with a couple of 2W resistors but they were running at up to 100°C and I figured that, even though they could take that without dying, I'd rather use a few more resistors running at low temps (and save the 2W resistors for other projects.)
Thanks in advance for any help (and apologies if my question is far too long/detailed!)