edit: it's actually 230v +10% / -6% in the EU, not +/- 10%, so the voltage range is 216.2v ... 253v AC, but everything else remains valid.

I initially wrote the post with 210v AC as the minimum voltage, and redid the numbers to use 215v AC as the minimum. If I missed an edit somewhere, hope this makes it clear.

edit2: regarding too high voltage at idle ... some will use 35v rated capacitors in such designs, and a lot of capacitors will actually tolerate some voltage above their rating at such low currents, so they won't be damaged easily and you may not notice them swelling or going bad (right away, at least). But that doesn't change the fact that you should pick the proper voltage rating for the capacitors, or at least have some mechanism to block those high voltages at idle (maybe a zener diode that can be switched out of circuit when it's not required, or a bunch of diodes in series which can be shorted out). Generally, though, it's more convenient to just use the proper voltage rating.

--

That formula only approximates the capacitance required, but it's a good enough approximation.

Vdc peak is supposed to be the peak DC voltage the capacitor sees on its leads, but you should read it more like "the lowest peak DC voltage the capacitor will see under any conditions" - the safe minimum peak DC voltage you can count on.

Let's say you're in Europe, where you have this:

230v 50 Hz -> transformer -> 24v AC -> bridge rectifier -> capacitor -> some DC voltage.

First of all, your 230v will not always be 230v; it's allowed to vary by +10% -6%. By making the standard 230v +10% -6%, they were able to keep everyone happy: some countries had 230v, some 240v, and some others (like former French colonies and parts of Africa) had 220v, and everyone can use 230v-designed products, in theory.

At 2-4 am, I sometimes have 245v AC on the mains socket, and that's still perfectly acceptable from the power company's point of view because it's within that tolerance ... 230v +10% -6% is 216.2v .. 253v AC.

Your transformer has a more or less fixed ratio, so for example in the case of 230v in <-> 24v out, that ratio is 9.58 : 1. With 216.2v .. 253v in, you're going to have roughly 22.56v ... 26.4v AC on the output of the transformer.
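As a quick sanity check, here's that ratio math as a small Python sketch (the numbers are just the ones from this example):

```python
# Sketch: output range of an (assumed ideal) fixed-ratio 230V -> 24V transformer.
ratio = 230 / 24                 # ~9.58 : 1

for v_in in (216.2, 253.0):      # EU mains limits, 230V +10% / -6%
    v_out = v_in / ratio
    print(f"{v_in:5.1f}V in -> {v_out:5.2f}V AC out")   # 22.56V and 26.40V out
```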

So right from the start, you have to decide on the input voltages your product is more likely to meet during real world use.

For this example, let's say you're confident that your product will never see less than 215v AC and never more than 250v AC (because they're nice round numbers), in which case you know the output of your transformer will be between 22.44v (let's round it up to 22.5v to keep things simple) and 26.09v (let's round it down to 26v to keep things simple).

So our transformer spec is suddenly a bit more complicated: 215v .. 250v IN, 22.5v AC ... 26v AC OUT.

Now, it gets a bit messier. Transformers, due to how they're designed, don't output that exact AC voltage regardless of what's connected to them. The smaller the transformer, the higher its voltage regulation figure will be ... basically, at low power consumption, the transformer's output voltage will be higher than rated.

For example, let's say you have a 24v AC transformer rated for 10VA ... if you connect a 24v 10w incandescent light bulb to it and measure the voltage coming out of the transformer, it will probably be very close to 24v AC because the incandescent bulb consumes the rated power the transformer is rated for. However, if you disconnect the bulb or use let's say a 1w incandescent bulb, the transformer's output voltage may go up by even as much as 10-15%.

This percentage of 10-15% is common for low VA transformers but it typically gets smaller for higher VA transformers, for example a 150VA or higher transformer may only have a 5% variation.

You need to be aware of this: think of the case where your power supply idles on the desk with nothing connected to the outputs of your linear regulators. The only power consumption would be the quiescent current of the regulators (a few mA), maybe a power-on LED (a couple mA) and the losses in the bridge rectifier ... so with just a few mA of load on the transformer, the output voltage could be quite a bit higher than the advertised voltage (what it says on the label).

Here's some random examples I picked from Digikey:

12VA transformer with 2 x 14v secondary windings (Voltage Regulation: **25%** TYP @ full load to no load): http://catalog.triadmagnetics.com/Asset/FS28-420.pdf

50VA toroidal transformer with 2 x 15v secondary windings (Voltage Regulation: **12%** TYP from full load to no load): http://catalog.triadmagnetics.com/Asset/VPT30-1670.pdf

100VA toroidal transformer with 2 x 24v secondary windings (Voltage Regulation: **9%** TYP from full load to no load): http://catalog.triadmagnetics.com/Asset/VPT48-2080.pdf

So let's go with 15% as a safe assumption for the transformer used.

In this case, at idle or low power consumption, our transformer's output could suddenly change from 22.5v AC .. 26v AC to 25.9v AC ... 29.9v AC, but at high power consumption it'll sag back down to the 22.5v .. 26v AC range, depending on mains input. So really, our transformer's output range has to be considered 22.5v .. 30v AC if we must take into account all the possibilities.
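Put as code, the range we have to design for (assuming that 15% no-load rise) looks like this:

```python
# Sketch: worst-case AC range = mains variation + assumed 15% no-load rise.
v_loaded = (22.5, 26.0)          # AC out at 215V / 250V mains, full load
regulation = 0.15                # assumed regulation, full load to no load

v_idle = tuple(v * (1 + regulation) for v in v_loaded)
print(f"loaded: {v_loaded[0]}V .. {v_loaded[1]}V AC")
print(f"idle:   {v_idle[0]:.1f}V .. {v_idle[1]:.1f}V AC")   # 25.9V .. 29.9V
```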

Now we have the bridge rectifier, which converts the AC voltage to DC by routing the current through diodes. At any point in the cycle, two of the diodes are conducting, so as the AC waveform is rectified you get a DC voltage with a peak value of:

Vdc = sqrt(2) x Vac - 2 x Vdiode

where Vdiode is the voltage drop across one of the diodes inside the bridge rectifier.
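That formula as a small Python sketch (the 24v AC and 0.8v drop are just illustrative values):

```python
import math

# Sketch: peak DC out of a bridge rectifier, Vdc = sqrt(2) * Vac - 2 * Vdiode.
def peak_dc(v_ac, v_diode):
    return math.sqrt(2) * v_ac - 2 * v_diode

print(round(peak_dc(24.0, 0.8), 2))   # 24V AC, 0.8V per diode -> 32.34
```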

This is another small gotcha ... this voltage drop on individual diodes varies with the current going through the diodes and the temperature of the diodes.

As the temperature of the diodes goes up, the voltage drop on the diodes falls down just a bit.

Let's take some cheap and generic bridge rectifier you may harvest out of an ATX power supply or buy from the store, a 400v 25A bridge rectifier:

GBJ2504 from Diodes Inc (but you'll find something similar from other manufacturers): https://www.diodes.com/assets/Datasheets/ds21221.pdf

Now if you go in the datasheet and scroll down to page 2, you will see Figure 2, Instantaneous Forward Voltage (V), typical forward characteristics per element.

So what happens when the current is a few mA, when your circuit is idle? At, let's say, less than 0.1A, the forward voltage per element will be less than **0.5v** per diode.

But at high currents, let's say at 1A, the voltage drop goes up to 0.8v per diode. Above 1A the voltage drop only goes up by a bit; the graph says around 1.1v at 10A, but let's use **1v** for 2-3A of current or more.
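Those two operating points can be captured as a crude assumption (this is not the datasheet curve, just the rule of thumb picked above):

```python
# Sketch: crude per-diode drop estimate, per the two points picked above.
def v_diode(i_amps):
    return 0.5 if i_amps < 0.1 else 1.0   # idle (a few mA) vs heavy load

print(v_diode(0.005), v_diode(3.0))   # 0.5 1.0
```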

So now we can calculate the peak DC voltage with the above formula, at idle and at load, keeping in mind that our transformer may output anything between 22.5v and 30v AC as we discovered above:

At idle, we're using 25.9v and 29.9v AC in order to account for that estimated increase of up to 15% in output voltage:

idle: Vdc minimum = 1.4142 x 25.9v AC - 2 x 0.5v = **35.63v DC**, Vdc maximum = 1.4142 x 29.9v AC - 2 x 0.5v = **41.28v DC**

At load, the transformer's output should be closer to the rated value, so that 15% increase can be more or less ignored, but we still have to account for variations in mains voltage:

load: Vdc minimum = 1.4142 x 22.5v AC - 2 x 1v = **29.82v DC**, Vdc maximum = 1.4142 x 26v AC - 2 x 1v = **34.77v DC**

So what this tells you is that the peak DC voltage will be somewhere between 29.8v and 41.28v DC, so you should use a capacitor rated for 50v or higher (a 35v rated capacitor shouldn't be used, based on our estimations).

However, during regular operation, you can only rely on the peak DC voltage being between 29.8v and 34.77v, and if you want to be absolutely safe no matter the mains voltage, you should really use 29.8v as the peak DC voltage in that formula. Just for super extra safety and all that, I'd round it down to 29.5v.

Therefore, going back to the formula :

C = Current / [ 2 x Mains Frequency x (Vdc peak - Vdc minimum) ]

since we did everything above assuming 230v +10% -6% and 50Hz (Europe), and we have the safe Vdc peak of 29.5v (again, not the absolute peak, but what's achievable at any time during the operation of the product), and let's say we want a minimum of 26v DC after the capacitor, now we can calculate a minimum capacitance:

I'll go with 1A of current since it makes things easy.

C = 1A / [ 2 x 50Hz x (29.5v - 26v) ] = 1 / (100 x 3.5) = 1 / 350 = 0.002857 Farads, or 2857uF
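As a sketch, with the formula and the example numbers from above:

```python
# Sketch: C = I / (2 * f * (Vpeak - Vmin)), with the example values above.
def min_capacitance(i_load, f_mains, v_peak, v_min):
    return i_load / (2 * f_mains * (v_peak - v_min))

c = min_capacitance(1.0, 50, 29.5, 26.0)
print(f"{c * 1e6:.0f} uF")   # -> 2857 uF
```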

Now 2857uF is not a common value, so you could go with 2700uF and say "meh, close enough", but 3300uF is what I would use. You could go higher, to 3900uF or 4700uF ... more capacitance wouldn't hurt too much (too much capacitance causes higher inrush current spikes from the transformer when you first plug the product into the mains socket, and those spikes could blow your fuse; that's why you usually use time-delay fuses with transformers), but at a 50v rating (or better) the capacitor would be bulky and expensive.

In the US, with 60Hz mains frequency, you need slightly less capacitance for the same ripple.
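For comparison, the same numbers at 60Hz (the ripple happens at twice the mains frequency, so you need roughly 50/60 of the capacitance):

```python
# Sketch: same ripple target at 50Hz vs 60Hz mains.
def min_capacitance(i_load, f_mains, v_peak, v_min):
    return i_load / (2 * f_mains * (v_peak - v_min))

for f in (50, 60):
    c = min_capacitance(1.0, f, 29.5, 26.0)
    print(f"{f}Hz: {c * 1e6:.0f} uF")   # 50Hz: 2857 uF, 60Hz: 2381 uF
```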

And remember, this only sort of guarantees that your DC voltage will always be at least 26v; based on the math we did above, at idle or low power consumption our DC voltage could be as high as 41.28v DC. At just a few mA of load, such a big capacitor will easily charge up and stay much closer to the peak.

That's important if you're going to use voltage regulators with maximum input voltage ratings, like the LM1085 with a maximum of 29v input. You'd have to be careful about using one of those, because your voltage will be anywhere from 26v to 35v at various loads, and up to 41v at idle.

Hope this helps and that what I explained above is easy to understand.