Hello,
I have problems understanding the maximum ripple current rating of an electrolytic capacitor. I thought the max ripple current is linked to thermal destruction of the capacitor, because the dissipated power is P = ESR · I², where ESR is the equivalent series resistance of the capacitor.
I also thought the power a certain capacitor can dissipate should be a fixed number at a given temperature.
But after some calculations I got some really strange results: apparently my capacitor can dissipate 0.32 W at 120 Hz but only 0.05 W at 100 kHz.

I picked a Panasonic FR type with 220 µF and 35 V (8 × 11.5 mm):
https://industrial.panasonic.com/cdbs/www-data/pdf/RDF0000/ABA0000C1259.pdf

Max dissipation at 120 Hz with datasheet values:
tan δ = 0.12 (at 120 Hz).
The max ripple current is 0.95 A (specified at 100 kHz).
Correction factor for ripple current at 120 Hz = 0.7
ESR = tan δ · |X_C|
ESR = 0.12 · 1/(2 · π · 120 Hz · 220 µF) = 0.72 Ω
P_max = ESR · I_max² = 0.72 Ω · (0.7 · 0.95 A)² ≈ 0.32 W

Now I calculated the max dissipation at 100 kHz:
Correction factor for ripple current at 100 kHz = 1.0
The total impedance is given as |Z| = 0.056 Ω.
I know this isn't the ESR, because in the equivalent circuit we have X_C and X_L in series with the ESR. X_C is negative and subtracts from X_L, so the reactive part of Z can largely cancel. In any case, at 100 kHz, X_C = 1/(2 · π · 100 kHz · 220 µF) ≈ 0.007 Ω, which is much smaller than 0.056 Ω and can be neglected.
--> ESR ≅ |Z| = 0.056 Ω
P = ESR · I² = 0.056 Ω · (1.0 · 0.95 A)² ≈ 0.05 W

So what is wrong here? Is there an error in my calculation or in my logic? I can't imagine that the same capacitor can dissipate only 0.05 W at one frequency but 0.32 W at another.
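To make the comparison easy to reproduce, here is a short Python sketch of the arithmetic above. The datasheet values are the ones quoted from the Panasonic FR link; the variable names are my own:

```python
import math

# Datasheet values for the Panasonic FR 220 µF / 35 V (8 x 11.5 mm) part
C = 220e-6          # capacitance in farads
I_ripple = 0.95     # rated ripple current in amps (specified at 100 kHz)
tan_delta = 0.12    # dissipation factor at 120 Hz
Z_100k = 0.056      # total impedance |Z| at 100 kHz, in ohms

# 120 Hz case: ESR derived from tan δ, ripple current derated by the 0.7 factor
f_low = 120.0
Xc_low = 1 / (2 * math.pi * f_low * C)    # capacitive reactance, about 6.03 Ω
esr_low = tan_delta * Xc_low              # about 0.72 Ω
p_low = esr_low * (0.7 * I_ripple) ** 2   # about 0.32 W

# 100 kHz case: X_C is about 0.007 Ω and negligible, so ESR is taken as |Z|
p_high = Z_100k * (1.0 * I_ripple) ** 2   # about 0.05 W

print(f"ESR(120 Hz) = {esr_low:.2f} ohm, P(120 Hz) = {p_low:.2f} W")
print(f"P(100 kHz) = {p_high:.3f} W")
```

Running this reproduces the two apparently inconsistent dissipation figures, so the discrepancy is not a slip in the arithmetic.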
Thanks for your help!
