Transformers output an AC voltage and they have a VA rating (the maximum apparent power they can deliver). Divide that number by the AC voltage to get the maximum AC current available on the secondary of the transformer.
The transformer looks like it may be rated for 20 VA, but let's be optimistic and say it's rated for 50 VA - in that case, the maximum current would be 50 VA / 12 V ≈ 4.2 A.
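As a quick sketch with these numbers (the 50 VA figure is an assumption, as noted above):

```python
# Maximum AC secondary current from a transformer's VA rating.
va_rating = 50.0    # assumed rating in VA (the real one may be ~20 VA)
v_secondary = 12.0  # nominal secondary voltage, V AC

i_ac_max = va_rating / v_secondary
print(f"Max secondary current: {i_ac_max:.1f} A")  # ~4.2 A
```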
When you convert the 12 V AC voltage to DC using a bridge rectifier, you get an output voltage that's always positive, and the peak DC voltage is basically 1.414 x Vac - 2 x Vdiode, where Vdiode is the voltage drop across each diode in the bridge (there are always two diodes conducting at any moment). You'll find that drop in the rectifier's datasheet; for most rectifiers it's between 0.7 V and 1.2 V. Let's go with 1 V for simplicity.
With small transformers in particular - though it's true of all transformers - at low load the transformer will output more than the rated voltage: it could put out 13 V AC or even more instead of 12 V AC. Keep that in mind when estimating the peak DC voltage; 13 V AC gives a peak of about 1.414 x 13 - 2 ≈ 16.4 V, so in this particular case it wouldn't be wise to use capacitors rated for a maximum of only 16 V after the bridge rectifier.
So for your 12 V AC transformer, you'll get a DC output with a peak DC voltage of 1.414 x 12 V - 2 x 1 V ≈ 15 V.
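Here's the same arithmetic as a small sketch, including the light-load case mentioned above (the 1 V diode drop is the value we assumed):

```python
import math

# Peak DC voltage after a full-wave bridge rectifier:
# Vpeak = sqrt(2) * Vac - 2 * Vdiode (two diodes conduct at any moment).
v_diode = 1.0  # assumed drop per diode; check the rectifier datasheet

v_peak = math.sqrt(2) * 12.0 - 2 * v_diode
print(f"Peak at nominal 12 V AC: {v_peak:.1f} V")           # ~15.0 V

v_peak_light = math.sqrt(2) * 13.0 - 2 * v_diode
print(f"Peak at light-load 13 V AC: {v_peak_light:.1f} V")  # ~16.4 V
```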
The usable DC current can also be estimated with the formula Idc = 0.62..0.7 x Iac (the constant varies with how big the transformer is), so going with the 50 VA / 4.2 A estimate, such a transformer may only be able to output about 2.6 A to 3 A of DC.
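A sketch of that derating (the 0.62..0.7 range is the rule of thumb stated above, not a hard spec):

```python
# Usable DC output current from the transformer's AC current rating.
i_ac = 4.2  # estimated max AC secondary current, A

for k in (0.62, 0.70):  # derating constant varies with transformer size
    print(f"k = {k}: Idc ≈ {k * i_ac:.2f} A")  # ~2.6 A .. ~2.9 A
```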
Anyway, once you figure out the peak voltage, note the output isn't flat - it's a series of hills and depressions, at twice the frequency of your mains AC input. If you're in the US where mains is 60 Hz, you'll have 120 hills and depressions per second in your DC output.
That's where the capacitors come in, after the bridge rectifier: they charge up when the output is near the peak DC voltage and fill in the depressions, making the difference between minimum and maximum DC output much smaller.
There's another formula which estimates how much capacitance you need based on the minimum voltage you want your circuit to see at all times:
Capacitance (farads) = Current (amps) / [2 x mains frequency (Hz) x (Vdc peak - Vdc minimum)]
From this formula you can also go the other way around: if you know what capacitance you have, you can estimate the minimum DC voltage at a particular peak current (see the sketch after the worked example below).
Anyway, for our 50 VA / 4.2 A fictional transformer, assuming you're in the US dealing with 60 Hz mains frequency, and you want the output voltage to always be at least 13 V, you can put the numbers in the formula:
C = 4.2 A / [2 x 60 Hz x (15 V - 13 V)] = 4.2 / 240 = 0.0175 F, or 17500 µF
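Both directions of that formula in one sketch (the 22000 µF value in the second part is just a hypothetical capacitor bank for illustration):

```python
# Smoothing capacitor sizing: C = I / (2 * f * (Vpeak - Vmin))
f_mains = 60.0   # US mains frequency, Hz
v_peak = 15.0    # peak DC voltage from the bridge rectifier
v_min = 13.0     # minimum DC voltage we want to maintain
i_load = 4.2     # worst-case load current, A

c = i_load / (2 * f_mains * (v_peak - v_min))
print(f"C = {c:.4f} F = {c * 1e6:.0f} uF")  # 0.0175 F = 17500 uF

# Going the other way: given a capacitance, estimate the minimum voltage.
c_have = 0.022   # hypothetical 22000 uF capacitor bank
v_min_est = v_peak - i_load / (2 * f_mains * c_have)
print(f"With {c_have * 1e6:.0f} uF, Vmin ≈ {v_min_est:.1f} V")  # ~13.4 V
```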
That's the least you need to always have at least 13 V at 4.2 A. At lower currents the voltage will be higher than 13 V - for example, at only 0.5 A of load the capacitors stay topped up enough that you'd always have about 14.5-14.8 V at the output. And that's why you need an additional voltage regulator to take this DC voltage that varies within a narrow range (13 V .. 15 V) down to a stable 12 V DC.