Strictly speaking, dB has to be a power gain. Output power divided by input power gives the power gain ratio; take the base-10 logarithm of that and multiply by ten to get the result in dB.
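A minimal sketch of that definition in Python (the wattage figures are made-up examples):

```python
import math

def power_gain_db(p_out_watts: float, p_in_watts: float) -> float:
    """Strict power gain in dB: 10 * log10(Pout / Pin)."""
    return 10 * math.log10(p_out_watts / p_in_watts)

print(power_gain_db(2.0, 1.0))   # doubling the power is ~+3.01 dB
print(power_gain_db(0.5, 1.0))   # halving the power is ~-3.01 dB
```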
However, the situation is more complex, as the definition has been perverted. One way it's been perverted is to ignore the impedance, thus destroying the meaning.
All true.
And one way it was perverted, probably the way the OP is interested in, is using dB to refer to *voltage gain*, without taking into account the load resistance and therefore the actual power output of the amplifier.
For this purpose, you simply assume that increasing the RMS voltage of the signal 20 times increases the power 20·20 = 400 times, as it would into a fixed load. Applying that rule to your 400x voltage gain gives 20·log10(400) ≈ 52 dB. Everybody reports voltage gain that way, and virtually no one will say that the gain of your circuit is 26 dB (which is what you would get by treating the 400x figure as a power ratio).
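A sketch of that convention in Python, assuming equal source and load impedance so that the two formulas agree:

```python
import math

def voltage_gain_db(v_ratio: float) -> float:
    """The 'voltage dB' convention: 20 * log10(Vout / Vin)."""
    return 20 * math.log10(v_ratio)

# A 400x voltage ratio under the voltage convention...
print(voltage_gain_db(400))        # ~52.04 dB
# ...matches the strict power formula applied to the squared ratio:
print(10 * math.log10(400 ** 2))   # ~52.04 dB
# Treating 400 directly as a power ratio gives the number
# virtually nobody reports:
print(10 * math.log10(400))        # ~26.02 dB
```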
I think this perversion originates from pro audio, where the load impedance used to be assumed to be 600 Ω for some reason (could it be the typical impedance of the input transformers in old gear that employed them? no idea). If the load impedance is constant, and if the amplifier is able to drive it, then of course 10x voltage gain produces 100x power gain, so this "perverted" rule makes sense.
Also, note that power is always positive, so it doesn't matter whether the amplifier is inverting or not. In particular, -52 dB of gain means that the output voltage is 1/400 of the input.
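In code terms, that just means feeding the magnitude of the ratio into the formula (a sketch; the helper name is mine):

```python
import math

def gain_db_from_voltage_ratio(v_out_over_v_in: float) -> float:
    """dB gain from a voltage ratio; the sign (inversion) is irrelevant."""
    return 20 * math.log10(abs(v_out_over_v_in))

print(gain_db_from_voltage_ratio(-400))     # inverting 400x amp: ~52 dB
print(gain_db_from_voltage_ratio(1 / 400))  # attenuation to 1/400: ~-52 dB
```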
For radio communications, the reference is 1 milliwatt, indicated by dBm rather than dB. In a 50 Ohm system, -73 dBm corresponds to 50 microvolts (also referred to as signal strength S9).
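A quick check of that figure (a sketch, assuming a purely resistive 50 Ω load):

```python
import math

def dbm_from_vrms(v_rms: float, r_ohms: float = 50.0) -> float:
    """Power in dBm (reference 1 mW) from RMS voltage into a resistive load."""
    p_watts = v_rms ** 2 / r_ohms
    return 10 * math.log10(p_watts / 1e-3)

print(dbm_from_vrms(50e-6))   # 50 uV into 50 ohm: ~-73.01 dBm (S9)
```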
Indeed. If you want to accurately report the *power output* of this amplifier, you multiply the output RMS voltage by the output RMS current into whatever load will be connected. That's your power, which you express in dB referred to a watt (dBW) or a milliwatt (dBm) or whatever. For example, 20 V RMS into 50 Ω is 8 W, which is about 9 dBW = 39 dBm.
If you want to report the *power gain*, you divide the output power as above by the input power going into the amp and get a unitless gain ratio (just dB).
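A sketch tying both steps together (the 0.5 mW input drive is a made-up number for illustration):

```python
import math

def output_power_watts(v_rms: float, i_rms: float) -> float:
    """Real power into a resistive load from RMS voltage and current."""
    return v_rms * i_rms

def dbw(p_watts: float) -> float:
    return 10 * math.log10(p_watts)          # referenced to 1 W

def dbm(p_watts: float) -> float:
    return 10 * math.log10(p_watts / 1e-3)   # referenced to 1 mW

p_out = output_power_watts(20.0, 20.0 / 50)  # 20 V RMS into 50 ohm -> 8 W
print(dbw(p_out), dbm(p_out))                # ~9.03 dBW, ~39.03 dBm

# Unitless power gain: output power over input power, in plain dB.
p_in = 0.5e-3                                # hypothetical 0.5 mW of drive
print(10 * math.log10(p_out / p_in))         # ~42.04 dB
```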