Measuring the actual power level your amplifier delivers while playing music is complicated: the waveform is far from sinusoidal, and real music (unlike a test tone) varies dramatically in level from moment to moment while you are trying to measure it. The volume control on your amplifier is not a useful guide either, since its "audio taper" may not be accurate.
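To see why a music waveform is harder to characterize than a test tone, it helps to look at the crest factor (peak-to-RMS ratio). Here is a small sketch, using only Python's standard library, comparing a pure sine wave (crest factor about 3 dB) with a hypothetical "spiky" signal that loosely mimics the high peak-to-average ratio of real music; the specific signals are illustrative, not real audio data:

```python
import math

def rms(samples):
    """Root-mean-square value of a list of samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def crest_factor_db(samples):
    """Peak-to-RMS ratio in dB."""
    peak = max(abs(s) for s in samples)
    return 20 * math.log10(peak / rms(samples))

N = 10_000
# One full cycle of a sine wave: crest factor is sqrt(2), i.e. ~3.0 dB.
sine = [math.sin(2 * math.pi * k / N) for k in range(N)]

# A crude "spiky" signal: mostly quiet, with occasional full-scale peaks.
# Real music often behaves like this, which is why a meter reading (RMS)
# says little about the peak power the amplifier must deliver.
spiky = [1.0 if k % 100 == 0 else 0.05 * math.sin(2 * math.pi * k / N)
         for k in range(N)]

print(f"sine crest factor:  {crest_factor_db(sine):.1f} dB")
print(f"spiky crest factor: {crest_factor_db(spiky):.1f} dB")
```

The spiky signal's crest factor comes out far above the sine's 3 dB, which is why steady-state sine-wave measurements (as described below) are so much easier to make than measurements on program material.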
Here is a suggestion if you want to get quantitative:
A normal CD player has a well-defined maximum output voltage, due to the discrete levels in the DAC. I've found that manufacturers sometimes make it difficult to find this value. If you need to measure it, the NAB has a test CD available with specified levels on different tracks:
https://www.nabstore.com/NAB_Broadcast_Audio_Test_CD_Vol_1_p/cp400.htm (price $20, but currently out of stock). My CD player has a full output level of 2 V rms on the normal analog outputs, which is typical.
Now you need to calibrate your volume control. If you have a stepped attenuator (or a detented potentiometer), you only need to count clicks; otherwise you will need to improvise a dial. Do these tests at a low output level, well below the amplifier's maximum, to avoid damage.

Connect an audio generator (set to a convenient frequency such as 1000 Hz) to the input and a load resistor (4 or 8 ohms, according to taste) to the output, then measure the input and output voltages as a function of volume-control setting with a good AC voltmeter (true-RMS is nice but not required, since we are dealing with low-distortion sine waves). As you turn the volume control down (increasing the attenuation), you will need to increase the generator voltage to keep the output voltage at an easily measurable level.

The mean (not RMS) output power in this steady-state measurement is V²/R, where V is the RMS voltage across the load resistor R.
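The calibration bookkeeping can be sketched in a few lines of Python. The measurement values below are made-up placeholders (your own readings go here), and the volume-setting labels are hypothetical, but the arithmetic is exactly the procedure above: voltage gain in dB at each setting, and mean power from V²/R:

```python
import math

R_LOAD = 8.0  # ohms; the load resistor across the amplifier output

# Hypothetical example measurements: (volume setting, V_in rms, V_out rms).
# Replace these with your own readings from the AC voltmeter.
measurements = [
    ("click 5",  0.500, 0.80),
    ("click 10", 0.100, 0.80),
    ("click 15", 0.020, 0.80),
]

for setting, v_in, v_out in measurements:
    gain_db = 20 * math.log10(v_out / v_in)  # voltage gain at this setting
    power_w = v_out ** 2 / R_LOAD            # mean power: V^2 / R, V in rms
    print(f"{setting}: gain = {gain_db:5.1f} dB, output = {power_w:.3f} W")
```

Once you have the gain at each volume setting, you can also predict the maximum steady-state power available there from your CD player's full-scale output (e.g. 2 V rms): the peak output voltage is the gain times 2 V, and the power again follows from V²/R.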
Now, when you listen to a favorite CD, you can relate the volume-control setting to the amplifier power that produces the sound level you want for your music.