Let me see if I can help with this.
First, there are true RMS meters and there are non-true-RMS meters.
A true RMS meter has to, one way or another, perform the calculation:
Vrms=squareroot(mean(V^2))
which means they have to square the voltage:
y1=V^2
then take the mean (Tp is the period over which the integration is performed):
y2=integrate(y1,t,0,Tp)/Tp
then take the square root of that:
Vrms=sqrt(y2)
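Just to make those three steps concrete, here is a minimal numeric sketch of the same calculation done on sampled data. The sample count and the 170 V peak (roughly a 120 Vrms line) are my own assumed values for illustration:

```python
import math

N = 10000                      # samples taken over one period Tp
Vpk = 170.0                    # assumed peak voltage (~120 Vrms sine)
samples = [Vpk * math.sin(2 * math.pi * n / N) for n in range(N)]

y1 = [v * v for v in samples]  # y1 = V^2
y2 = sum(y1) / N               # mean over the period Tp
Vrms = math.sqrt(y2)           # Vrms = sqrt(mean(V^2))

print(round(Vrms, 1))          # ≈ 120.2 for a 170 V peak sine
```

This is essentially what the chip inside a true RMS meter does, just in analog or dedicated digital hardware instead of a loop.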
Alternatively, they use a resistor as a load and measure the heat the resistor develops over a reasonable time period. This is a rather old technique though, and I doubt many meters do it today.
That's how to get a true, true RMS reading.
But since RMS is RMS, you can claim the reading is RMS if you assume the waveform is a true sine wave and you measure either of two things:
1. Average voltage
2. Peak voltage
and then do a calculation to convert to a pseudo RMS value. The reading that appears on the meter will be the same as a true RMS meter's, but only if the line voltage waveform remains a true sine wave with no distortion; any distortion translates into a corresponding inaccuracy. The line waveform typically gets a flattened top under some appliance loads (rectifier-input loads draw current only near the peaks), so this would not be too uncommon.
For the average measurement, to convert to a calculated RMS value you would multiply by the sine-wave form factor, pi/(2*sqrt(2)) = 1.1107, commonly rounded to 1.11.
For the peak measurement, to convert to a calculated RMS value you would multiply by 0.7071 (that is, 1/sqrt(2)), which is the same as dividing by 1.4142.
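Both factors are easy to verify numerically for a pure sine wave. This sketch computes the average (of the rectified wave), the peak, and the true RMS from samples, then applies the two conversion factors; all the values here are normalized to a 1 V peak sine:

```python
import math

N = 100000
Vpk = 1.0
samples = [Vpk * math.sin(2 * math.pi * n / N) for n in range(N)]

v_avg = sum(abs(v) for v in samples) / N    # average of rectified sine, ~0.6366*Vpk
v_peak = max(samples)                       # 1.0
v_rms_true = math.sqrt(sum(v * v for v in samples) / N)   # ~0.7071*Vpk

rms_from_avg = v_avg * 1.1107               # form factor pi/(2*sqrt(2))
rms_from_peak = v_peak * 0.7071             # 1/sqrt(2)
```

Both `rms_from_avg` and `rms_from_peak` land on the true RMS value to within a fraction of a percent, but only because the input really is a sine wave; feed this a flat-topped waveform and the two factors no longer agree with the true RMS.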
Since dividing a voltage down is simpler than multiplying it up, measuring the peak is a simple way to get to the RMS calculation, because then you can use either a resistor voltage divider (two resistors) or a single resistor in series with your existing DC meter.
You first have to rectify the mains AC voltage with a diode bridge and a decent-size filter capacitor. The capacitor value should be large enough to filter the ripple so you get a reasonably smooth DC voltage. A ballpark value is around 1000 uF, but we can calculate a better value once we know the resistance of the meter when it reads 120 VDC. The voltage rating should be well above the peak of the line voltage, which is about 170 V for a 120 Vrms line, so a rating of 240 VDC or more should be good.
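Here is a rough sketch of that "better value" calculation for a full-wave bridge, using the standard ripple approximation dV = Vpk/(2*f*R*C). The 250 kOhm load and the 1% ripple target are my own assumptions for illustration, not values from the circuit above; the point is that a high-resistance meter load needs far less capacitance than 1000 uF:

```python
f_line = 60.0            # Hz, assumed line frequency
v_pk = 120.0 * 2 ** 0.5  # ~170 V peak from a 120 Vrms line
r_load = 250e3           # assumed total divider + meter resistance, Ohms
dv = 0.01 * v_pk         # allow 1% peak-to-peak ripple

# Full-wave rectifier ripple approximation: dV = Vpk / (2 * f * R * C),
# solved here for the capacitance C.
c = v_pk / (2 * f_line * r_load * dv)
print(round(c * 1e6, 1), "uF")   # ≈ 3.3 uF for this light a load
```

Even with generous margin, a small-value film or electrolytic cap would do here; the 1000 uF figure is only needed for heavy loads.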
To use a voltage divider you would calculate two resistor values that provide a division by 1.414 (that is, sqrt(2)), which means if you have R2 on the bottom of the divider and R1 on the top, the values can be calculated as either:
R1=sqrt(2)*R2-R2
where you pick a value for R2 then calculate R1, or from:
R2=R1/(sqrt(2)-1)
where you pick a value for R1 then calculate R2.
Since R1 is the top resistor and the meter will be across R2, to ensure R1's power rating is not exceeded even if the meter shorts out (which puts the full peak voltage across R1), the minimum value for a 1/2-watt R1 at 120 Vrms (about 170 V peak) would be:
R1 = Vpk^2 / 0.5 W = 57600 Ohms
The maximum mains voltage is a bit higher, around 135 Vrms (about 191 V peak), and that puts the minimum value at 72900 Ohms.
You can then calculate R2. For the value of 72900 Ohms for R1 the value of R2 comes out to:
R2=175996 Ohms.
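The resistor calculations above can be sketched in a few lines, working from the 135 Vrms worst case and the divide-by-sqrt(2) ratio:

```python
sqrt2 = 2 ** 0.5

v_worst = 135.0 * sqrt2      # worst-case peak voltage, ~191 V
p_max = 0.5                  # 1/2 watt resistor rating
r1 = v_worst ** 2 / p_max    # full peak across R1 if the meter shorts
r2 = r1 / (sqrt2 - 1)        # from the divider ratio R2 = R1/(sqrt(2)-1)

print(round(r1))             # 72900 Ohms
print(round(r2))             # 175996 Ohms
```

In practice you would round both to the nearest standard values and trim during calibration.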
There is a catch though. The internal meter resistance Rm will be in parallel with R2, so the effective bottom resistance is really the parallel combination:
Rp=(R2*Rm)/(R2+Rm)
It is Rp, not R2 alone, that needs to come out to 175996 Ohms in this case, so you have to pick the physical R2 somewhat higher than that, such that its parallel combination with Rm hits the target (solving the above for R2 gives R2 = Rp*Rm/(Rm-Rp)).
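As a quick sketch of that correction, here is the bottom resistor solved for a target effective resistance; the 1 MOhm meter resistance is an assumed example value, not a measured one:

```python
rp = 175996.0     # effective bottom resistance the divider needs, Ohms
rm = 1e6          # assumed meter internal resistance, Ohms

# Solve Rp = (R2*Rm)/(R2+Rm) for the physical resistor R2:
r2_phys = rp * rm / (rm - rp)        # about 214 kOhms for these values

# Sanity check: the parallel combination lands back on the target.
rp_check = r2_phys * rm / (r2_phys + rm)
```

Note the correction only matters when Rm is within an order of magnitude or so of R2; a very high-impedance meter leaves R2 almost unchanged.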
A better way is to use a single series resistor. In this case you have to know the resistance of the meter when it is reading 120 VDC, because that is what it will be reading when this is done. Then R2 becomes the meter resistance alone. This is how many voltmeters are built.
As a final note, the calculations, the voltage drop of the diodes, and the ripple factor don't always come out exact. If you want to get closer, you have to "calibrate" the meter. This means using another meter of known accuracy, or simply adjusting a resistor value to get the meter to read 120 VDC when the full mains voltage is applied. This assumes the mains voltage really is 120 VAC RMS at that moment. If it is not, you will still get a reading that is comparative to whatever the line voltage was at the time of calibration, and that is often good enough: as time goes on you will still know if the line went higher or lower, and by what percentage. For example, if you read 120 VDC now and 108 VDC later, you can assume the line voltage went down by 10 percent. The calibration phase can make up for a lot of different types of inaccuracy even if the wave isn't a pure sine, as long as the waveform shape does not change too much over time.
Where I live we have this problem in the summer months, where the line voltage can drop to as low as 100 VAC or even lower on the very hot days.
If you need any more information, I'm sure someone else can help with this too, or just post a message with what you need to know and when I check the forum again later I'll reply.
As someone else said, you do have to be very careful with this kind of voltage. Although most people just get a shock that doesn't do long-term damage, there are plenty of circumstances where you can actually get killed, and there's no coming back to fix the circuit after that.
Good luck to you.