Here is my version of an analog mains voltmeter, inspired by an article in Elektor.
Mains voltage is transformed down to 9-12V AC, which is used both for measuring and for powering the meter. Any meter you have lying around the lab that is sensitive enough - 50uA to 10mA full scale - can be used. The circuit is simple and has two parts. The first is the voltage reference and buffer - LM7805 and U2:A. The 7805 regulator can't sink current, which is why the opamp is used. Diode D1 prevents current from flowing in the opposite direction and protects the meter during power-off; its voltage drop is compensated by taking the feedback from after it. The second part is a voltage divider - potentiometer P2 - and a voltage buffer, U2:B, which drives the meter. So on one side of the meter you have the +5V reference, and on the other a voltage greater than that which corresponds to the mains voltage. Calibration is done with the two potentiometers: P2 adjusts the voltage difference and P1 limits the current through the meter. Adjustment is very simple with a variac, but I will also describe a way to calibrate without one, because only a few hobbyists have one at hand.
Calibration with a variac: set P1 to maximum resistance. Set the input mains voltage to 195V, then set P2 so the current through the meter is 0uA. Set P1 to minimum and fine-tune P2. Now set P1 back to maximum, set the variac to 245V, and slowly adjust P1 to get full-scale meter deflection (100uA in this example). To check the calibration, set the variac to 220V; you should read half-scale deflection (50uA).
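As a sanity check of the scale, here is a small sketch of the expected meter current versus mains voltage, assuming the linear scale from the example above (0uA at 195V, 100uA at 245V). The function name and parameters are mine, just for illustration:

```python
def meter_current_uA(mains_v, v_min=195.0, v_max=245.0, full_scale_uA=100.0):
    """Expected meter current (uA) for a given mains voltage (V),
    assuming the linear scale from the calibration example."""
    return (mains_v - v_min) / (v_max - v_min) * full_scale_uA

# The three calibration points from the text:
for v in (195, 220, 245):
    print(f"{v} V -> {meter_current_uA(v):.0f} uA")
```

This reproduces the half-scale check: 220V should land exactly at 50uA.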
Calibration without a variac: measure the mains voltage Um and the voltage on C1, Udc. Calculate the ratio N = Um/Udc. Example: 219V / 12.00V = 18.250 (AC V per DC V).
Now calculate the scale limit values:
U(195V)=195V / N = 195V / 18.250 = 10.684V (minimum voltage)
U(220V)=220V / N = 220V / 18.250 = 12.054V (normal, center voltage)
U(245V)=245V / N = 245V / 18.250 = 13.424V (maximum voltage)
If the mains voltage is 195V, the voltage on C1 is 10.684V, and so on. Calibration is now done with a variable power supply. Disconnect the meter circuit from the mains and connect the power supply across C1. Set it to 10.684V, set P1 to maximum, and adjust P2 for zero meter deflection. Set P1 to minimum for higher sensitivity and readjust if needed. Then set P1 to maximum, set the PSU to 13.424V, and adjust P1 for full meter deflection. To check the calibration, set the PSU to 12.054V; you should get a 220V reading on the meter, i.e. half-scale deflection.
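The no-variac arithmetic above can be sketched as a few lines of Python, using the example measurements from the text (219V mains, 12.00V on C1); your own measured values will of course differ:

```python
Um = 219.0   # measured mains voltage (V AC), example value from the text
Udc = 12.00  # measured voltage on C1 (V DC), example value from the text

N = Um / Udc                          # AC/DC ratio, 18.250 in the example
setpoints = {v: v / N for v in (195, 220, 245)}

print(f"N = {N:.3f}")
for v, u in setpoints.items():
    print(f"U({v} V) = {u:.2f} V")    # PSU setpoint for that mains voltage
```

The printed setpoints match the worked values in the text to within a millivolt of rounding.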
Change the value of P1 according to your meter's range if needed. If you want to build this, keep in mind that mains voltage is dangerous and can harm or kill you, so use caution and put the meter in a plastic box. The transformer is 9-15V with a power rating of a few watts. You can also use a wall adapter that puts out a high enough unregulated voltage (transformer type, not a switcher), AC or DC. For other common mains voltages (120V, 230V, 240V), just draw a different scale on the meter and use an adequate adapter/transformer. The meter needle shows even a 0.5V change in the mains voltage at 220V.
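To see why a 0.5V change is visible, here is a rough sensitivity estimate, assuming the example scale (100uA full deflection spread over the 195-245V span):

```python
full_scale_uA = 100.0
span_v = 245.0 - 195.0   # the 50 V window the scale covers

sens = full_scale_uA / span_v  # uA of needle movement per volt of mains change
print(f"{sens} uA/V -> a 0.5 V change moves the needle {sens * 0.5} uA")
```

So a 0.5V shift moves the needle by 1uA, i.e. 1% of full scale, which is easily visible on an analog movement.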
Attached you can find the schematic, a picture of the meter scale, and the voltmeter in its box hung over my workbench, alongside a regular analog voltmeter and a big green indicator light that lights up when I flip the mains switch on my bench. So, no more soldering iron left on and forgotten...