Unless otherwise noted, AC voltage is specified in terms of the square root of the average of the squared voltage, or "root mean square" (RMS) for short.
"root mean square"/RMS for short. This is a useful way of specifying voltage because with many loads the amount of power applied to the load is proportional to the square of the voltage--so if you double the voltage, you quadruple the power. So by specifying the RMS voltage, you're implicitly specifying a waveform that when applied to a resistive load would result in the same power being applied as an equal DC voltage. So for a 130VAC lamp, you'd need 130VDC to run it at nominal power.
It's worth noting that for common waveforms there are simple conversion factors between peak and RMS voltage; for a sine wave, Vpeak = Vrms * sqrt(2). For more complex or non-repetitive waveforms, you have to measure a bunch of points, square them, average the squares, and then take the square root of the average. Cheap AC-reading voltmeters do the former: they measure the peak voltage and apply a factor that assumes a pure sine wave. More expensive "true RMS" meters effectively do the latter.
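Here's a small sketch of the square-average-root procedure, applied to sampled waveforms (the 170 V peak is just an example value). For the sine the result matches the Vpeak/sqrt(2) shortcut; for a square wave the sine-only conversion used by a cheap meter would read low.

    import math

    def rms(samples):
        """Square each sample, average the squares, take the square root."""
        return math.sqrt(sum(v * v for v in samples) / len(samples))

    # Pure sine with a 170 V peak: RMS should come out near 170 / sqrt(2) ~= 120 V.
    N = 100000
    sine = [170.0 * math.sin(2 * math.pi * n / N) for n in range(N)]
    print(f"sine RMS:   {rms(sine):.1f} V")        # ~120.2 V

    # A square wave with the same peak: true RMS equals the peak, but a
    # peak-reading meter calibrated for sine waves would report ~120 V.
    square = [170.0 if n < N // 2 else -170.0 for n in range(N)]
    print(f"square RMS: {rms(square):.1f} V")      # 170.0 V
    print(f"sine-assuming estimate: {170.0 / math.sqrt(2):.1f} V")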