The advice from bdunham7 is good. The units "dBm" are a power level referenced to 1 mW. To know what voltage is present for a given power level, the load resistance must be specified. The signal generator assumes a 50 ohm load, and most RF test equipment (generators, analyzers) does as well. The chart in the link bdunham7 posted also assumes a 50 ohm load. If the load impedance is large (like the 1 Mohm input of a scope), the voltage will be twice what is shown on the chart.
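Just to make the dBm-to-voltage arithmetic concrete, here is a quick Python sketch (my own helper, not anything from the generator's documentation) that converts a dBm level into RMS and peak voltage across a given load:

```python
import math

def dbm_to_volts(dbm, load_ohms=50.0):
    """Convert a dBm power level to RMS and peak voltage across load_ohms."""
    p_watts = 1e-3 * 10 ** (dbm / 10.0)       # dBm is referenced to 1 mW
    v_rms = math.sqrt(p_watts * load_ohms)    # P = V^2 / R  ->  V = sqrt(P * R)
    v_peak = v_rms * math.sqrt(2)             # sine peak = RMS * sqrt(2)
    return v_rms, v_peak

# 0 dBm into 50 ohms: about 224 mV RMS, 316 mV peak
print(dbm_to_volts(0))
```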
To have the signal generator display in units that are more familiar, try setting the amplitude by pressing: AMPLITUDE 500 mV (I haven't used this model, but I'm guessing it will work). Instead of dBm, hopefully it will show mV (or some voltage scale). I just tried this on my HP signal generator (a different model) and the amplitude shows "499.441 mV". That is the RMS voltage at the output, assuming a 50 ohm load. The peak output is sqrt(2) times the RMS output, so 500 mV RMS gives a 707 mV peak. If the load is not 50 ohms but a high impedance, the peak output doubles to about 1.414 V, a nice hefty signal to see on a scope. Hope this isn't too confusing.
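The same arithmetic as a couple of lines of Python, in case it helps (the 500 mV setting is just the example value from above):

```python
import math

v_rms_50ohm = 0.5                          # generator set to 500 mV RMS into 50 ohms
v_peak_50ohm = v_rms_50ohm * math.sqrt(2)  # ~707 mV peak across a 50 ohm load
v_peak_open  = 2 * v_peak_50ohm            # ~1.414 V peak into a high-impedance scope input
print(v_peak_50ohm, v_peak_open)
```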