How do I calibrate my peak detector?
With something like a scope, spectrum analyzer, or power meter with enough bandwidth that roll-off is not an issue. For example, the 1973 HP 11096A HF probe (basically a germanium peak detector with a resistive divider and compensation for the diode's non-linearity) recommends a VHF signal generator and a power meter accurate to 1% up to 480 MHz. They set the signal generator, terminated into 50 ohms, to the desired power level with the power meter, verify that it's stable, and check that the DC voltage from the probe is consistent with the expected voltage for that power level into 50 ohms, within +/-1 dB or so. That +/-1 dB would probably not be good enough to determine a scope's frequency response.
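If it helps, here is roughly what that check looks like in numbers. This is just my own sketch of the power-to-voltage conversion and the +/-1 dB window, not HP's procedure:

```python
import math

R_LOAD = 50.0  # ohms, generator termination

def expected_peak_voltage(power_dbm: float) -> float:
    """Peak voltage of a sine delivering power_dbm into R_LOAD."""
    p_watts = 10 ** (power_dbm / 10) / 1000.0
    v_rms = math.sqrt(p_watts * R_LOAD)
    return v_rms * math.sqrt(2)

def within_1db(v_measured: float, v_expected: float) -> bool:
    """True if the measured DC is within +/-1 dB (voltage ratio) of expected."""
    error_db = 20 * math.log10(v_measured / v_expected)
    return abs(error_db) <= 1.0

# Example: +10 dBm into 50 ohms -> ~0.707 Vrms -> ~1.0 V peak
v_exp = expected_peak_voltage(10.0)
print(f"expected peak: {v_exp:.3f} V; 0.95 V reading OK? {within_1db(0.95, v_exp)}")
```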
What's the uncertainty compared to your measured signal level?
N/A.
If the difference between Vmax and Vpeak is less than the uncertainty, the results are meaningless.
The only data I have is the 1N4148 datasheet. The forward current vs. forward voltage graph (If vs. Vf) is the most sensible reference to me (it shows Vf of around 0.4 V to 1.2 V at the extremes of If), but that current is difficult for me to measure during the test (and designing the circuit to measure I without affecting the DUT is a problem). Looking at my results, the voltage drop ranges from 0.7 to 1 V, so I'm just assuming the current is somewhere around 4-100 mA (graph from the datasheet attached below).
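For a rough feel, the Shockley diode equation can turn that Vf range into an estimated current. The saturation current and emission coefficient below are assumed typical values for a 1N4148-class diode, not datasheet numbers, and the simple model ignores series resistance, so it reads high above roughly 0.85 V:

```python
import math

I_S = 4e-9      # assumed saturation current, A (not from the datasheet)
N = 1.9         # assumed emission coefficient
V_T = 0.02585   # thermal voltage at ~25 C, V

def forward_current(vf: float) -> float:
    """Estimated forward current for a given forward voltage drop (Shockley model)."""
    return I_S * (math.exp(vf / (N * V_T)) - 1)

# Above ~0.85 V the real diode is dominated by series resistance,
# so these numbers overestimate the current at the top of the range.
for vf in (0.7, 0.8, 0.9, 1.0):
    print(f"Vf = {vf:.1f} V -> If ~ {forward_current(vf) * 1000:.0f} mA")
```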
This will cause error at low Vf, but the uncertainty also includes, for example, the frequency dependence of your detector probe, variation in the output amplitude of your generator, noise in your acquired signal (e.g. from the generator, RFI, or the ADC in your scope/DMM), plus whatever I forgot.
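If you want a single number, one common way is to root-sum-square the independent contributions. The values below are placeholders just to show the arithmetic, not measured figures:

```python
import math

# Placeholder uncertainty contributions in dB (substitute your own estimates)
contributions_db = {
    "detector frequency response": 0.5,
    "generator amplitude flatness": 0.3,
    "scope/DMM noise and ADC":      0.2,
}

# Root-sum-square of independent error sources
total_db = math.sqrt(sum(u ** 2 for u in contributions_db.values()))
print(f"combined uncertainty ~ {total_db:.2f} dB (RSS of independent terms)")
```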
Yes, I changed it. I don't like it, but I have to, since the generator's output falls from ±10 V at low frequency down to ±2 V at 100 MHz. So in order to get a decent range on the voltage reading, I have to change the vertical scale (from 5 V/div to 1 V/div).
I would try changing the generator's amplitude while keeping the frequency constant, to get a feel for the linearity of your detector as a function of amplitude. Of course this may also be frequency dependent.
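Something like this, just as a sketch of the bookkeeping: sweep the amplitude at one fixed frequency and look at the deviation from a straight-line fit. The detector readings below are made up, put your own in:

```python
# Generator peak amplitude (V) and detector DC output (V) at one fixed frequency.
# These readings are placeholders, not measurements.
v_gen = [0.5, 1.0, 2.0, 4.0, 8.0]
v_det = [0.12, 0.55, 1.45, 3.35, 7.25]

# Simple least-squares line v_det = a * v_gen + b
n = len(v_gen)
sx, sy = sum(v_gen), sum(v_det)
sxx = sum(x * x for x in v_gen)
sxy = sum(x * y for x, y in zip(v_gen, v_det))
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - a * sx) / n

# Residuals show how non-linear the detector is over the amplitude range
for x, y in zip(v_gen, v_det):
    resid = y - (a * x + b)
    print(f"{x:4.1f} V in -> {y:5.2f} V out, deviation from fit {resid:+.3f} V")
```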
I was asking this because I'm concerned that the roll-off is due to the front-end circuit, which would in turn affect the DUT signal. So, does the DC signal get attenuated too?
Not sure what you mean. The input impedance will be reduced at high frequencies due to the capacitive loading, which will change the loading by something like 25% at 100 MHz with 10 pF of capacitive loading and a proper 50 ohm termination. This may change the behavior of your DUT.
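As a rough sketch of where a number in that ballpark can come from, assuming the load seen by the DUT is the 50 ohm termination shunted by roughly 10 pF (my own simplification, not an exact model of any particular front end):

```python
import cmath
import math

R = 50.0      # termination, ohms
C = 10e-12    # assumed capacitive loading, farads
f = 100e6     # frequency, Hz

w = 2 * math.pi * f
z_in = R / (1 + 1j * w * R * C)   # R in parallel with C

# How far the complex load has moved away from an ideal 50 ohm resistor
deviation = abs(R - z_in) / R
print(f"|Z_in| = {abs(z_in):.1f} ohm, phase = {math.degrees(cmath.phase(z_in)):.1f} deg")
print(f"complex deviation from 50 ohm: {deviation * 100:.0f} %")
```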
Let me give an example. Say we are feeding the DSO with two signals, on CH1 and CH2, and say the DSO rolls off to -3 dB @ 100 MHz. CH1 is a 100 MHz sine at ±2 V (the true signal from the generator) and CH2 is a steady 2 V DC. Given the DSO's roll-off characteristic (-3 dB @ 100 MHz), CH1 will be attenuated and displayed as a ±1 V sine wave, but what about CH2? Will the screen show a flat line at 1 V too? Or at 2 V (the true, unattenuated signal)?
CH2 is independent of CH1 in this regard, so it will still show 2 V DC. CH1 should show about ±1.4 V, by the way, if it were attenuated by -3 dB, since these are amplitude ratios, not power.
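The conversion is just 20·log10 of the voltage ratio; a quick sketch:

```python
def amplitude_after_db(v_peak: float, atten_db: float) -> float:
    """Peak amplitude after an attenuation given in dB (voltage ratio, 20*log10)."""
    return v_peak * 10 ** (-atten_db / 20)

print(amplitude_after_db(2.0, 3.0))   # -3 dB: ~1.42 V, not 1.0 V
print(amplitude_after_db(2.0, 1.0))   # -1 dB: ~1.78 V
```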
Since attenuation was not observed in the previous test, there are several possible explanations (a toy model of these cases is sketched after the list).
1) If the DSO is indeed rolling off at 100 MHz, then the 1N4148's performance is also rolling off at that frequency, hence the consistent voltage drop observed.
2) If the 1N4148 is not rolled off, neither is the DSO.
3) If the DSO is rolled off but the 1N4148 is not, the graph in the first post, i.e. "expected.jpg", should be obtained: Vmax and Vpeak converge (come close together), i.e. Vdrop approaches zero at high frequency, which is not what I observed.
4) If the 1N4148 is rolled off but the DSO is not, Vmax and Vpeak should diverge (move farther apart), i.e. Vdrop should increase.
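A toy model of these four cases, assuming the detector's roll-off can be treated as an amplitude attenuation ahead of a fixed diode drop (all numbers are illustrative, not measurements):

```python
V_IN = 2.0        # true generator peak, V
V_DIODE = 0.6     # assumed diode drop at this drive level, V
SCOPE_DB = 3.0    # assumed scope roll-off at the test frequency, dB
DET_DB = 3.0      # assumed detector roll-off at the test frequency, dB

def atten(v, db):
    """Amplitude after a roll-off given in dB (voltage ratio)."""
    return v * 10 ** (-db / 20)

# Vmax = displayed sine peak (affected by scope roll-off),
# Vpeak = displayed detector DC (DC is not attenuated by the scope)
cases = {
    "1) scope and diode both roll off": (atten(V_IN, SCOPE_DB), atten(V_IN, DET_DB) - V_DIODE),
    "2) neither rolls off":             (V_IN,                  V_IN - V_DIODE),
    "3) scope only":                    (atten(V_IN, SCOPE_DB), V_IN - V_DIODE),
    "4) diode/detector only":           (V_IN,                  atten(V_IN, DET_DB) - V_DIODE),
}
for name, (vmax, vpeak) in cases.items():
    print(f"{name}: Vmax={vmax:.2f} V, Vpeak={vpeak:.2f} V, Vdrop={vmax - vpeak:.2f} V")
```

Cases 1 and 2 give the same constant Vdrop, case 3 makes Vmax and Vpeak converge, and case 4 makes them diverge, matching the qualitative descriptions above.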
It's hard to tell, in my opinion. You're basically dealing with a generator with unknown output amplitude as a function of frequency, a detector with an unknown frequency response, and a scope with an unknown frequency response. As with any calibration, you need some sort of known quantity for reference. The scope may not roll off by -3 dB at 100 MHz; it will likely have some extra bandwidth to compensate for manufacturing tolerances, changes with attenuator settings, and changes with temperature. For example, on the TDS-220 I measured about -1 to -1.5 dB at 100 MHz at room temperature, depending on the vertical settings.