I intend to measure the noise of various DC calibrators. Generally they are specified in two bands: 0.1 to 10 Hz and 10 Hz to 10 kHz, but almost all manufacturers (except Fluke, before the 5730) explain nothing about how to verify those specs. Other manufacturers (Datron, Wavetek...) speak of averaging over 1 or 10 line periods and of exotic bands (DC to 2 Hz null detector...), again without any verification process. I would like to take a systematic approach to the problem, and I have several questions.
1. First question, for the lowest band (0.1 to 10 Hz): the Fluke 5700 manual, like the 5440B one, uses an 8520A to measure the standard deviation of several measurements, but I don't understand the relation between the sample rate (they use the maximum, 20 per second) and the filter set to 1000 ms (I try to show how I picture the sampling in the first sketch below). The user manual is very unclear, speaking of analog and digital filtering enabling optimum rejection of the mains period.
How can I get the same measurement with a 3458A? Standard deviation, no problem; readings per second, no problem; but what about the filter and NPLC? And, last but not least, how do I separate the noise of the 8520A/3458A from the noise of the calibrator (second sketch below)? All serious DC measurements with the 3458A use 100 NPLC...
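For what it's worth, here is how I currently picture the 20 readings per second in the Fluke procedure. The 10 s record length and the 1 uV noise figure are my own assumptions for illustration, not numbers taken from the manual:

```python
# Toy illustration only: a record sampled at 20 readings/s for 10 s can only
# "see" noise between 1/T = 0.1 Hz and fs/2 = 10 Hz, so the standard deviation
# of that record estimates the rms noise in exactly the specified band.
import numpy as np

fs = 20.0               # readings per second (max rate of the 8520A)
T = 10.0                # record length in seconds (my assumption)
n = int(fs * T)         # 200 samples

print(f"band covered by the record: {1/T:.1f} Hz to {fs/2:.1f} Hz")

# Stand-in signal: 10 V DC with 1 uV rms of white noise.
rng = np.random.default_rng(0)
record = 10.0 + 1e-6 * rng.standard_normal(n)
print(f"standard deviation of the record: {np.std(record, ddof=1)*1e6:.2f} uV rms")
```

If that picture is right, the 1000 ms filter would mainly be there to keep energy above 10 Hz (and the mains) from aliasing into the record, but I may be completely off.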
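And here is roughly what I had in mind for the 3458A side, as a rough, untested sketch over GPIB with pyvisa. The command names are from memory of the 3458A manual, and the GPIB address (22), the 10 V range, NPLC 1 and the 200-reading record are my assumptions, to be checked. The idea for separating the meter from the calibrator is simply to take the same record first with the input shorted and then with the calibrator connected, and, if the two noise sources are uncorrelated, subtract in quadrature:

```python
# Rough, untested sketch: collect a record of DCV readings from a 3458A,
# take the standard deviation, and compare shorted-input vs calibrator.
import statistics
import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::22::INSTR")   # GPIB address is an example
dmm.timeout = 10000  # ms

for cmd in ("PRESET NORM",   # should leave TRIG SYN, so each bus read
                             # triggers one fresh conversion (to be verified)
            "OFORMAT ASCII",
            "DCV 10",        # 10 V range for a 10 V calibrator output
            "NPLC 1",        # one full line cycle per conversion
            "AZERO ON"):     # autozero on; rate should land near ~20 rdgs/s
    dmm.write(cmd)

def record_sdev(n=200):
    """Take n readings (about 10 s at ~20 rdgs/s) and return their std dev."""
    readings = [float(dmm.read()) for _ in range(n)]
    return statistics.stdev(readings)

input("Short the 3458A input (copper short), then press Enter...")
floor = record_sdev()          # noise of the 3458A alone

input("Connect the calibrator at 10 V, then press Enter...")
total = record_sdev()          # 3458A + calibrator together

# If the two sources are uncorrelated they add in quadrature:
calibrator = max(total**2 - floor**2, 0.0) ** 0.5
print(f"3458A floor : {floor*1e6:8.3f} uV rms")
print(f"combined    : {total*1e6:8.3f} uV rms")
print(f"calibrator  : {calibrator*1e6:8.3f} uV rms (quadrature difference)")
```

Does the quadrature subtraction make sense here, or is the 3458A floor at low NPLC already too high for it to work?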