I'm setting up a modest RF lab at home and purchased a 437B meter with a freshly calibrated 8482A sensor. The vendor (who also performs the calibration) provides a current cal factor table for the sensor. The printed reference cal factor is 99% (suggesting the cal factor at 50 MHz is 99%). The printed cal factors for 30 MHz and 100 MHz are 98.9% and 98.7%, respectively, which interpolate linearly to about 98.8% at 50 MHz. Shouldn't the reference cal factor be 98.8% in this case? Or am I misunderstanding something? I use (much newer) gear like this at work, but those instruments are calibrated by the metrology guys and the data is stored in memory in the sensor heads themselves, so I rarely have to think about cal factors.
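In case my arithmetic is the problem, here's the interpolation I did. It's plain linear interpolation between the two printed table points; the helper name is just mine:

```python
# Linear interpolation between the two printed cal factor points.
def lerp(f, f1, cf1, f2, cf2):
    """Cal factor (%) at frequency f (MHz), interpolated linearly."""
    return cf1 + (f - f1) * (cf2 - cf1) / (f2 - f1)

cf_50 = lerp(50, 30, 98.9, 100, 98.7)
print(f"Interpolated cal factor at 50 MHz: {cf_50:.2f}%")  # -> 98.84%
```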
Yes, it's a small difference and the reading would be in spec either way (off by about 0.01 dB, or 0.002 mW at the 1 mW reference level); I'm mostly just trying to understand the why of it.
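For reference, that's where my 0.01 dB / 0.002 mW figure comes from, taking the 437B's 1 mW, 50 MHz power reference as the level:

```python
import math

ref_cf = 99.0     # printed reference cal factor (%)
interp_cf = 98.8  # interpolated cal factor at 50 MHz (%)

# Error as a power ratio in dB, and in absolute terms
# at the 437B's 1 mW / 50 MHz power reference level.
error_db = 10 * math.log10(ref_cf / interp_cf)
error_mw = 1.0 * (ref_cf - interp_cf) / 100
print(f"{error_db:.3f} dB, {error_mw:.4f} mW")  # -> 0.009 dB, 0.0020 mW
```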