Author Topic: Calibrated RF power sensor reference cal factor question


Offline route20 (Topic starter)

Calibrated RF power sensor reference cal factor question
« on: April 06, 2023, 07:23:17 pm »
I'm setting up a modest RF lab at home and purchased a 437B meter with a freshly calibrated 8482A sensor. The vendor (who also performs the calibration) provides a current cal factor table for the sensor. The printed reference cal factor is 99% (suggesting the cal factor at 50 MHz is 99%). The printed cal factors for 30 MHz and 100 MHz are 98.9% and 98.7%, respectively. These interpolate to 98.8% at 50 MHz. Shouldn't the reference cal factor be 98.8% in this case? Or am I misunderstanding something? I use (way newer) gear like this at work, but those units are calibrated by the metrology guys and the data is stored in memory in the heads themselves, so I rarely have to think about the cal factors.

Yes, it's a small amount and the reading would be in spec either way (off by 0.01dB/0.002mW), I'm mostly just trying to understand the why of it.
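For anyone following along, the interpolation and the size of the discrepancy work out like this (a quick sketch, just redoing the arithmetic from the post; the function name is my own):

```python
import math

def interp_cf(f, f1, cf1, f2, cf2):
    """Linearly interpolate a cal factor (%) between two frequencies (MHz)."""
    return cf1 + (f - f1) * (cf2 - cf1) / (f2 - f1)

# Printed table values: 98.9% at 30 MHz, 98.7% at 100 MHz
cf_50 = interp_cf(50, 30, 98.9, 100, 98.7)

# dB difference if the meter uses the printed 99% instead
delta_db = 10 * math.log10(99.0 / cf_50)

print(round(cf_50, 2))     # 98.84
print(round(delta_db, 3))  # 0.007
```

So strictly linear interpolation lands near 98.84%, and the difference from 99% is on the order of 0.01 dB, consistent with the numbers above.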
 

Offline alm

Re: Calibrated RF power sensor reference cal factor question
« Reply #1 on: April 06, 2023, 08:12:05 pm »
I've seen this before, where the cal factor for 50 MHz is not between the 30 MHz and 100 MHz values, or on any kind of smooth line through the other points. I guess it could be uncertainty in the calibration (the measurement uncertainty for the 8482A is in the 1.2% ballpark), or maybe the calibration at 50 MHz is done differently from the other frequencies? Either way, I'd use the reference cal factor (99%) to adjust the sensor against the 50 MHz reference.
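To make the "use the reference cal factor" advice concrete, here's a minimal sketch of what the correction does (my own illustrative function, not the 437B firmware): the meter divides the raw sensor reading by CF/100, so a sensor that converts 99% of incident power at the reference frequency still reads the true level.

```python
def apply_cal_factor(raw_mw, cf_percent):
    """Correct a raw sensor reading (mW) using a cal factor given in percent."""
    return raw_mw / (cf_percent / 100.0)

# Zeroed against the 1.00 mW, 50 MHz reference with the printed 99% CF:
# the sensor delivers 0.99 mW equivalent, and the correction restores 1.00 mW.
print(apply_cal_factor(0.99, 99.0))  # 1.0
```

With numbers this close (99% vs. ~98.8%), the correction differs by only a few thousandths of a dB either way, which is well inside the sensor's stated uncertainty.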

