The same applies if you work in a university and do research you wish to publish: you have to know your traceable error budget.
But this thread isn't about shipping electrical products or ISO. The OP hasn't specified any requirements. I interpret this as a question as to whether it's possible to maintain reasonable calibration of test gear over longer intervals than one year by doing basic checks in a lab.
Going off topic a bit here (and maybe I'm being pedantic but...) you don't 'have' to use traceable test gear to do valid research work in RF. Accuracy and innovation aren't tied together.
Yes indeed, but at some point, you always need an accurate measurement to show that your work is valid.

That's exactly what I meant.


The N432A operates on the basis of a DC self-balancing Wheatstone bridge
which shall be elaborated on later. Hence, the N432A is the only RF power
meter with a DC substitution measurement, enabling it to convert RF power to a
DC measurement such as voltage (V) and resistance (Ω). This ability is why the
meter is widely used in metrology and calibration labs worldwide to calibrate
power meters and sensors, and for any application which requires accurate RF
power measurement.
Mine has been amazingly consistent for over 20 years.
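In case the "DC substitution" idea isn't obvious, here is a minimal sketch of the bookkeeping the self-balancing bridge does. The numbers are made up purely for illustration.

[code]
# DC substitution in a self-balancing thermistor bridge (illustrative numbers only).
# The bridge holds the thermistor at a fixed resistance by feeding it DC power.
# When RF is applied it heats the thermistor too, so the bridge backs off its
# DC power by exactly the amount of RF power the thermistor absorbs.

p_dc_no_rf   = 30.0e-3   # DC power (W) needed to keep the bridge balanced with no RF applied
p_dc_with_rf = 29.0e-3   # DC power (W) needed once the RF signal is connected

p_rf = p_dc_no_rf - p_dc_with_rf   # substituted power = RF power absorbed
print(f"RF power = {p_rf * 1e3:.3f} mW")   # 1.000 mW, i.e. 0 dBm
[/code]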
Quote: "Yes indeed, but at some point, you always need an accurate measurement to show that your work is valid."

I don't agree.
Some things don't even need test gear or physical tests. Go back to the 1960s and look at a few classic patents. Some things just needed basic/classic theory and nothing else, e.g. the classic Sontheimer range of directional couplers. Even a student can explain how these work with pen and paper, based on theory dating back to the days of Faraday, with no need to build anything, let alone any need for traceable calibration of test gear.
The important (elegant) thing is that the RF power fed into the sensor is the same as the DC power you input to balance the sensor. So you can effectively use a DMM on the DC range to measure RF power very accurately, instead of the HP431 or HP432 meter's internal metering. So there is no need to have any 50MHz or 100MHz 0dBm reference built into the meter.
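Written in terms of the two bridge voltages you would actually read on the DMM, the same substitution looks roughly like this. I'm assuming the commonly quoted 432A-style relationship P_rf = (Vcomp² − Vrf²)/(4·R0) and a 200 ohm operating resistance for a 478A mount; check the meter manual for the exact rear terminals and any lead-resistance correction.

[code]
import math

# Hedged sketch: external DC measurement of RF power with a 432A-style thermistor bridge.
# Assumed relationship (check the manual): P_rf = (Vcomp^2 - Vrf^2) / (4 * R0)
# Vcomp = bridge voltage with no RF applied, Vrf = bridge voltage with RF applied,
# R0    = thermistor mount operating resistance (200 ohm assumed for a 478A).

R0     = 200.0   # ohms (assumed mount resistance)
v_comp = 4.900   # volts on the DMM with no RF (illustrative value)
v_rf   = 4.818   # volts on the DMM with RF applied (illustrative value)

p_rf_w   = (v_comp**2 - v_rf**2) / (4 * R0)
p_rf_dbm = 10 * math.log10(p_rf_w * 1e3)
print(f"P_rf = {p_rf_w * 1e3:.3f} mW  ({p_rf_dbm:+.2f} dBm)")
[/code]

The nice part is that the accuracy then rests on a DC voltage measurement and a resistance value, both of which are easy to verify against a good DMM.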



How would you ensure that the reading from a second hand sensor is reasonably accurate? Do you have to compare/adjust it with a "traceable" standard?
Because these sensors are so old and so easy to damage, you may find it difficult to get a working one. From what I have heard, they (usually) either work or they are obviously toasted. I would expect to see lots of dead ones on eBay. You also have to get a compatible power meter to go with it, and I would recommend the classic HP432A from the 1970s. Don't worry about the old-school analogue dial it has. You can use a DMM on the recorder output at the back, and this is going to be good enough for most users. I don't like the 432B with its dated and basic digital display and would always go for the 432A.
I did a quick return loss measurement of my 478A sensor using my VNA up to 1GHz. The plot below does say C? for cal, but this is because I turned down the sweep speed after I calibrated, and I experimented with different power drive levels after the calibration to see if it affected things. The RL plot didn't alter either way. I'm too lazy to recal the VNA just for the purpose of the plot.
Note that this VNA actually has a year left on a 2-year Keysight calibration, so this is one of only a few items of test gear I have with current calibration stickers. I didn't pay for the cal; it was an existing cal on the VNA and the Ecal kit.
But check out the return loss of this fabulous old power sensor...
Even at 50MHz this one is at 29dB return loss, which is easily OK for my needs in terms of mismatch uncertainty when used with a decent 50MHz source with good VSWR. The HP478A and the old HP432A power meter from the 1970s are among the most impressive pieces of RF engineering I've seen because of the simplicity and the accuracy they offer.
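To put a rough number on why 29dB return loss is comfortably good enough: the worst-case mismatch error depends on the product of the source and sensor reflection coefficients. A quick sketch follows; the 30dB source return loss is just an assumed figure for a decent 50MHz source.

[code]
import math

# Hedged sketch: worst-case mismatch uncertainty from return loss figures.
# |Gamma| = 10^(-RL/20); relative to the matched case, the delivered power can
# vary between (1 - |Gs||Gl|)^2 and (1 + |Gs||Gl|)^2.

rl_sensor_db = 29.0   # measured sensor return loss at 50 MHz (from the VNA plot)
rl_source_db = 30.0   # ASSUMED return loss of a decent 50 MHz reference source

g_sensor = 10 ** (-rl_sensor_db / 20)
g_source = 10 ** (-rl_source_db / 20)

err_hi_db = 20 * math.log10(1 + g_source * g_sensor)
err_lo_db = 20 * math.log10(1 - g_source * g_sensor)
print(f"Mismatch error bounds: {err_lo_db:+.4f} dB to {err_hi_db:+.4f} dB")
[/code]

With those assumed figures the bounds come out at roughly ±0.01 dB, which is negligible for this sort of work.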
However, for some people on here, calibration is like a religion (make your cheques payable to 'Calibration Heaven') and they chase down every $$$ avenue that gets more and more accuracy in terms of voltage and frequency etc. It's almost as if calibration 'is' the hobby.

But if you deny the need for calibration even for those who do need it (for whatever reason, including the OP), then indeed all calibration becomes worthless and a waste. That's certainly not an engineering/scientific approach.