https://www.microwaves101.com/encyclopedias/how-to-not-trash-a-calibration-kit
This is also all opinion. I understand the point about destroying expensive kits, and I understand the procedures for correctly attaching SMA connectors. However, I would like to see proof of what impact a slightly bent finger of the connector will actually have on the cal kit's performance, if any impact can be measured at all.
While not true proof, I remember some Agilent/Keysight cal engineers in one of Dave's videos talking about people using poor connector systems and believing their device was in spec, when it was in fact out of spec because the poor connector was ruining the measurement.
Regarding the "one slightly bent finger": considering you need a torque wrench just to get these connectors to spec, it is not hard to see how a bent finger would cause issues. True 'evidence' is hard to come by, since nobody wants to ruin their $10k cal kit, or the ports on their >>$100k VNA.
The question is by what amount it will be out of spec. By 0.1 dB at 5 GHz? Or by how much?
I have never seen proof of the torque wrench's effect on an actual measurement either. Do you know of any resources?
Please note I never said that you should use cheap connectors. I would just like to see some real figures that describe the problem quantitatively.
I can try and do some tests with torque wrenches next time I'm working on a VNA.
The difference also shows up mainly in \$S_{11}\$, not in \$S_{21}\$.
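To put a rough number on the "mainly \$S_{11}\$" point, here is a back-of-envelope sketch (my own first-order model, not data from any paper): treat the damaged connector as a small extra reflection that adds in phase or out of phase with the DUT's own reflection. The figures used, a DUT with 20 dB return loss and a connector fault at -40 dB, are assumed purely for illustration:

```python
import math

def s11_error_bounds_db(dut_rl_db, conn_refl_db):
    """Worst-case measured |S11| in dB when a small connector reflection
    adds in phase or out of phase with the DUT reflection (first-order)."""
    g_dut = 10 ** (-dut_rl_db / 20)    # DUT reflection magnitude
    g_con = 10 ** (-conn_refl_db / 20) # connector error reflection magnitude
    hi = 20 * math.log10(g_dut + g_con)        # reflections add in phase
    lo = 20 * math.log10(abs(g_dut - g_con))   # reflections cancel
    return lo, hi

# DUT with 20 dB return loss measured through a -40 dB connector fault:
lo, hi = s11_error_bounds_db(20, 40)
print(f"measured S11 between {lo:.2f} and {hi:.2f} dB (ideal: -20.00 dB)")

# First-order S21 mismatch ripple from the same fault is far smaller:
s21_ripple = 20 * math.log10(1 + 0.1 * 0.01)
print(f"S21 ripple roughly +/- {s21_ripple:.4f} dB")
```

With these assumed numbers the measured \$S_{11}\$ can swing by roughly a couple of dB around -20 dB, while the \$S_{21}\$ ripple stays in the thousandths of a dB, which is consistent with the point that reflection measurements suffer first.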
A quick Google search found this, but it talks about cables rather than connectors:
https://www.tek.com/blog/improving-vna-measurement-accuracy-quality-cables-and-adapters
This paper I found does not show the true impact of cheap connectors, but it does illustrate the impact of imperfections:
Yeou-Song Lee and T. Roberts, "Accuracy study on the newly introduced Anritsu W1-connector calibration and verification kit," 62nd ARFTG Microwave Measurements Conference, Fall 2003, pp. 109-118.
Of course, it does show that this is mostly an issue at mm-wave frequencies.