Well, it was designed for a certain price. I am not sure if the hardware has enough resources to support it (code space, processing power...).
It's not really a "horsepower" limitation, since the calibration terms only need to be computed once, at cal time.
You're solving for a matrix that you then multiply by your measured data to give the actual result.
There's definitely more code involved in allowing definable cal sets, but it's all math that you only calculate once at cal.
( I wouldn't expect code space to be a problem these days.)
Once it's done, you're doing the same number of calculations every sweep regardless of the cal type.
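For anyone curious what "math that you only calculate once" looks like, here's a minimal sketch of the standard one-port (SOL) three-term error model, assuming ideal standards; the function names are illustrative and this isn't any particular firmware's actual code:

```python
# One-port (SOL) error-term solving and correction, per frequency point,
# assuming ideal standards (short = -1, open = +1, load = 0).
import numpy as np

def solve_error_terms(gamma_std, meas):
    """gamma_std, meas: length-3 complex arrays of the known and measured
    reflection coefficients of the three standards at one frequency."""
    # Model: M = e00 + (e10e01 * G) / (1 - e11 * G)
    # Rearranged into a linear system in [e00, (e10e01 - e00*e11), e11]:
    #   e00 + G*(e10e01 - e00*e11) + G*M*e11 = M
    A = np.column_stack([np.ones(3), gamma_std, gamma_std * meas])
    e00, d, e11 = np.linalg.solve(A, meas)
    e10e01 = d + e00 * e11
    return e00, e11, e10e01

def correct(meas, e00, e11, e10e01):
    """Apply the error terms to a raw measurement -- the only math that
    has to run on every sweep after cal."""
    return (meas - e00) / (e10e01 + e11 * (meas - e00))
```

The 3x3 solve happens once per frequency point at cal time; after that, every sweep just runs the one-line correction, whatever standards you calibrated with.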
Bigger point? Normally I would cal these out, since I usually have cables attached and measure at the other end.
Yeah, but often the other end of your cable has connectors of a different gender, or a different type entirely, than the supplied cal standards.
Like when measuring at the end of a several-foot SMA to N cable. Best case, you're using a decent adapter that only adds half an inch of phase shift and has a 30 dB RL.
Quite likely you're doing worse.
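Quick back-of-envelope on what even that "best case" adapter does, assuming its reflection simply adds to or subtracts from the DUT's at the worst-case phase (numbers are illustrative, not from any datasheet):

```python
# Worst-case error bound from an uncharacterized 30 dB RL adapter
# sitting past the cal plane.
import numpy as np

gamma_adapter = 10 ** (-30.0 / 20)    # ~0.032 linear reflection
gamma_dut = 10 ** (-20.0 / 20)        # a DUT with 20 dB return loss

worst = 20 * np.log10(gamma_dut + gamma_adapter)   # ~ -17.6 dB
best  = 20 * np.log10(gamma_dut - gamma_adapter)   # ~ -23.3 dB
print(f"20 dB RL DUT could read anywhere from {worst:.1f} to {best:.1f} dB of S11")
```

So even a good adapter can put roughly ±3 dB of uncertainty on a 20 dB return-loss reading, and it caps what you can resolve at around its own 30 dB.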
VNAs are about the best example of "garbage in, garbage out" for test equipment.
I don't see anything else better for the price, but I'm leaning towards LibreVNA if I ever want to upgrade from my NanoVNA.
Its functionality seems much closer to that of a normal VNA.
The NanoVNA reminds me more of something like an Anritsu Sitemaster than a full VNA.
I would think the best way to handle cal for super-low-cost VNAs would be to allow calibration against known S-parameter data for the standards.
The manufacturer can take 1 minute and measure the standards on a traceable VNA and then provide those files with the standards. No need to try and make them perfect SOLT. Just use decent repeatable connectors and stable resistors.
Many of us will have access to a lab-grade VNA and it would be easy to measure a handful of parts as home standards.
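To make the idea concrete, here's a sketch of what calibrating against factory-characterized (data-based) standards could look like, reusing solve_error_terms() from the snippet earlier in the thread. The .s1p filenames are hypothetical, scikit-rf is just one convenient Touchstone reader, and the standards' frequency grid is assumed to match the instrument's (otherwise you'd interpolate):

```python
# Calibration using measured standard data instead of ideal SOLT values.
import numpy as np
import skrf as rf

# Characterized reflection coefficients of the three standards, as shipped
# by the manufacturer (measured once on a traceable VNA). Filenames are
# made up for the example.
std_files = ["short_sn1234.s1p", "open_sn1234.s1p", "load_sn1234.s1p"]
std_gamma = np.array([rf.Network(f).s[:, 0, 0] for f in std_files])  # (3, n_freq)

def calibrate(meas_gamma, std_gamma):
    """meas_gamma: raw reflection measured by the cheap VNA against each
    standard, shape (3, n_freq). Returns (e00, e11, e10e01) per point."""
    return [solve_error_terms(std_gamma[:, k], meas_gamma[:, k])
            for k in range(std_gamma.shape[1])]
```

The math is identical to ideal SOLT; the only change is that the assumed reflection coefficient of each standard comes from the supplied files instead of being -1, +1, and 0.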