mrflibble, you and I are on similar thought paths. I'm tinkering with VB to read the traces directly from the SA and generate correction tables. That would be nifty for initially creating the tables, but it's a big bother to need the PC to reload the values into the SA every time I want to use them -- especially if I take it out in the field as I'm likely to do.
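Once the SCPI plumbing is in place, the PC side is mostly string parsing and a point-by-point subtraction. Here's a rough Python sketch of that half (I'm actually prototyping in VB, so treat the function names, the comma-separated ASCII readback format, and the "correction = reference minus measured" convention as my assumptions, not the instrument's documented behavior):

```python
def parse_trace(response):
    """Parse a trace readback string into a list of floats.

    Assumes the SA returns comma-separated ASCII amplitudes,
    e.g. "-50.10,-50.32,-49.87" (an assumption on my part).
    """
    return [float(v) for v in response.strip().split(",") if v]

def correction_table(freqs, reference_db, measured_db):
    """Build (frequency, correction) pairs.

    Correction is taken as reference minus measured, so adding the
    correction to a measured value recovers the reference level.
    """
    return [(f, r - m) for f, r, m in zip(freqs, reference_db, measured_db)]
```

From there the table just needs to be thinned to whatever the SA will actually accept and written out in its correction-file format.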
The only limitations I know of are: 1) no more than 200 correction factors, and 2) the DSA815 cannot recall a correction file that contains more than ~50 values (the actual upper limit is somewhere between 50 and 80).
#1 is perfectly acceptable to me. #2 is not.
As far as sneaky variations due to the TG, that seems like a very real possibility, but I'm not sure how I could detect & compensate for that or whether they'd significantly affect the corrections. Of course, one obvious way to deal with it is to normalize through the probe, but the correction table to me seems like a much more convenient means of compensation. I'd like to have saved corrections for probes, directional couplers, etc.
One refinement I want to make to the algorithm for selecting the most meaningful corrections is this: for each point, compute the slope and intercept of the line drawn directly between its two neighbors, then calculate that point's perpendicular distance to the line, and sort the points in descending order by that distance. The top N values are the ones retained for the table.
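In Python terms, the selection step would look something like the sketch below (names are placeholders, and keeping the two endpoints is my own addition so the table still spans the full frequency range):

```python
import math

def select_top_corrections(points, n):
    """Keep the n points that deviate most from a straight line
    drawn between their immediate neighbors.

    points: list of (freq, correction) tuples, sorted by freq.
    """
    scored = []
    for i in range(1, len(points) - 1):
        (x1, y1), (x0, y0), (x2, y2) = points[i - 1], points[i], points[i + 1]
        # Perpendicular distance from (x0, y0) to the line through
        # the two neighboring points.
        num = abs((y2 - y1) * x0 - (x2 - x1) * y0 + x2 * y1 - y2 * x1)
        den = math.hypot(y2 - y1, x2 - x1)
        scored.append((num / den if den else 0.0, points[i]))
    # Largest deviation first; keep the top n interior points.
    scored.sort(key=lambda t: t[0], reverse=True)
    keep = [p for _, p in scored[:n]]
    # Always keep the endpoints so the table covers the whole span.
    keep += [points[0], points[-1]]
    return sorted(keep)
```

A limitation worth noting: this scores each point against its immediate neighbors in a single pass, so two adjacent points on the same sharp feature can shadow each other; an iterative remove-and-rescore variant would avoid that at the cost of more computation.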
Wouldn't it be cool if we could write software that could run directly in the SA?
It's not gonna happen in this class of machine though.