I have a question about TRL calibration. I know that TRL requires the line standard's characteristic impedance to be very close to Z0 (50 ohms in most systems), and that the insertion phase of the line, relative to the thru, should be between 20 and 160 degrees, with 90 degrees being the optimum for the best calibration. I'm assuming the thru standard is a zero-length direct connection of the two ports.
My question is this: if I use this line to calibrate across a broad frequency range, will I get a usable-quality calibration wherever the phase delay is n*90 deg +/- 70 deg for odd n (1, 3, 5, ...)? Most TRL examples only cover the n=1 case. What about n=3? If the n=1 band calibrates well, will the "upper band" calibration quality also be acceptable?
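As a quick sanity check on the band arithmetic, here's a short Python sketch that flags the frequencies where a line's insertion phase falls inside the 20-160 deg (mod 180 deg) window. The 25 mm excess length and effective permittivity of 3.0 are made-up numbers, not values from my setup:

```python
import numpy as np

# Sanity check on the usable-band arithmetic. The 25 mm excess line
# length and eps_eff = 3.0 are invented numbers for illustration.
c = 299_792_458.0            # m/s
delta_len = 0.025            # m, line length in excess of the thru
eps_eff = 3.0                # effective relative permittivity

f = np.linspace(0.1e9, 20e9, 2001)                         # Hz
phase_deg = 360.0 * f * delta_len * np.sqrt(eps_eff) / c   # insertion phase

# TRL rule of thumb: usable wherever (phase mod 180) is in [20, 160] deg,
# which is exactly n*90 +/- 70 deg for odd n.
usable = (phase_deg % 180.0 >= 20.0) & (phase_deg % 180.0 <= 160.0)

# Print the contiguous usable bands.
in_band = False
for fi, ok in zip(f, usable):
    if ok and not in_band:
        start, in_band = fi, True
    elif not ok and in_band:
        print(f"usable: {start / 1e9:.2f} to {fi / 1e9:.2f} GHz")
        in_band = False
if in_band:
    print(f"usable: {start / 1e9:.2f} to {f[-1] / 1e9:.2f} GHz")
```

For this hypothetical line the n=1 band comes out around 0.4-3.1 GHz and the n=3 band around 3.8-6.5 GHz, with a dead zone near the 180 deg point between them.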
Why am I interested in this? I want to use a TRL-calibrated VNA to measure an existing set of SOLT calibration standards and determine their calibration coefficients. Extracting these coefficients basically amounts to polynomial fits of order 3 or 4 to the S-parameters of the open, short, and load standards. I think I can do the required fits across the n=1 (20-160 deg) and n=3 (270 deg +/- 70 deg) bands by simply excluding the "bad" frequency regions from the fits; my thinking is that the excluded frequencies aren't really needed to determine the polynomial coefficients.
I will likely also weight the fits to place more emphasis on frequencies very near the 90 deg and 270 deg phase shifts, tapering the weights off as the phase deviates from the optimum.
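As a sketch of that masked, weighted fit, something like NumPy's weighted `polyfit` would do it. Everything numeric here is invented: the same hypothetical 25 mm / eps_eff = 3.0 line, and a synthetic cubic trace standing in for one TRL-measured standard coefficient:

```python
import numpy as np

rng = np.random.default_rng(0)

# Same hypothetical line as the band check: 25 mm excess, eps_eff = 3.0.
f = np.linspace(0.5e9, 7e9, 301)                              # Hz
phase_deg = 360.0 * f * 0.025 * np.sqrt(3.0) / 299_792_458.0

# Mask off the "bad" regions where the line's phase is near 0/180 deg.
good = (phase_deg % 180.0 >= 20.0) & (phase_deg % 180.0 <= 160.0)

# Taper: weight 1 at the 90/270 deg optima, falling to 0 at the band edges.
dist = np.abs(phase_deg % 180.0 - 90.0)        # deg from nearest optimum
w = np.cos(0.5 * np.pi * np.minimum(dist, 70.0) / 70.0)

# Synthetic stand-in for one measured coefficient trace: a cubic trend
# plus noise (pure invention, just to exercise the fitting machinery).
fGHz = f / 1e9
truth = 0.98 - 0.004 * fGHz + 0.0008 * fGHz**2 - 0.00005 * fGHz**3
meas = truth + 0.002 * rng.standard_normal(f.size)

# Order-3 weighted fit using only the good frequencies.
coef = np.polyfit(fGHz[good], meas[good], 3, w=w[good])
print("fit coefficients (highest order first):", coef)
```

The same mask-plus-weight call would then be applied per coefficient (real and imaginary parts fitted separately for complex S-parameter data).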
Do you think this will work to give me a high-quality set of calibration coefficients?