Just to put a nail in this one...
Excellent information. But I still can't resist landing one more blow on that last nail.
Statements 2a and 2b above are exactly true only for lossless coax. As the length of lossy coax increases, the impedance seen at the source end is increasingly influenced, and eventually dominated, by the coax losses. The impedance at the load end becomes less "visible" to the source.
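To put a formula behind that (this is the standard transmission-line input-impedance result, not anything specific to this thread), the impedance looking into a line of length $\ell$ with characteristic impedance $Z_0$, load $Z_L$, and propagation constant $\gamma = \alpha + j\beta$ is:

$$Z_{in} = Z_0 \,\frac{Z_L + Z_0 \tanh(\gamma \ell)}{Z_0 + Z_L \tanh(\gamma \ell)}$$

For lossless line ($\alpha = 0$) this is periodic in $\ell$, which is where the familiar half-wavelength repetition rules come from. With loss, $\tanh(\gamma \ell) \to 1$ as $\alpha \ell$ grows, so $Z_{in} \to Z_0$ no matter what $Z_L$ is; that's the precise sense in which the load stops being visible.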
If there's a mismatch at the antenna end of the feed line, there will be reflected power in the coax, which raises the RMS current (and voltage) at points along the line. If the coax is nearly ideal, then the statements made about matching the length are all you need to know. But real-world coax is never completely lossless, and at higher frequencies the losses can be significant. The figures the manufacturer publishes, like "4.5 dB/100m at 150MHz", are only accurate for a good match at the antenna feed point. If you have a mismatch at the antenna, the reflected power makes extra passes through the cable, and the higher circulating current drives up resistive loss, so total loss will be worse than the tables predict.
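If you want numbers, here's a minimal Python sketch of the classic matched-loss/SWR formula for total line loss. The cable figure is just the 4.5 dB/100 m example from above over an assumed 30 m run, and `total_loss_db` is my own hypothetical helper name, not anything from a library:

```python
import math

def total_loss_db(matched_loss_db: float, swr: float) -> float:
    """Total line loss (dB) for a line with the given matched loss
    and a given SWR at the load, using the standard formula
    loss = 10*log10((a**2 - g**2) / (a * (1 - g**2))),
    where a is the matched loss as a power ratio and g = |Gamma|."""
    a = 10 ** (matched_loss_db / 10)   # matched loss as a power ratio, >= 1
    g = (swr - 1) / (swr + 1)          # reflection coefficient magnitude at the load
    return 10 * math.log10((a**2 - g**2) / (a * (1 - g**2)))

# Example: 30 m of cable specced at 4.5 dB/100 m -> 1.35 dB matched loss.
matched = 4.5 * 30 / 100
for swr in (1.0, 2.0, 5.0, 10.0):
    print(f"SWR {swr:4.1f}: total loss {total_loss_db(matched, swr):.2f} dB")
```

Running that, the penalty is modest for this short, low-loss run (roughly 1.35 dB matched rising toward 4 dB at 10:1 SWR), but the same mismatch on a longer or lossier cable gets much uglier, which is exactly the point of the next paragraph.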
That's a roundabout way of saying that if you want to radiate as much signal as possible, you should match the antenna as well as is practical at its feed point. Perfection isn't required, but a poor match increases losses in the coax, and no tuner at the radio end of the coax can recover that lost power, regardless of the precise length of the coax.
Since SV1JRT is talking about a short length of low-loss cable connected to a well-matched antenna, this detail is unimportant for practical purposes in his particular situation. But in the general debate over whether coax length matters in theory, it may be something to consider.