I am reminded of looking at SD video and wondering WTF this 4K rubbish is about
No, that's completely different to what I was talking about. I was countering your argument that "we should use 4K for everything because it might be the standard in the future".
Sure, anything unique or important you should shoot and store in the highest resolution and best quality you are able to; I do so myself. That does not mean you should do everything that way, though, because most content simply is not worth it. And before you upgrade your camera to a higher resolution, you should ensure that you have everything else well in hand, and that it is truly the camera that is the quality bottleneck, not something else.
Right now, because I have a 1920x1080 display, I prefer 1080p content. That may change in the future; it definitely will if I get a display larger than 22" -- although, as I mentioned before, color reproduction and the like matter more to me, so I tend to look at panel quality before resolution. Sometimes, when I watch videos, I drop the resolution because I want to reduce the bandwidth used, usually because I'm doing something else (say, downloading large software packages or whatever) and not really watching the video. It annoys the heck out of me that I cannot keep the audio quality high while dropping the video quality. I especially like technical stuff where I can switch to looking at a very high resolution photo of the thing at hand (outside Youtube, of course), while still listening. (I do not like looking at talking heads at all, btw; they annoy me.)
I would be surprised if that kind of quality switching were not common among other viewers with connections of at most 20 Mbit/s.
In practice, my stance means that one should consider things like lighting, high-resolution photos of the details (on a forum or in a separate article associated with the video), audio quality, and so on, well before worrying about whether to use 1080p or 4K right now. If you have the hardware, sure; use it. But first ensure that you have the lighting, mics, a suitable stage (non-echoey, with a suitable background), still photography equipment and so on at hand, or you're putting the cart before the horse. Priorities.
Even then, the bit rate (or more precisely, the compression quality, since compression efficiency varies between formats) is more important than the resolution. For example, if you have poor lighting, so your camera sensor is struggling to capture the necessary dynamic range, you'll have a noisy image no matter the bit rate. That poses a huge problem for compression, especially DCT-based schemes (which is all we use right now), because the compression algorithms cannot really distinguish noise from genuine detail. So, with a 4K camera and poor lighting you can get worse visual results than with a 1080p camera and good lighting.
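To make the noise point concrete, here is a rough sketch (my own toy numbers, nothing from an actual encoder) of how sensor noise spreads DCT coefficient energy into high frequencies, so the encoder either spends bits on noise or throws real detail away along with it:

```python
# Toy comparison: DCT coefficients of a clean vs. a noisy 8x8 block.
# Assumes numpy and scipy; the block contents and noise level are made up.
import numpy as np
from scipy.fftpack import dct

def dct2(block):
    # 2D type-II DCT, the transform family used by JPEG/H.264-style codecs.
    return dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')

rng = np.random.default_rng(0)

# A smooth gradient block: typical "real" content, energy packed into low frequencies.
x = np.linspace(0.0, 1.0, 8)
clean = np.outer(x, x) * 255.0
noisy = clean + rng.normal(0.0, 20.0, clean.shape)  # sensor noise from poor lighting

for name, block in (("clean", clean), ("noisy", noisy)):
    coeffs = dct2(block)
    # Count coefficients above a typical quantization threshold: more surviving
    # coefficients means more bits spent, or more real detail discarded.
    significant = int(np.sum(np.abs(coeffs) > 16))
    print(f"{name}: {significant} of 64 coefficients are significant")
```

The noisy block lights up far more coefficients than the clean one, and that extra "detail" is exactly what the encoder wastes bits on, regardless of resolution.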
Resolution just isn't
that important: it does not override the other stuff you need to get good visual quality.
Some think that color correction can handle lighting deficiencies, but that's just not true. Old-style fluorescent lights are notorious for this, because their spectrum is spiky rather than smooth; we humans just happen to perceive it as white. However, different materials -- even skin -- reflect that light in different ways, precisely because the light spectrum consists only of spikes instead of being continuous. A pigment on the surface may happen to absorb certain frequencies particularly well (absorption lines), and if those coincide with some (but not all) of the spikes in the light spectrum, you get a completely wrong color for that surface. The only way to fix that kind of error is to recolor the image by hand, pixel by pixel, because the original color information simply never reached the camera sensor at all, due to poor lighting!
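As a toy illustration of why the information is simply gone (my own made-up spectra, just to show the mechanism): under a spiky illuminant, two quite different surfaces can produce identical sensor readings, so no later correction can tell them apart.

```python
# Toy numbers only: a spiky "fluorescent" illuminant and two different surfaces
# that happen to reflect identically at the spike wavelengths.
import numpy as np

wavelengths = np.arange(400, 701, 10)            # nm, coarse sampling

# Hypothetical spiky illuminant: power only at a few wavelengths.
illuminant = np.zeros(len(wavelengths))
illuminant[[5, 15, 25]] = 1.0                    # spikes at 450, 550, 650 nm

# Surface A reflects evenly; surface B has absorption dips between the spikes,
# but reflects the same as A exactly at the spike wavelengths.
reflect_a = np.full_like(illuminant, 0.5)
reflect_b = np.full_like(illuminant, 0.2)
reflect_b[[5, 15, 25]] = 0.5

# Crude RGB sensor: box filters over thirds of the visible range (an assumption).
def sensor_rgb(reflectance):
    signal = illuminant * reflectance
    return [round(signal[i:i + 10].sum(), 3) for i in (0, 10, 20)]

print(sensor_rgb(reflect_a))   # same RGB triplet...
print(sensor_rgb(reflect_b))   # ...despite clearly different surfaces
```

Under a smooth, continuous illuminant those two surfaces would record as different colors; under the spiky one the difference never reaches the sensor, which is why fixing the lighting beats trying to fix the colors in post.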