Hi
Ok, let's start by seeing if we can break the area down a bit:
Back when video first came out, a "video signal" was an analog affair that was often used to modulate an RF carrier. The days of putting that sort of signal on the air are drawing to a close in the commercial end of things. If that is your area of interest, it divides into several major groups, generally by country (NTSC, PAL, and SECAM), and things get a bit more complex when color (yes, I'm from the US) gets added to the signal.
Once things like PCs came along, the signals became less "one waveform does it all" and more oriented toward a separate signal for each color (there's that spelling again), plus possibly separate signals for sync. The stuff was still analog and shared a lot with the earlier signals. Your VGA connection (and many of its predecessors) falls into this category.
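To make that analog era a bit more concrete, here's a quick back-of-the-envelope sketch in Python using the classic 640×480 @ 60 Hz VGA mode: the pixel clock and the total (visible plus blanking) pixel and line counts determine the sync rates you'd see on the wire.

```python
# Classic VGA 640x480@60 timing. The visible area plus blanking
# intervals define the "total" counts; the pixel clock divided by
# those totals gives the sync frequencies.
PIXEL_CLOCK_HZ = 25_175_000   # 25.175 MHz dot clock
H_TOTAL = 800                 # 640 visible + 160 horizontal-blanking pixels
V_TOTAL = 525                 # 480 visible + 45 vertical-blanking lines

h_sync_hz = PIXEL_CLOCK_HZ / H_TOTAL   # line (horizontal sync) rate
v_sync_hz = h_sync_hz / V_TOTAL        # frame (vertical sync) rate

print(f"hsync: {h_sync_hz / 1000:.3f} kHz")   # ~31.469 kHz
print(f"vsync: {v_sync_hz:.2f} Hz")           # ~59.94 Hz
```

Those two output numbers (roughly 31.5 kHz and 60 Hz) are the rates an old multisync monitor locked onto.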
The digital age ultimately caught up with video, and the whole idea of analog signals pretty much went away. The "modern" approach is to send data over a serial bus. It may be a multi-wire serial bus, but it's still moving data. It's no different from an RS-232 or USB signal in that it's all ones and zeros. HDMI is a good example of this. The internal interface to a flat-panel display (LVDS, for instance) is a very different example of the same idea.
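Once it's "just data," the bit rates get big fast. A quick sketch in Python of the arithmetic for 1080p60 over HDMI's three TMDS channels, which (in pre-2.1 HDMI) encode each 8-bit color byte as 10 serial bits:

```python
# Pre-2.1 HDMI carries video as serial data on three TMDS channels,
# one per color component. TMDS encodes 8 bits as 10, so the serial
# bit rate on each channel is 10x the pixel clock.
pixel_clock_hz = 148_500_000   # 1920x1080 @ 60 Hz
bits_per_pixel_clock = 10      # 8b/10b-style TMDS encoding
channels = 3                   # R, G, B (or Y, Cb, Cr)

per_channel_bps = pixel_clock_hz * bits_per_pixel_clock
total_bps = per_channel_bps * channels

print(f"per channel: {per_channel_bps / 1e9:.3f} Gb/s")   # 1.485
print(f"aggregate:   {total_bps / 1e9:.3f} Gb/s")         # 4.455
```

Multi-gigabit rates like these are why analyzing this stuff looks a lot more like general high-speed serial-data work than like classic video work.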
So why does this matter?
Well, once it's "just data," your analysis is a bit different than when it's a bunch of analog this and that. Data protocols are the more general topic (dig into those first), and video-specific stuff is a sub-section of that. If your intent is to learn for the world of the future, skip to the page on digital and go from there.
If the "old stuff" (as an old guy, I can say that) interests you, by all means dig into it. Pick *one* of the sections and dive into that. Don't try to learn all of them at once. Get familiar with one system and then build on that.
So, which way do you want to go?
Bob