Hello all! I'm hitting a snag on a personal project and I was hoping someone with a fresher head or a better understanding can help.
I'm doing an LPC810-based video circuit that is mostly generic (just a resistor network as the DAC), but it uses a lot of the built-in peripherals to save on pin count. By this I mean that the line timing is interrupt-driven, the clock comes from the internal IRC (no crystal), and the color subcarrier is generated by the integrated PWM block.
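To give an idea of the constraint, this is the kind of arithmetic involved in picking the PWM period (the clock and target values below are placeholders for illustration, not my actual settings):

```c
#include <stdio.h>

/* Placeholder sanity check: given the IRC-derived system clock, which
 * integer PWM period lands closest to the colorburst frequency, and
 * what frequency does it actually produce?  Both f_sys and the target
 * are assumptions for illustration, not my real numbers. */
int main(void)
{
    const double f_sys    = 24.0e6;       /* assumed IRC/PLL core clock, Hz      */
    const double f_target = 3579545.0;    /* NTSC burst; PAL would be 4433618.75 */

    unsigned period = (unsigned)(f_sys / f_target + 0.5);   /* clocks per cycle */
    double   f_real = f_sys / period;                       /* what you get     */

    printf("period = %u clocks -> %.0f Hz (%.0f Hz off nominal)\n",
           period, f_real, f_real - f_target);
    return 0;
}
```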
None of this is ideal, so I took great care to make sure the timing is accurate (before you ask, I never turn the PWM off; I only disable the output at pin level, so phase coherence is preserved).
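The pin-level gating is along these lines. This sketch detaches the SCT output through the switch matrix; the exact PINASSIGN register and byte lane for CTOUT_0 are assumptions to be checked against UM10601, not taken from my actual code:

```c
#include "LPC8xx.h"   /* NXP CMSIS header for the LPC81x family */

/* Assumption: CTOUT_0 (the subcarrier PWM output) is routed through the
 * switch matrix, and its assignment lives in PINASSIGN6 bits 31:24.
 * Check UM10601 for the real register/byte lane before trusting this. */
#define CTOUT0_SHIFT   24
#define PIN_NONE       0xFFu                   /* 0xFF = output not connected */

/* Detach the subcarrier from its pin WITHOUT stopping the SCT, so the
 * subcarrier phase keeps advancing in the background. */
void subcarrier_pin_off(void)
{
    LPC_SWM->PINASSIGN6 |= (uint32_t)PIN_NONE << CTOUT0_SHIFT;
}

/* Re-route CTOUT_0 to the package pin with index 'pio'. */
void subcarrier_pin_on(uint8_t pio)
{
    uint32_t reg = LPC_SWM->PINASSIGN6;
    reg &= ~((uint32_t)0xFFu << CTOUT0_SHIFT);
    reg |= (uint32_t)pio << CTOUT0_SHIFT;
    LPC_SWM->PINASSIGN6 = reg;
}
```

The point is that the SCT keeps counting the whole time, so when the output is reconnected the subcarrier phase is exactly where it would have been anyway.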
So, this is the colorburst (top is source PWM, bottom is generated signal):


The H-sync is also up to standard and doesn't seem to shift at all:

So, when all is said and done, why does the video signal normally look like this (sorry about the squashed video, I only captured half a frame)?
https://www.youtube.com/embed/XzwraHwkl_Q

Here's a frame capture with some apparent color, but quick flickering is more the norm:

My living-room LCD is more lenient and gives a somewhat more faithful reproduction, so I'm adding a shot for comparison (though it also flickers quite a bit and drops to black and white every now and then):

I've tried:
- changing the colorburst amplitude
- changing the colorburst phase relative to the signal's color
- leaving the burst/subcarrier output on all the time
- more variations on the subcarrier pulse than I can remember
- changing the V-sync timings (they may still not be spec-accurate, but they are enough to ensure the carrier doesn't drift across fields/frames; see the sketch below this list)
but nothing ever seems to be enough to give color 100% of the time.
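For the drift point in that last item, this is the kind of check I mean: does the subcarrier come back to the same phase every line and every field? (The timing values here are placeholders, not my real ones.)

```c
#include <stdio.h>
#include <math.h>

/* Check how many subcarrier cycles fit in one scan line and one field
 * with the timing actually generated, i.e. whether the burst phase
 * returns to the same point each field.  All values are placeholders. */
int main(void)
{
    const double f_sc   = 3.579545e6;    /* generated subcarrier, Hz */
    const double t_line = 63.5556e-6;    /* generated line period, s */
    const int lines_per_field = 262;     /* placeholder field length */

    double cyc_line  = f_sc * t_line;                 /* ~227.5 for NTSC */
    double cyc_field = cyc_line * lines_per_field;

    printf("cycles/line  = %.4f (fractional part %.4f)\n",
           cyc_line, cyc_line - floor(cyc_line));
    printf("cycles/field = %.2f (fractional part %.2f)\n",
           cyc_field, cyc_field - floor(cyc_field));
    return 0;
}
```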
I know the simpler answer is "use a crystal / a dedicated encoder IC", but I'd rather know _why_ this implementation doesn't work. I feel like I'm missing an important factor about how the color is detected/regenerated.