Long post: I spend a lot of words describing my exact issue, but the overall questions are right at the end if you want to skip the read.
For the past couple of hours, I have been trying to get my simulation of a PLL-based FM transmitter to work in a desirable way. I'm using the CD4046 as the PLL, with its pin 2 as the phase-detector output, and a common-collector Colpitts oscillator with a varactor as the VCO. The loop filter between CD4046 pin 2 and the input to my VCO is causing me some grief, and I'm sure it's due to my own lack of understanding. A schematic is attached.
For the loop filter, I am currently using a simple RC filter. Its cutoff frequency is about one tenth of the frequency of the pulses coming out of pin 2 (roughly a 60 kHz square wave). After some time, this setup almost always locks the PLL as desired, with the characteristic sawtooth waveform on the loop filter output shown in figure 3 of the CD4046B datasheet. This locked waveform on node 'vco' is in one of the attachments, and the obviously phase-locked input signals are shown in another attachment.
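For reference, here's the quick sanity check I'm doing for that corner frequency (the R and C values below are placeholders, not the ones on my schematic):

```python
# Placeholder R and C, just to show the math: f_c = 1 / (2*pi*R*C)
import math

R = 27e3       # ohms (hypothetical value, not from my schematic)
C = 1e-9       # farads (hypothetical value)
f_comp = 60e3  # comparator output frequency from pin 2, Hz

f_cutoff = 1 / (2 * math.pi * R * C)
print(f"cutoff = {f_cutoff/1e3:.1f} kHz")               # ~5.9 kHz
print(f"cutoff / comparison = {f_cutoff/f_comp:.2f}")   # ~0.10, i.e. one tenth
```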
The problem I now have is that I have to do a lot of frequency division to get the output frequency I want from the reference frequencies that are available. This means that the sawtooth waveform (about 60 kHz) is much slower than the output of my VCO (about 100 MHz). The result is that the center frequency of the VCO output is indeed what I expect (about 101.5 MHz), but there is a TON of phase noise around that center frequency due to the huge ramping signal being fed into the VCO at 60 kHz (duh). A picture of the FFT of the VCO output is in one of the attachments; you can clearly see the unacceptable amount of phase noise. Note that when I just power the VCO from a DC source (no PLL stuff), its output frequency is extremely clean (a bandwidth of less than 1 kHz as opposed to several MHz).
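To convince myself that the filter ripple alone explains those sidebands, I mocked up a scaled-down FM spectrum in numpy (every number here is an illustrative stand-in, not a value from my circuit):

```python
# Scaled-down model: FM a 1 MHz stand-in carrier (real one is ~100 MHz)
# with a 60 kHz sawtooth and look at the resulting spectrum.
import numpy as np

fs = 20e6                      # sample rate, Hz
n = 2**20
t = np.arange(n) / fs
fc = 1e6                       # stand-in carrier, Hz
fm = 60e3                      # loop-filter ripple frequency, Hz
dev = 50e3                     # assumed peak deviation from the ripple, Hz

saw = 2.0 * ((t * fm) % 1.0) - 1.0                   # unit sawtooth at 60 kHz
phase = 2*np.pi*fc*t + 2*np.pi*dev*np.cumsum(saw)/fs  # integrate freq -> phase
x = np.cos(phase)

spec = np.abs(np.fft.rfft(x * np.hanning(n)))
f = np.fft.rfftfreq(n, 1/fs)
# Sidebands appear at fc +/- k*60 kHz; their level scales with the deviation.
for k in range(4):
    idx = np.argmin(np.abs(f - (fc + k*fm)))
    print(f"{f[idx]/1e3:7.0f} kHz: {20*np.log10(spec[idx]/spec.max()):6.1f} dBc")
```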
Now, to me, the obvious solution is to reduce the cutoff frequency of the RC loop filter (make R and/or C bigger). This would average out the pulses from the phase detector more, shrinking the amplitude of the sawtooth waveform and decreasing its impact on the phase noise of the VCO output. However, upon trying this in simulation, lowering the cutoff frequency led to unstable behavior, and the PLL no longer locks. The output of the loop filter settles into an oscillating steady state (shown in "steady state oscillation"). It seems like the loop filter output just can't change fast enough to counteract the difference between the reference and VCO frequencies, and the PLL ends up chasing its own tail (shown in "PLL chasing its own tail")...
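To sanity-check whether this is a plain loop-stability problem, I also ran the numbers on a linearized model of the loop (every gain below is a guess on my part, not a measurement):

```python
# Linearized type-I loop: open-loop gain G(s) = K / (s * (1 + s/wc)),
# with K = Kd*Ko/N. All gains are assumptions for illustration only.
import numpy as np

Kd = 5 / np.pi           # XOR phase-detector gain, V/rad (5 V supply assumed)
Ko = 2 * np.pi * 5e6     # VCO gain, rad/s per volt (assumed)
N = 1600                 # my divider ratio
K = Kd * Ko / N

w = np.logspace(0, 7, 200_000)   # rad/s sweep
for f_corner in (10, 100, 1e3, 6e3, 20e3):
    wc = 2 * np.pi * f_corner
    mag = K / (w * np.hypot(1, w / wc))
    w_x = w[np.argmin(np.abs(mag - 1))]            # unity-gain crossover
    pm = 90 - np.degrees(np.arctan(w_x / wc))      # phase margin, degrees
    print(f"RC corner {f_corner:7.0f} Hz -> phase margin {pm:5.1f} deg")
```

With these made-up gains, the phase margin drops from roughly 75 degrees at a 20 kHz corner to just a couple of degrees at 10 Hz, which would at least be consistent with the loop ringing instead of settling.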
So, with all that out of the way, I would like to know the following:
Why does the cutoff frequency of the loop filter in a PLL have a minimum allowable value to retain stability, at least according to my sims? Why can't it be arbitrarily low (like a 10 Hz cutoff) and still lock eventually? Is this the issue described on page 3-125 of the CD4046 datasheet, in the frequency capture section? How do I reduce the amplitude of that sawtooth wave but maintain frequency lock?!
Side questions:
Any recommendations on resources for PLL loop filter design for this type of phase detector?
What's the benefit of using pin 13 on the CD4046 as the phase comparator instead of pin 2?
I divide the VCO frequency by about N = 1600. Is this division just impractically large and doomed to fail no matter what?
You're a hero if you made it through this whole post. Thank you so much!