As part of my GPSDO project, I'm trying to use the CTMU on a dsPIC33EP128MC202 to measure the time between two pulses. I would like to measure it to about 1 ns resolution, all the way down to zero (in-phase).
Sounds simple... the datasheet claims <1 ns resolution and the operating principle is trivial. I cannot get consistent readings, though.
For testing, I have now set up a pulse generator and delay line, so I have a stable and accurate pulse separation of 0 ns to 255 ns in 1 ns steps. My oscilloscope confirms this, and the pulses are nice and sharp; nothing there should cause any issue with the PIC.
The PIC, however, gives readings that are all over the place and cannot get anywhere near zero. For instance, with a 100 ns delay the ADC reads anywhere from 400 to 500, changing constantly, and at 1 ns it never reads below about 200. At zero it reads either around 200 or full scale (the pulses presumably arriving in the wrong order by a fraction of a nanosecond). It's as if there is a huge offset, on top of the readings being highly erratic. It behaved just the same when using the PPS signals. (And yes, I am discharging the sample capacitor before each measurement.)
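For reference, this is the shape of what I'm doing: a simplified sketch rather than my actual file. Register names are per the dsPIC33E/PIC24E CTMU family reference manual (DS70661); the ADC channel, edge-source encodings and current range are from my own wiring, so check them against your datasheet. Pin mapping and ADC clock setup are omitted.

#define FCY 70000000UL          // instruction clock, adjust to suit
#include <xc.h>
#include <libpic30.h>           // for __delay_us()
#include <stdint.h>

static void ctmu_init(void)
{
    // ADC1: 12-bit, manual sample/convert, reading AN0 (the channel
    // whose hold capacitor the CTMU charges)
    AD1CON1bits.AD12B = 1;
    AD1CON1bits.SSRC  = 0;      // conversion starts when SAMP is cleared
    AD1CHS0bits.CH0SA = 0;
    AD1CON1bits.ADON  = 1;

    // CTMU: x100 range (55 uA nominal), hardware edge control,
    // rising edge on both inputs
    CTMUICONbits.IRNG    = 3;
    CTMUCON2bits.EDG1POL = 1;
    CTMUCON2bits.EDG1SEL = 3;   // start = CTED1 pin (verify encoding)
    CTMUCON2bits.EDG2POL = 1;
    CTMUCON2bits.EDG2SEL = 2;   // stop = CTED2 pin (verify encoding)
    CTMUCON1bits.EDGEN   = 1;
    CTMUCON1bits.CTMUEN  = 1;
}

static uint16_t ctmu_measure(void)
{
    AD1CON1bits.SAMP = 1;       // connect CHOLD to the current source

    CTMUCON1bits.IDISSEN = 1;   // drain the hold cap first
    __delay_us(10);
    CTMUCON1bits.IDISSEN = 0;

    CTMUCON2bits.EDG1STAT = 0;  // arm: current flows from the EDG1
    CTMUCON2bits.EDG2STAT = 0;  // event until the EDG2 event

    while (!(CTMUCON2bits.EDG1STAT && CTMUCON2bits.EDG2STAT))
        ;                       // wait for both edges

    AD1CON1bits.SAMP = 0;       // stop sampling, start the conversion
    while (!AD1CON1bits.DONE)
        ;
    AD1CON1bits.DONE = 0;
    return ADC1BUF0;            // counts proportional to pulse separation
}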
Certainly, the readings are nothing like the supposed t = C*V/I.
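To put numbers on that: assuming, say, the x100 range (55 uA nominal) and an ADC hold capacitance of roughly 5 pF (check the CHOLD figure in your datasheet), a 100 ns gap should give V = I*t/C = 55 uA * 100 ns / 5 pF = about 1.1 V, i.e. around 1360 counts from the 12-bit ADC with a 3.3 V reference. My readings bear no resemblance to any such prediction, nor do they scale linearly as I step the delay line.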
I can post the full code and circuit here (both are simple enough); they're already posted over at Microchip's forum, where no one has been able to help:
https://www.microchip.com/forums/m1105541.aspx

Any help please... before I give up on the CTMU as a bad job?