Can't you pull time from RDS (FM), DAB, or a LoRaWAN network?
Recently I commented about an RTC test using an STM32 Nucleo-L476. I estimated the out-of-the-box frequency offset at about 10 ppm.
Meanwhile I connected a u-blox 6M GPS module to the board and implemented capture of its one-second pulse output on timer 1 channel 3, with the timer clocked from the LSE at 32768 Hz. Then I implemented frequency measurement over spans of 65 seconds (linear regression of the captured timings). The diagram shows the measured frequency offsets over about 4 hours in units of the RTC smooth calibration step, 0.954 ppm. The average offset is 9.764 ppm with a standard deviation of 0.031 ppm.
So the clock noise can be as little as one second per year.
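The regression step can be sketched as follows (plain Python for illustration, not the firmware; `captures` stands for the timer 1 channel 3 capture values, one per GPS PPS edge, and wrap-around of the hardware counter is ignored here):

```python
def freq_offset_ppm(captures, f_nom=32768.0):
    """Least-squares slope of capture count versus PPS index,
    converted to a fractional frequency offset in ppm."""
    n = len(captures)
    xs = list(range(n))
    sx, sy = sum(xs), sum(captures)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, captures))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # counts per PPS second
    return (slope / f_nom - 1.0) * 1e6

# Synthetic example: an LSE running 10 ppm fast, 65 s span
caps = [i * 32768.0 * (1 + 10e-6) for i in range(65)]
print(round(freq_offset_ppm(caps), 3))  # → 10.0
```

A regression over many PPS edges averages down the ~30 µs single-capture quantization, which is how the scatter can end up far below one timer tick per second.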
This means the clock is much better than 10 ppm once it gets calibrated. It even justifies a kind of delta-sigma algorithm, setting the CALR calibration register to 10 more often than to 11 for an average compensation of about 10.235 CALR units. I implemented and verified this using PB2 as the 1 Hz RTC calibration output, capturing it on timer 1 channel 4.
While the MCU is off, CALR stays at 10; once the MCU is back on, the deficit is made up by spending some extra time at 11.
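The averaging idea can be sketched as a first-order delta-sigma accumulator (illustrative Python, not the firmware; in the real code each chosen value would be written to the CALR register for one calibration interval, per the STM32 smooth-calibration mechanics):

```python
def calr_sequence(target, steps):
    """Dither between the two integer settings around `target`
    so the long-term average converges on the fractional value."""
    lo = int(target)          # e.g. 10 for a target of 10.235
    acc = 0.0
    out = []
    for _ in range(steps):
        acc += target - lo    # accumulate the fractional part
        if acc >= 1.0:        # overflow: use the higher setting once
            acc -= 1.0
            out.append(lo + 1)
        else:
            out.append(lo)
    return out

seq = calr_sequence(10.235, 1000)
print(sum(seq) / len(seq))  # averages close to 10.235
```

With a target of 10.235 this emits 11 roughly every fourth interval and 10 otherwise, which is the "10 more often than 11" pattern described above.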
Also still missing are temperature-dependent calibration and a model for crystal aging.
Regards, Dieter
I don't pretend to understand what you did, but there are two things people use GPS time for: the UTC time (I am doing that) and the 1 second pulse (there is a market for that, but I have never been involved in it).
How accurately the U-BLOX modules implement the pulses, I don't know.
The datasheet jitter spec for the 6M module's 1 sec pulse is 60 nsec (€5.79 on eBay). Current consumption including the patch antenna is 45 mA.
60 nsec is much better than the timer capture resolution (about 30 usec at the 32768 Hz clock). The 1 sec pulse is perfect for calibrating a clock. My intention was to learn how to do it and whether it's worth the effort. It is worth it, but yes, it requires some patience to implement. Even from a metrology point of view I think it's amazing how this €10 STM32 MCU performs.
To be honest, I connected the GPS to automate adjustment of height above sea level for another application. The standard deviation with the GPS roof antenna is about 2 m.
Regards, Dieter
Meanwhile there is a 100 h log of STM32L476 LSE frequency measurements with GPS as the frequency reference. Ambient temperatures are from an SHT40 sensor subject to about 1 K of self-heating. The log shows some trend and three "temperature sweeps" of 1 to 2 °C, caused by our heating system turning off at night. Plotting the temperature sweeps, one gets about 0.2 ppm/K as the temperature coefficient.
In order to determine aging in a two-parameter fit, more data are necessary. And maybe I can put the setup into an oven to see the temperature coefficient over a wider range; that should reproduce the parabola shown in some datasheets.
Regards, Dieter
I don't know much about the STM32, but FYI I was running a SAMD controller off its internal oscillator (no xtal) and marveling at how accurate and stable it was.
Uh, no.
I had the controller hooked up to USB, using USB serial to communicate. Turns out that the USB is used to monitor and adjust the internal uC clock. When I unplugged the USB connection the internal clock became pretty awful. It was in spec, no problem there, but the amazing accuracy was no longer so amazing.
In Germany we also have DCF77. The advantage with GPS is availability of cheap and easy to use receivers. To make a good DCF77 receiver with similar accuracy one probably needs a rubidium local oscillator, so it can't be competitive.
I don't have that USB confusion in my results on the STM32L476 LSE crystal oscillator. It's a free-running crystal oscillator, and it serves as the clock. Typical accuracy I see using the RTC "smooth calibration" feature is 0.1 sec per day, and this may still improve after implementing aging and ambient-temperature correction.
Depending on the required accuracy, one could turn on the GPS maybe once per week or once per month.
Regards, Dieter
DCF77 will never get you a clock as accurate as GPS, since DCF77 doesn't (and can't) compensate for varying propagation delay through the atmosphere and other interference.
No, sorry: years ago I did some research, and using a rubidium clock and only daytime reception I got down to 10 ** -11 using DCF77. Our lab is about 200 km from the station.
Except the receiver wasn't a small module that you order on eBay for under €7. And there is no redundancy in case they turn the transmitter off one day in the future. I didn't propose to use DCF77 instead of GPS.
Regards, Dieter
I know that today everything is about satellites and the internet, but is there some reason why you wouldn't use WWV? It has been used for over 100 years as a world time standard.
There are several reasons why you might not want to use the WWV (and WWVH from Hawaii, CHU from Canada, and a few others) broadcast time/frequency signals:
* Propagation can be world-wide, but usually isn't.
* Even with good propagation, this is usually only on some frequencies (WWV broadcasts on 2.5, 5, 10, 15, 20 MHz), and which frequencies you can receive depends on time of day and location.
* Ionospheric propagation (skip) of course causes delay, and this delay varies hourly, daily and seasonally, causing Doppler shift of the received signals. The shift can be several Hz. Not a big deal for general time-of-day stuff, but it matters if you need better accuracy.
* Antenna size, available receivers, etc.
GPS is not without faults and vulnerabilities, but it is usually the better solution.
In the US you would use WWVB, the low-frequency transmission. It transmits the coded "exact" date and time every minute. You can use the $7 eBay receiver module and an Arduino decoder, so the cost is very modest. But you still have the propagation delay problem, so short-term accuracy isn't good. Over the long term, though, as long as you can receive the signal, you would always be within one second of the correct time.
No, sorry, years ago i did some research and using a rubidium clock and only daytime reception i got down to 10 ** -11 using DCF77. Our lab is about 200 km from the station.
200 km is pretty close. But the 10 ** -11 is compared to what? Does the lab you work at have a direct (land-line) connection to the clock driving the DCF77 transmitter for comparison purposes?
There is a German paper here:
http://cadt.de/dieter/dcf/Praezisionsfrequenzmessungen.pdf
In Figure 12 you can see the 10 ** -11. This doesn't mean there was a determination of time to 10 psec, but the clock rate (in that case the rubidium oscillator) could be calibrated to that precision using DCF77. In my tests I actually saw agreement between GPS and DCF77 (or rather the disciplined oscillators) at the 2E-12 level. Maybe they are using a GPSDO at the DCF77 station. Nowadays we'd better say GNSS.
Regards, Dieter
DCF77 uses a Caesium atomic reference, so it will be pretty close in frequency accuracy to GPS, with the only differences being due to special and general relativity (I assume the correction for this is done in any module for both time and position, but I don't know for sure). GPS also has ionospheric correction via the almanac data. Of course, DCF77 is not particularly subject to relativity, but it doesn't provide any ionospheric correction, so the absolute accuracy of its time information cannot be as good as GPS's. For applications outside of metrology, though, it's perfectly fine.
There is a german paper here: http://cadt.de/dieter/dcf/Praezisionsfrequenzmessungen.pdf
In Figure 12 you can see the 10 ** -11. This doesn't mean there was a determination of time to 10 psec, but the clock speed (in that case the rubidium oscillator) could be calibrated to that precision using DCF77. In my tests i actually saw agreement between a GPS and DCF77 (resp. the disciplined oscillators) at the 2E-12 level. Maybe they are using a GPSDO at the DCF77 station.
But as I understand it, that is frequency synchronisation only, not time. For frequency synchronisation you can use a cell tower or an FM radio station as well, as these are typically synchronised to a Cesium clock or at least GPS + Rubidium.
But for this thread relativistic corrections don't matter. It also doesn't matter whether the DCF77 station has a Caesium clock, or whether there is a millisecond delay from atmospheric propagation over 300 km. Those arguments were beside the point.
Making a good clock for an embedded system requires a way to synchronize with some other good clock, plus calibration of the clock frequency in order to keep correct time between synchronizations. Both are important, but somehow the second aspect was lost until I wrote about the STM32L476 measurements and calibration.
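Both aspects can be sketched in one toy model (illustrative Python with hypothetical names, not anyone's actual code): each synchronization steps the time, and the drift accumulated since the previous synchronization yields the frequency estimate used between syncs.

```python
class DisciplinedClock:
    """Toy model of time synchronization plus frequency calibration."""
    def __init__(self):
        self.freq_err_ppm = 0.0   # estimated oscillator frequency error
        self.last_sync = None     # (local_time, reference_time)

    def sync(self, local, reference):
        """Feed one synchronization event; returns the time step applied."""
        if self.last_sync is not None:
            l0, r0 = self.last_sync
            span = reference - r0              # true elapsed seconds
            drift = (local - l0) - span        # seconds gained over span
            self.freq_err_ppm = drift / span * 1e6
        self.last_sync = (local, reference)
        return reference - local

clk = DisciplinedClock()
clk.sync(0.0, 0.0)
clk.sync(86400.864, 86400.0)       # local clock gained 0.864 s in a day
print(round(clk.freq_err_ppm, 3))  # → 10.0
```

Once the frequency error is known, the RTC can be trimmed (e.g. via smooth calibration), so the time stays accurate between increasingly rare synchronizations.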
Regards, Dieter
Calibrating RTC frequency, i.e. compensating clock drift dynamically, can be done for specialized applications. But for most embedded systems, picking a suitable combination of crystal and synchronization interval is sufficient.
In the context of this thread, DCF77 accuracy is certainly a non-issue. The problem really is the lack of global availability and poor reliability of off-the-shelf receivers in the presence of metal objects/structures and EMI.
For metrology applications, time synchronization using DCF77 is only accurate to ~1e-05 s even with PM receivers. This is limited by the low chip rate and the variability of LF propagation conditions. Modern commodity GPS receivers are several orders of magnitude better than that. For frequency transfer, DCF77 can more or less compete with basic GPSDOs. But there are specialized GPS-based systems that perform much better (especially using PPP). TAI, from which UTC itself is derived, is established largely using GPS transfers.
There was a nice article about DCF77 (available in english) from PTB some years ago:
https://www.ptb.de/cms/fileadmin/internet/publikationen/ptb_mitteilungen/mitt2009/Heft3/PTB-Mitteilungen_2009_Heft_3_en.pdf
No, I think there is a good reason that STM32 MCUs include circuitry for calibrating their RTC. As we have seen, this may reduce the need for clock synchronization by a factor of 100, and not everybody can ignore a factor of 100.
Regards, Dieter
I think the main use-case is factory calibration, which is common-ish, particularly for applications where synchronization is not usually available (e.g. battery powered long-term data logger).
Beyond that, I'm sure you can sometimes make a case for dynamically adjusting based on synchronization error as well. Say if synchronization is only possible sporadically or very expensive (in terms of power or otherwise). It's just that for things that regularly use a network connection or cellular modem or GPS anyway, I usually wouldn't bother.
For remote battery-powered data logging, if the timestamp really has to be accurate, I think it's possible to use an RTC with an extended sync interval, maybe up to a year. But you have to take into account the things most likely to prevent that: variations in temperature and power supply voltage.
Even if the RTC can be powered by the battery directly, with no regulator, it may make sense to use one anyway to avoid any change in voltage as the battery discharges. So, for example, a 3.0 V or even 2.8 V regulator might make sense when powering from a single LiPo, or multiples in parallel.
But the big thing is temperature, particularly if "remote" means outside. The DS3231 RTC has built-in automatic temperature compensation, but I've never tested how well it works, and have not been able to make sense of the datasheet diagrams on that. But I have a setup that sets the aging register to the value that comes closest to having the RTC's 1Hz squarewave output match the frequency of GPS PPS. If I can rig it up so the GPS antenna is in the clear, I need to see if the aging solution changes in the freezer, and by how much.
But even if the built-in adjustment isn't all that great, it may be possible to apply your own formula that makes it work better. You can read the temperature from registers on the DS3231, and calculate an adjustment. I think this would have to be set up in advance for each individual part, so getting a logger ready to deploy might require a good bit of testing.
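A sketch of that idea (illustrative Python, no I2C code shown; the register addresses 0x10 to 0x12 and the ~0.1 ppm per aging LSB are from the DS3231 datasheet as I recall, while the parabola coefficient and turnover temperature are placeholders you would have to measure for each part):

```python
def ds3231_temp_c(msb, lsb):
    """Convert the DS3231 temperature registers (0x11 MSB, 0x12 LSB)
    to degrees C; the top two bits of the LSB add 0.25 C steps."""
    t = msb - 256 if msb & 0x80 else msb   # MSB is a signed degree count
    return t + (lsb >> 6) * 0.25

def aging_value(temp_c, coeff_ppm_per_k2=0.035, turnover_c=25.0):
    """Map the crystal's parabolic temperature error to a value for the
    aging offset register (0x10, ~0.1 ppm per LSB, positive values slow
    the oscillator). Coefficient and turnover are per-part guesses."""
    err_ppm = -coeff_ppm_per_k2 * (temp_c - turnover_c) ** 2  # always slow
    return round(err_ppm / 0.1)  # negative value speeds the clock back up

print(ds3231_temp_c(0x19, 0x40))  # → 25.25
print(aging_value(35.0))          # → -35
```

Since the DS3231 already applies its own compensation, in practice this would be a residual correction on top of it, tuned per part against the GPS PPS as described above.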
The big cost with GPS is in power. Mine runs at 45mA. So you don't want to be powering it up very often. A DCF77 or WWVB receiver would use less power, but reception would be less reliable. For most uses though, I think it's possible to rely primarily on the RTC and still have really good timekeeping.
The 6M module and its patch antenna also take 45 mA. Using our roof antenna it becomes more like 60 mA. u-blox are advertising their M10 generation with 16 mA.
Recently we had some cold nights, and the correlation plot of frequency offset over ambient temperature now exhibits the expected parabola shape. It is centered at 22.8 °C and reaches a 1 ppm offset at +/- 5 °C from the turnover.
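Turned into a simple model (illustrative Python; the 22.8 °C turnover and the 1 ppm at +/- 5 °C figure are the measured values above, which give roughly 0.04 ppm/K² as the parabola coefficient):

```python
TURNOVER_C = 22.8              # measured turnover temperature
K_PPM_PER_K2 = 1.0 / 5.0**2    # 1 ppm at +/- 5 K gives 0.04 ppm/K^2

def lse_temp_error_ppm(temp_c):
    """Parabolic frequency error of the tuning-fork LSE crystal
    relative to its turnover point (negative means running slow)."""
    return -K_PPM_PER_K2 * (temp_c - TURNOVER_C) ** 2

# At the turnover the error vanishes; 5 K away it is about -1 ppm,
# i.e. roughly one CALR unit (0.954 ppm) of extra compensation.
print(round(lse_temp_error_ppm(27.8), 3))  # → -1.0
```

This correction could be folded into the dithered CALR target, using the SHT40 reading as the temperature input.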
Regards, Dieter