I measured how the clock frequency of a microcontroller varies with temperature.
Comments on the method and results would be much appreciated.
The IC (Padauk PMS152-S16) is a one-time programmable 8-bit MCU with an internal oscillator that gets calibrated by the programmer device during flashing. In Padauk terms, I am talking about the IHRC clock (nominal 16 MHz), not the ILRC. The system clock is set to IHRC/4 = 4 MHz at VDD = 3.3V.
The Method:
I wrote a program that outputs a 1 kHz square wave on a GPIO (in a loop taking exactly 4000 clock cycles). I soldered the IC onto a SOP16-to-DIP adapter PCB with wires from the pads to an external connector (so I could quickly connect power and probes without touching the PCB itself). I measured the frequency with a Rigol MSO5072 and the temperature with the thermocouple of a Fluke 17B Max multimeter (advertised accuracy: 0.5C), and I manually wrote down the displayed values. The thermocouple was stuck into an unused through-hole next to the IC.
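For reference, here is a minimal sketch of what such a firmware could look like (not my exact code). It assumes the free-pdk/SDCC toolchain; the header path, the PA/PAC register names, the pin choice and the loop count are assumptions, and in practice the loop has to be cycle-tuned against the generated assembly so that one full period is exactly 4000 clocks:

```c
#include <stdint.h>
#include <pdk/device.h>      // free-pdk style device header (assumption)

// Hypothetical busy-wait; the count below is a placeholder, not cycle-tuned.
// The real loop must be adjusted so that two half-periods plus the toggle
// overhead add up to exactly 4000 system-clock cycles.
static void wait_half_period(void)
{
    for (volatile uint16_t i = 0; i < 300; i++)
        ;
}

void main(void)
{
    // CLKMD = ...;          // select IHRC/4 = 4 MHz here (constants omitted,
                             // they depend on the toolchain headers)
    PAC |= 1;                // PA.0 as output (register name assumed)
    for (;;) {
        PA ^= 1;             // toggle the pin -> square wave on PA.0
        wait_half_period();
    }
}
```

At a 4 MHz system clock, a 4000-cycle period gives exactly 1 kHz, so the pin frequency directly tracks the oscillator.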
First I let it sit at room temperature for a while (23.5C, the same temperature at which the burner calibrated it) and took a few measurements. Then I put the PCB in a household freezer for several hours. I took it out, carried it inside a gel ice pack, reconnected it, and left it in the ice pack for about a minute until the temperature reading had stabilized. Then I removed the ice pack and took measurements as quickly as I could until the PCB reached room temperature again.
Next I used a hot air gun set to its lowest temperature (100C) and blew it against the PCB until the temperature reading stabilized at 107C. I removed the hot air and again took measurements as quickly as possible while the board cooled back down to room temperature. I repeated that cycle multiple times and also took measurements during warm-up (the idea being that if the thermocouple warms / cools faster than the IC, I would get positive errors during warming and negative errors during cooling, which would average out).
Results and interpretation:
In the temperature range -9.5C to 107C, I measured signal frequencies from 987.15 to 1008.8 Hz, corresponding to clock drift from -1.29% to +0.88%. Measuring was easy around room temperature; at the extremes it is much harder, since the temperature changes very quickly as soon as the heat / cold source is removed. The experiment could certainly be improved with some sort of temperature-controlled plate or chamber, taking measurements only while the temperature is stable. Nevertheless, I was surprised by how closely my results follow a continuous trend line; I had expected much more noise.
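The drift percentages are simply the relative error of the measured output frequency against the nominal 1 kHz; since the output period is derived from the clock under test, the two relative errors are identical. A quick check (plain C, not part of the firmware):

```c
#include <stdio.h>

int main(void)
{
    const double f_nom    = 1000.0;               // Hz, nominal output
    const double f_meas[] = { 987.15, 1008.8 };   // Hz, measured extremes
    for (int i = 0; i < 2; i++)
        printf("%8.2f Hz -> %+.2f%% clock drift\n",
               f_meas[i], (f_meas[i] - f_nom) / f_nom * 100.0);
    return 0;   // prints -1.29% and +0.88%, matching the values quoted above
}
```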
I graphed my results (red datapoints). For comparison I also added the values from Padauk's datasheet (blue datapoints). See below.
We can observe a clear correlation between temperature and clock frequency. We also observe an unexpected negative offset: the deviation should be 0 at room temperature (23.5C), where the clock was calibrated, but I measured -0.33%; the zero crossing is instead at about 36C. What caused this? The burner that initially calibrates the clock supposedly references a crystal, which cannot be that far off. I suspect that the soldering (which happened after clock calibration) might have had a lasting effect.
What's very surprising is that my measurements differ considerably from what Padauk state in their datasheet. At -10C, I measured almost twice the deviation they specify. Since my data points lie very close to a trend line, I am confident that the measurement error introduced by my method is quite low. I am not sure how low exactly, but it certainly does not explain the gap between my measurements and the datasheet. It would be interesting to repeat the same experiment with another IC of the same type to see whether the curve differs from chip to chip.
After writing this, I discovered that the datasheet on Padauk's website (dated 2023) that I refer to above contains a very different clock drift diagram from the 2018 version of the same datasheet (still available on LCSC's website). While in 2018 Padauk specified -1.4% deviation at -10C, in 2023 they specify -0.65%. I measured -1.29% (my chips are pre-2023). I don't know whether they improved their manufacturing process and now produce chips with less drift... but this experience teaches me to take datasheets with a grain of salt.
Please let me know your thoughts.