That's just not how this works. Clocks don't drift only from a constant frequency offset; the frequency itself moves around with conditions and age.
Seems to me whatever the cause, it could be mitigated by this simple time-correcting solution.
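(For concreteness, I read "simple time-correcting solution" as something like the sketch below: compare against a reference now and then, estimate a constant rate error, and correct linearly in between. All the names are made up for illustration, and it deliberately assumes the rate error really is constant.)

```python
import time

class RateCorrectedClock:
    """Assumes the local clock runs at a constant (unknown) rate error,
    estimates that rate from comparisons against a trusted reference,
    and applies a linear correction afterwards."""

    def __init__(self):
        self.rate_ppm = 0.0          # estimated frequency error, in ppm
        self.last_sync_local = None  # local timestamp at the last sync
        self.last_sync_ref = None    # reference timestamp at the last sync

    def sync(self, reference_time: float) -> None:
        """Compare against a reference (GPS, NTP, ...) and re-estimate the
        constant rate from the drift accumulated since the last sync."""
        local = time.time()
        if self.last_sync_local is not None:
            elapsed_local = local - self.last_sync_local
            elapsed_ref = reference_time - self.last_sync_ref
            if elapsed_ref > 0:
                # accumulated error divided by elapsed time, expressed in ppm
                self.rate_ppm = (elapsed_local - elapsed_ref) / elapsed_ref * 1e6
        self.last_sync_local = local
        self.last_sync_ref = reference_time

    def now(self) -> float:
        """Current time with the constant-rate correction applied."""
        local = time.time()
        if self.last_sync_ref is None:
            return local
        elapsed = local - self.last_sync_local
        # scale the locally measured interval by the estimated rate error
        return self.last_sync_ref + elapsed * (1.0 - self.rate_ppm / 1e6)
```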
If it's kept in relatively constant conditions (temperature, RH...), that would sort of be OK, I guess. But in real-life conditions, that's rarely the case.
I don't really see the point these days though - even a good modern TCXO, which is still inexpensive and not boutique stuff at all, can get you under 0.5 ppm of drift over a wide range of conditions. That equates to roughly 1.3 s of drift per month, with no adjustment needed. I rather doubt the scheme you suggest could get anywhere close to that, and it would require more "maintenance".
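For reference, the arithmetic behind that 1.3 s figure (0.5 ppm accumulated over 30 days):

```python
ppm = 0.5e-6                        # worst-case frequency error of the TCXO
seconds_per_month = 30 * 24 * 3600  # 2,592,000 s in 30 days
drift = ppm * seconds_per_month
print(f"{drift:.2f} s/month")       # ~1.30 s
```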
Of course you can do better still, but that quickly gets much more expensive (OCXOs, atomic clocks, etc.) and draws a lot more power.
Overall, the idea of a "learning" process only works if 1/ the conditions that influence the clock frequency are stable, or at least vary in a predictable, more or less constant way, or 2/ your learning "algorithm" can take A LOT of parameters into account and compensate accordingly: that would mean measuring temperature, probably RH (depending on the clock generator), maybe pressure, vibration... then characterizing how the clock generator reacts to each of those... then feeding it all to some kind of neural network, maybe. Eek. And even with that amount of sophistication, I still doubt you could beat even a cheap TCXO. I'd be curious to see something like this implemented though, as an exercise.
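A very rough sketch of what that exercise might look like, just the single-variable (temperature) version - the low-order polynomial fit is an assumption on my part, and every name here is made up:

```python
import numpy as np

class TempCompensatedClock:
    """Hypothetical 'software TCXO': learn frequency error vs. temperature
    from occasional comparisons against a reference, then use the fitted
    curve to correct intervals between syncs. Ignores RH, aging, pressure,
    vibration, supply voltage... which is exactly the problem."""

    def __init__(self, order: int = 2):
        self.order = order   # 2 ~ parabolic, roughly right for 32 kHz tuning forks
        self.temps = []      # °C at each calibration point
        self.errors = []     # measured frequency error at that temperature, in ppm
        self.coeffs = None   # fitted polynomial coefficients

    def add_calibration(self, temp_c: float, error_ppm: float) -> None:
        """Record one (temperature, measured ppm error) pair, taken while a
        trusted reference was available, and refit the model."""
        self.temps.append(temp_c)
        self.errors.append(error_ppm)
        if len(self.temps) > self.order:
            self.coeffs = np.polyfit(self.temps, self.errors, self.order)

    def predicted_error_ppm(self, temp_c: float) -> float:
        """Predicted frequency error at the given temperature, in ppm."""
        if self.coeffs is None:
            return 0.0
        return float(np.polyval(self.coeffs, temp_c))

    def correct_interval(self, measured_seconds: float, temp_c: float) -> float:
        """Scale a locally measured interval by the predicted rate error."""
        return measured_seconds * (1.0 - self.predicted_error_ppm(temp_c) / 1e6)
```

Even this only models one variable with one static curve per unit; hysteresis, aging, RH and the rest would still creep in between calibrations, which is why I'd still bet on the cheap TCXO.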