Hello everyone,
For some time now, I have been studying the input protection of multimeters and how it can influence the overall accuracy of the measurement (accuracy: I am talking about multimeters with 4.5 digits or more).
Many topics discuss input protection, but I still have a question and cannot find the answer.
I have seen several teardowns and schematics of multimeters where a PTC and a bulky high-power resistor are placed in series on the voltage input to limit the peak current into the MOV (with or without a GDT).
This stage is classically followed by a high-precision resistor network (such as a Caddock Type 1776) that divides the input voltage from 600/1000 V down to ~10 V for the JFET input buffer.
But what is the point of choosing such a high-precision network (with a very low TCR, ~30 ppm/°C) if components with a strong temperature dependence (PTC, power resistor) are placed in series with it?
What is the trick? What did I miss?
My neurons are going nuts!
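To make my concern concrete, here is a rough calculation (all values are hypothetical, not taken from any specific meter: a 1 kΩ PTC in series with a 10 MΩ divider) showing how a temperature-driven change of the PTC resistance would shift the divider ratio:

```python
# Hypothetical values, just to illustrate the question.
R_div = 10e6     # total resistance of the precision divider network (ohms)
R_ptc_25 = 1e3   # assumed PTC resistance at 25 degC (ohms)
tc_ptc = 0.01    # assumed PTC tempco near 25 degC: +1 %/degC
delta_T = 10.0   # assumed temperature rise (degC)

def ratio(r_series):
    # Fraction of the input voltage that appears across the divider
    # when r_series (PTC + power resistor) sits in series with it.
    return R_div / (R_div + r_series)

r_hot = R_ptc_25 * (1 + tc_ptc * delta_T)
error_ppm = (ratio(R_ptc_25) - ratio(r_hot)) / ratio(R_ptc_25) * 1e6
print(f"ratio shift: {error_ppm:.1f} ppm")  # prints "ratio shift: 10.0 ppm"
```

So with these assumed numbers, a 10 °C rise would already move the ratio by ~10 ppm, comparable to the divider's own TCR budget, which is exactly why the choice of a 30 ppm network puzzles me.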

Thank you very much for your time.

PS: I drew a simplified schematic of a multimeter voltage input in KiCad to make this more concrete.