I have a tool that measures the output of a linear variable differential transformer (LVDT). (In short, its primary is driven at a fixed frequency and amplitude, and the secondary's output amplitude is proportional to the position of the movable core.) The excitation is generated relative to the regulated power supply (e.g. if the supply voltage increases, the excitation amplitude increases), and the measurement is an absolute read of the secondary voltage. The circuit is therefore sensitive to changes in the regulated supply.
It currently uses an LM317 and LM337 that are trimmed to supply +5V and -5V respectively. I am interested in switching to a 7805 and 7905 instead so I can do away with the trimming.
I understand I can get better absolute accuracy with the LM3x7 regulators—I can trim them to within hundredths of a volt and they'll hold that value. The absolute accuracy isn't particularly important: it's the consistency over time and temperature.
These tools get used in a variety of human environments and are specified accurate from 5 °C to 40 °C. Current draw is only about 30 mA on each rail, so heat sinking is not a factor. The two factors at play are drift over time and temperature sensitivity.
As best I can tell from the data sheets, both regulator families are about the same with regard to temperature. Is this true?
The LM78xx and LM79xx specify an absolute accuracy of about ±4%. But will the output change over time? In other words, if I'm getting 5.07 V from a 7805 today, will it still read 5.07 V a year from now, or some other value between 4.8 V and 5.2 V?
TI's LM317 data sheet specifies a typical long-term stability of 0.3%/1000 hours. A year is roughly 8760 hours, so extrapolating linearly that's about 2.6%, and after 10 years about 26%, which seems downright absurd.
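The extrapolation I'm doing can be sketched as follows (a minimal sketch, assuming drift accumulates linearly forever, which is exactly the assumption I'm doubting):

```python
# Extrapolate the LM317's typical long-term stability spec
# (0.3% per 1000 hours) assuming purely linear accumulation.
# The linear model is my assumption, not a datasheet guarantee.
HOURS_PER_YEAR = 8760
SPEC_PCT_PER_KHOUR = 0.3

def drift_pct(years):
    """Worst-case drift in percent if the spec accumulated linearly."""
    return SPEC_PCT_PER_KHOUR * (years * HOURS_PER_YEAR) / 1000.0

print(drift_pct(1))   # about 2.6% after one year
print(drift_pct(10))  # about 26% after ten years
```

In practice, long-term drift in regulators tends to slow down after initial aging, which is why a straight-line reading of the spec gives such an absurd 10-year number.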
My guess is the regulators are pretty stable over time: more so than the data sheets imply.
(And yeah, the RIGHT answer is to either use a precision reference to drive the primary of the LVDT or make a ratiometric calculation.)
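For reference, the ratiometric calculation I'm alluding to would look roughly like this (a hypothetical sketch, assuming both the excitation and secondary amplitudes can be digitized; the function name and scale factor are made up):

```python
# Ratiometric LVDT read: divide the secondary amplitude by the
# excitation (primary) amplitude measured at the same time, so any
# common-mode supply drift cancels out of the result.
def core_position(v_secondary, v_excitation, scale_mm_per_ratio=10.0):
    """Return core displacement in mm. Supply drift scales both
    voltages equally, so the ratio is insensitive to it."""
    return scale_mm_per_ratio * (v_secondary / v_excitation)

# If the supply sags 5%, both readings sag together and the
# computed position is unchanged:
nominal = core_position(0.50, 5.00)
sagged = core_position(0.475, 4.75)
```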