Author Topic: interested in hobbyist voltage reference standards?  (Read 33165 times)

0 Members and 1 Guest are viewing this topic.

Offline Rhythmtech

  • Regular Contributor
  • *
  • Posts: 189
Re: interested in hobbyist voltage reference standards?
« Reply #25 on: June 19, 2010, 03:11:58 am »
At some point it might just be easier to design your own divider (it's probably just a bunch of resistors, although some care might be taken to reduce thermal gradients and temperature coefficients), or search for a divider designed for the impedance of DMMs. I doubt that you could get this kind of accuracy from an op-amp without tricks (what would you use as feedback resistors?). Note that both accuracy at the base temperature (23°C or so) and the temperature coefficient are important.

Of course you don't have to exceed the original specs, but it might be a waste to use a very accurate divider in that case. You can also order some 0.1% resistors for under a dollar each.
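As a quick sanity check on the 0.1% resistor suggestion above, the worst-case ratio error of a two-resistor divider built from such parts is easy to compute (the nominal values below are arbitrary examples, not any particular product):

```python
# Worst-case ratio error of a 10:1 divider built from 0.1% resistors
# (nominal values are illustrative).
from itertools import product

R_TOP_NOM, R_BOT_NOM = 9000.0, 1000.0   # nominal 10:1 (Vout/Vin = 0.1)
TOL = 0.001                              # 0.1% parts

nominal = R_BOT_NOM / (R_TOP_NOM + R_BOT_NOM)
worst = 0.0
for st, sb in product((-1, 1), repeat=2):   # try all tolerance corners
    rt = R_TOP_NOM * (1 + st * TOL)
    rb = R_BOT_NOM * (1 + sb * TOL)
    ratio = rb / (rt + rb)
    worst = max(worst, abs(ratio - nominal) / nominal)
print(worst)  # ~0.0018, i.e. about 0.18% worst case
```

Note that the ratio error is nearly 2x the part tolerance for a 10:1 divider, which is why 0.1% parts only get you into rough 3.5-digit territory before tempco even enters the picture.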

I follow what you are saying. I think any way I cut it, I would be back to needing a precision multimeter to calibrate the reference circuit. Oh well, so much for eBay fare...
 

Offline saturation

  • Super Contributor
  • ***
  • Posts: 4788
  • Country: us
  • Doveryai, no proveryai
    • NIST
Re: interested in hobbyist voltage reference standards?
« Reply #26 on: June 19, 2010, 12:38:45 pm »
... if you want lab grade calibration, but if you're on a budget and not a lab, the Geller style approach is cost effective, IMHO.

If you know someone with a calibrated meter, as mentioned earlier, you can check your lab gear against it and take that 'calibrated' meter home to check your others.




Best Wishes,

 Saturation
 

alm

  • Guest
Re: interested in hobbyist voltage reference standards?
« Reply #27 on: June 19, 2010, 02:18:59 pm »
The principle of calibration is that you compare your equipment to a known quantity, ideally a standard that's traceable to NIST (or a similar body). So anything you use should itself be calibrated in some way. The disadvantage of buying just a voltage standard is that you get one DC voltage: calibrating the lower ranges needs something like a Kelvin-Varley divider (or just a Kelvin divider) with very precise resistors, and calibrating the higher ranges is harder still. You can't do resistance or AC at all. Resistance could be done with really precise resistors (ideally standard resistors). I'd probably skip AC entirely (except for basic performance checks), since a proper AC calibration is really hard, especially if the meter has a fairly wide bandwidth; it often needs something like a 300V, 300kHz source with 0.1% accuracy (i.e. a really expensive Fluke AC calibrator). I usually get by with a function generator and mains, comparing several meters.

For performance checks, buying several precision meters might be more cost-effective than buying standards, depending on the price. I didn't pay much more (and sometimes even less) for some of my 5.5+ digit meters than a standard would cost, and the advantage is that you can verify DC, AC, resistance and AC/DC current, and they can be used as extra meters. Most of them are within spec or really close. Any precision meter also contains a precision standard. You won't get them calibrated for $10, though.
 

Offline Rhythmtech

  • Regular Contributor
  • *
  • Posts: 189
Re: interested in hobbyist voltage reference standards?
« Reply #28 on: June 19, 2010, 03:51:21 pm »
The principle of calibration is that you compare your equipment to a known quantity, ideally a standard that's traceable to NIST (or a similar body). So anything you use should itself be calibrated in some way. The disadvantage of buying just a voltage standard is that you get one DC voltage: calibrating the lower ranges needs something like a Kelvin-Varley divider (or just a Kelvin divider) with very precise resistors, and calibrating the higher ranges is harder still. You can't do resistance or AC at all. Resistance could be done with really precise resistors (ideally standard resistors). I'd probably skip AC entirely (except for basic performance checks), since a proper AC calibration is really hard, especially if the meter has a fairly wide bandwidth; it often needs something like a 300V, 300kHz source with 0.1% accuracy (i.e. a really expensive Fluke AC calibrator). I usually get by with a function generator and mains, comparing several meters.

For performance checks, buying several precision meters might be more cost-effective than buying standards, depending on the price. I didn't pay much more (and sometimes even less) for some of my 5.5+ digit meters than a standard would cost, and the advantage is that you can verify DC, AC, resistance and AC/DC current, and they can be used as extra meters. Most of them are within spec or really close. Any precision meter also contains a precision standard. You won't get them calibrated for $10, though.

I understand your point. This thread is centered around hobbyist voltage reference standards. Calibration carries a measure of duality in its meaning: the first sense is adjustment, the second is the idea of traceability and standardization.

For example, a multimeter may be calibrated before leaving the factory, but in order to avoid the added cost of certifying the calibration, it is done without traceability to NIST. In other words, the new hire off the street can calibrate the meter, because meter calibration is automated on the production line and minimal training is required to put the meter in the calibration fixture; the computer runs through values, makes changes, sets multipliers. This would be adjustment. The production plant would not require strict environmental control, and the data for every meter produced does not have to be filed away for X number of years. The end user is given the choice to have the new meter calibrated by an accredited lab as soon as they receive it, which is how traceability would be established.

Calibration for the military, or for development purposes by manufacturers, is done in order to standardize parts compliance. E.g., my 6 amp rated motor will not end up a 4 amp motor on your systems; the fuse blows at 10 amps for everyone, not 12 for some and 8 for others; my inches are based on King George the 30th's foot and yours on King George the 15th's, etc.


The idea I believe we are pursuing is to find inexpensive ways to build or buy references for the simple DC values that I, and I assume many of us, use most frequently at home. It matters less to know that 600VAC at 400Hz is correct on my meter than the DC ranges I use most often.

What I most commonly run in to:
-15 V to +30 V
0 to 10 A
0.01 to 10 MOhm
0 to 100 kHz


Temperature coefficient, hysteresis, and accuracy all still matter, but to a lesser degree for a home check, since I can't pay for mucho nice equipment. In other words, it would be nice to buy a meter off eBay, run through some values with your references, and say "gee, this was a good deal" or not. It seems like it wouldn't be overly difficult to build or buy a few things to have references in those ranges, but I keep proving myself wrong. Questioning a meter that is calibrated is wise because it could still have a problem, so having a quick reference is still not a bad idea.
« Last Edit: June 19, 2010, 03:58:32 pm by Rhythmtech »
 

alm

  • Guest
Re: interested in hobbyist voltage reference standards?
« Reply #29 on: June 19, 2010, 05:30:19 pm »
I wasn't suggesting that calibrations for hobby use should be NIST traceable, or that you should buy a Fluke 55xx/57xx calibrator, but that you always need some sort of external reference for calibration (unless you're into applied physics and use fundamental standards like the Josephson junction). Calibration is useless without some reference to the outside world. This does not have to be a cal lab: if you buy 0.1% resistors from a reputable vendor and make sure not to overstress them (don't get them hot), you can be reasonably sure they're within 0.1% + tempco. But if it's something more complex than a resistive divider at DC, I would want to verify it before using it to verify my own equipment (a chicken-and-egg problem). Something like the Geller standard is a way around it, since it's calibrated against their (calibrated-ish) equipment, but only for DC voltage at that range.

The way I verify used equipment (all of my expensive equipment is used, since I don't have the money to buy equipment this nice new) is to compare it with several of my existing meters (which were calibrated to NIST standards at some point). I don't invest in really accurate sources, since a fixed 10V source has few applications outside calibration. So I use reasonably stable general-purpose sources, like lab power supplies, function generators and 1% or better resistors (preferably large ones to reduce heating). They don't have to be accurate, but they do have to be stable for a minute or so.

For example, I compared three meters with about 3V from a lab supply; they read 3.0071, 3.00722 and 3.00730 volts. The 1-year accuracy of the last is 90ppm of reading plus 20ppm of range (2 digits), so if it were in cal, the real value would be between 3.00704 and 3.00758 volts. According to the other meters, the real value was between 3.00652 and 3.00772 volts, and between 3.00709 and 3.00737 volts. So the real value could be between 3.00709 and 3.00737 volts with all meters in spec. I arrive at this supposed real value by taking the highest lower limit and the lowest upper limit. If the highest lower limit is higher than the lowest upper limit, my readings conflict, and something's out of spec. It's likely that the less accurate meters in this comparison are within their 1-year specs, and it confirms that the most accurate meter is not wildly out of spec. I was trying to verify the first, least accurate, meter, and found it in spec on all ranges except for the zero adjustment on the ohms range (fixed that myself). I can't prove anything this way, but assuming the drift of meters from different vendors, ages and places is fairly random, either they all drifted the same amount or they didn't drift much. There's no way that my lab supply is this accurate, but it's stable enough (I connect all meters in parallel and take readings at the same time).

I do this for all ranges according to the performance verification procedure, at least as far as I can. I can't generate AC signals above 20Vp-p or so, so I only check the high-voltage AC ranges at a low voltage (under 10% of the range, but it tells me the divider is working) and at 50Hz by connecting them to mains. I can't generate DC voltages above 90V or so (by connecting all my lab supplies in series), so I check the 1000VDC range with only that voltage.
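The highest-lower-limit / lowest-upper-limit check described above can be sketched in a few lines. The spec for the most accurate meter is the one quoted in the post; the spec assumed for the other two is illustrative, not from any datasheet:

```python
# Sketch of the interval-intersection consistency check: each reading plus
# its accuracy spec defines an interval that must contain the true value.
# If all intervals overlap, every meter could be in spec; if not, at least
# one has drifted out of spec.

def interval(reading, range_full, ppm_reading, ppm_range):
    """Bounds implied by a 'ppm of reading + ppm of range' accuracy spec."""
    err = reading * ppm_reading * 1e-6 + range_full * ppm_range * 1e-6
    return (reading - err, reading + err)

def consistent(intervals):
    """Highest lower limit vs lowest upper limit; overlap means no conflict."""
    low = max(lo for lo, _ in intervals)
    high = min(hi for _, hi in intervals)
    return (low, high) if low <= high else None

ivals = [
    interval(3.00710, 10.0, 300, 30),  # less accurate meter (assumed spec)
    interval(3.00722, 10.0, 90, 20),   # specs below are from the post
    interval(3.00730, 10.0, 90, 20),
]
overlap = consistent(ivals)
print(overlap)  # a non-empty interval: all three readings are consistent
```

If `consistent` returns `None`, the readings conflict and at least one meter is out of spec, which mirrors the reasoning in the post.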

This is what I do to see if my latest eBay buy was a good one. If it's pretty close, I consider it a calibration issue and will try to calibrate it myself (for not-so-precise equipment; I wouldn't try to adjust something to <100ppm with a lab supply) or send it out. If it's wildly out of cal, I consider it a defect, although I recently received a meter that always read 10V on the 30VAC range. That turned out to be a calibration issue: the meter had stored an offset of 10V and a gain of zero, probably because someone applied 10V when the procedure asked for 3V and 30V.

Almost all of my current meters are within their one-year spec, even though they were calibrated years ago. As long as new equipment agrees with them within their combined specs, I consider it probably in spec.
 

Offline Rhythmtech

  • Regular Contributor
  • *
  • Posts: 189
Re: interested in hobbyist voltage reference standards?
« Reply #30 on: June 21, 2010, 07:56:27 pm »
I am in complete agreement with you; the method you use is much better than not doing anything at all, and it allows you to at least gain some measure of judgment over your equipment's condition.

I have a tendency to try to make things work that would be more easily replaced with a better solution. I agree with you: the 1 Mohm divider is not as good a solution as a couple of 0.1% resistors of lower impedance arranged as a divider, which would be a better reduced-accuracy approach.

I am not after a magical component that is an infinitely accurate solution for $1, but for the sake of discussion it is good to know what can be done for some simple measurements: precision resistors, precision zeners, etc. Eventually hobbyists may find value in NIST-traceable calibrations, especially if a few people decided to work on a project via a forum or online collaboration and needed to compare results or build upon previous data from others.
 

alm

  • Guest
Re: interested in hobbyist voltage reference standards?
« Reply #31 on: June 22, 2010, 03:07:33 am »
I'm all for finding cheap ways of getting something done; I wouldn't recommend that any hobbyist buy a Fluke calibrator (unless you get it cheaply). Those calibrators are necessary for a commercial cal lab for throughput reasons; a hobbyist can make do with more involved methods. If you want to go that way, I'd stick to buying used dividers and standards. Equipment from the past might work just as well as modern equipment, just less accurately (don't expect to be able to calibrate a Fluke 8508A to <1ppm accuracy) and more slowly in use (no MET/CAL support). But unless you do it as a hobby or go through lots of meters (repair/trade), it's unlikely to be cheaper than $100-200 every once in a while for a commercial calibration.

But the issue with precision measurement is that nothing is easy; those NIST guys aren't spending all that money on bureaucracy ;). All kinds of interesting effects play a role when you're going for those last few 0.001%. Personally, a 0.01% reference doesn't interest me, since almost all my precision meters agree to within at least that spec. But I'm sure there are plenty of people with just one 3.5/4.5 digit meter; in that case a 0.01% or even a 1% reference might have value.

I wouldn't write off that Tektronix reference; I'd just point out that you want to do an analysis before spending lots of time and effort building something you can't check. There's no proof that that old reference didn't drift over those 50-60 years; on the other hand, by now it's probably pretty well aged and stable. They specify a 1Mohm +/- 1% load, for a total accuracy of 0.01%. To me this suggests that a 1% error in load impedance probably represents no more than 0.001% or so in divider ratio, since you don't want that impedance error to eat up your whole error budget (gotta leave some room for manufacturing tolerances). That means it would have an output impedance of about 1kohm. A normal 1:10 divider with that impedance might be a 1.11111111k +/-0.005% + 10k +/-0.005% divider. To compensate for that 1Mohm load, the 1.1111111k resistor might actually be 1.1123471k. Changing the 1Mohm load to 10Mohm, the effective bottom resistance would be 1.11122233k. The division ratio would be 0.1000901 +/- 0.01% (plus the meter input impedance tolerance / 10000), which is a 0.09% error. That totally swamps the original 0.01% tolerance, but you might be able to use it as a stable 1:9.991 divider or so (I would want to verify it first and measure the divider to get the real resistor values instead of my guesses). You would probably be able to get OK accuracy with a stable 1.11111Mohm +/- 1% in parallel with the meter, if the meter has a stable 10Mohm impedance, and not 11Mohm or 10Gohm. Of course my guess about the internal circuit might be completely wrong. Whether $25 + shipping + time + a compensation resistor is worth it is up to the individual buyer.
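The loading calculation above is easy to reproduce. The component values below are the guesses made in the post about the Tektronix divider's internals, not measured figures:

```python
# Sketch of the loaded-divider estimate: the bottom arm is trimmed so that,
# loaded with the designed-for 1 Mohm meter input, the effective bottom arm
# is 1.11111k and the ratio is 1:10. A 10 Mohm meter shifts the ratio.
R_TOP = 10_000.0     # 10k upper arm (guessed)
R_BOT = 1_112.3471   # trimmed bottom arm (guessed, from the post)

def ratio(r_load):
    """Division ratio with the meter's input resistance across the bottom arm."""
    r_eff = 1.0 / (1.0 / R_BOT + 1.0 / r_load)
    return r_eff / (r_eff + R_TOP)

r_1m = ratio(1e6)    # designed-for 1 Mohm load
r_10m = ratio(10e6)  # typical modern 10 Mohm DMM input
print(r_1m, r_10m, (r_10m - r_1m) / r_1m)  # ~0.1, ~0.10009, ~0.0009 (0.09%)
```

The ~0.09% shift matches the figure in the post: moving to a 10 Mohm meter swamps the 0.01% spec, but the divider is still usable at its shifted ratio if the meter's input impedance is stable and known.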
 

Offline Rhythmtech

  • Regular Contributor
  • *
  • Posts: 189
Re: interested in hobbyist voltage reference standards?
« Reply #32 on: June 22, 2010, 05:45:46 pm »
I follow how the divider error causes a Heisenberg-type effect in the measurement, and the importance of knowing what you put in vs. what you are reading. I'm not certain where our opinions diverge anymore. Kind of a nice feeling ;)
 


Offline Joe Geller

  • Contributor
  • Posts: 24
Re: interested in hobbyist voltage reference standards?
« Reply #34 on: July 12, 2010, 05:47:26 am »
>>I've looked at the Geller Labs reference before, and it's "only" accurate to 0.005% over 6 months.  The new, more expensive Malone reference can do better, 0.0025%, but both of these are still more than double what the Fluke 8505A is rated for.

"I suppose what you are saying is keep sending them back for recalibration (or at least remeasuring with their 8.5digit calibrated multimeters) in order to hide their drift over time.  Yeah, I guess that would work."

Since the comment quotes our published specification, it is entirely unclear what is being hidden.

But, far more importantly, the statement above completely misses the point of our product. What we are trying to do is supply a single-point calibration reference that is more accurate than is generally available to an amateur scientist or electronics hobbyist. Our SVR boards are not voltage standards; they are transfer references. For many hobbyists there are more suitable products, and you should buy the product of your choice; I only write here to better explain what we are trying to accomplish. This program generally operates at a loss; there is no "gotcha" here to make money.

The goal of our project is to transfer the absolute voltage from our Fluke 732B, a "transfer standard", to the experimenter. Our Fluke 732B is calibrated by Fluke, usually annually (we skipped a recent period for economic reasons; it goes back this month, possibly tomorrow). A "calibrated" Agilent 3458A alone is not even good enough for 10 ppm and below. As others have noted, while time and frequency calibration to parts in 10^12 (1e-12) is now common in amateur labs, absolute voltage calibration below 10 ppm, and especially at 1 ppm, is difficult. So even though our 3458A was new out of the box (many thousands of dollars) in 2005, it cannot hold better than about 10 ppm/year. And most 3458A users don't realize the importance of the autocal feature just to hold that value (at the ppm level, autocal needs to be run every day, as well as for every one degree C change of room temperature). That is why we only use the 3458A to finally calibrate boards, based on a short-term measurement of the 732B 10V output.

Our short-term promise is better than 10 ppm absolute, which is better than 0.001% (actually our spec is better than +/- 0.0005% absolute, i.e. +/- 5 ppm). Most test runs do better. Since the chips are rated at 5 ppm/°C (50 uV/°C), our SVR board should be used at about the temperature where we calibrated it for best transfer accuracy. Also, the board should be warmed up for about 30 to 60 minutes and run within about 0.1 V of 15V (however, the chips perform much better than the 100 uV/V spec for power supply sensitivity). At a relatively low level of precision (e.g. 0.01%) all of these concerns are a non-issue.

To some, all these factors are taken as deceitful; however, that is exactly the opposite of what is intended. The point is that it is very difficult to attain transfer accuracy between 10 and 1 ppm with a board that costs $35 new. That covers parts, and hardly reflects assembly and calibration cost or overhead.

The 6 month spec is based on AD's 15 ppm / 1,000 hours. However, it is very conservative, since we burn in all of our SVR boards for 200+ hours. AD reports that most of the drift occurs in the first several hundred hours; see this ADI tech note: http://www.analog.com/static/imported-files/application_notes/301548125AN-713_0.pdf
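As a rough illustration of why a per-1,000-hour figure is conservative over 6 months: reference drift does not simply multiply out linearly, and a sqrt-of-time extrapolation is a common rule of thumb for aged references. This is a modeling assumption for discussion, not Geller's published spec math:

```python
import math

# Compare a naive linear extrapolation of a 15 ppm / 1000 h drift spec over
# six months against a sqrt(t) model (drift slows as the part ages, which is
# the behavior the burn-in exploits). Illustrative arithmetic only.
drift_1000h_ppm = 15.0
hours_6mo = 183 * 24  # ~4392 hours

linear = drift_1000h_ppm * hours_6mo / 1000.0
sqrt_t = drift_1000h_ppm * math.sqrt(hours_6mo / 1000.0)
print(linear, sqrt_t)  # ~66 ppm linear vs ~31 ppm with the sqrt(t) model
```

Either way the published 6-month figure (0.005%, i.e. 50 ppm) sits comfortably inside what the datasheet drift rate would predict, consistent with calling the spec conservative for burned-in boards.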

At over $500/year to maintain our Fluke 732B (including overnight shipping on battery, and not including a new battery every few years), plus the purchase price of the 732B, the 3458A, and the testing and development of the SVR program, the $10 calibration fee does not go far. This is a service, not a "trick" to hide some specification or flaw. In most normal business operations, $10 is spent just opening an incoming box. Then we use bench space to warm the board up, check the 3458A against the Fluke 732B, carefully calibrate the SVR board, prepare a report, prepare a shipping label, and pack it with a new conductive bag, over-bag, bubble wrap, and peanuts in a new small box. Then we deliver it to the post office. The board can then provide another accurate voltage transfer. In the US, at least, that means the SVR program operates at a loss.

As many hobbyists have found that 0.01% is good enough, and as I turn my attention to our magnetometer project, we are re-evaluating whether the program should continue. However, since we recently obtained a new batch of ceramic AD587LQs from a power supply manufacturer that changed their design, we will continue in the short term... with a new batch of boards for sale in a couple of weeks.

Regards,
 
Joe Geller
GELLER (Geller Labs)
http://www.gellerlabs.com
« Last Edit: July 12, 2010, 03:05:14 pm by Joe Geller »
 

Offline Rhythmtech

  • Regular Contributor
  • *
  • Posts: 189
Re: interested in hobbyist voltage reference standards?
« Reply #35 on: July 12, 2010, 01:45:40 pm »
I am sold. A traceable calibration that includes the hardware, for $35? What were you thinking?
 

Offline slburris

  • Frequent Contributor
  • **
  • Posts: 523
Re: interested in hobbyist voltage reference standards?
« Reply #36 on: July 12, 2010, 02:03:51 pm »

To some, all these factors are taken as deceitful, however that is exactly the opposite of what is intended.  


One of the problems in trying to express things in text....

"Hiding the drift" was meant in the sense of "compensating for the drift" or
"hiding the drift from the end user application", not
in terms of concealment or deceit.  Perhaps a poor choice of words.

Recasting the original question about voltage standards, it seems to boil down to the following:

State of the art in terms of reference chips' initial accuracy is about 0.01%, e.g. the Analog Devices, TI, or Intersil devices. They can drift over time and temperature up to their published specs.

For better initial accuracy, get a reference that has been trimmed against a higher-grade commercial reference and will hold that setting for a reasonable period of time, i.e. a transfer standard, such as the Geller SVR. This will drift over time and should be sent back periodically to be resynced to the higher standard. You are still subject to temperature drift, short of some sort of ovenized reference.

Although there has been a leap in technology for frequency references
for hobbyists, i.e. GPSDOs, there has been no corresponding leap for
voltage.  Nothing at the hobbyist level exists to convert that excellent
frequency standard into a voltage standard.

Or you can just punt on all of this and send your equipment out to a
calibration lab.

Is this a fair summary?

Scott
 

Offline Joe Geller

  • Contributor
  • Posts: 24
Re: interested in hobbyist voltage reference standards?
« Reply #37 on: July 12, 2010, 03:42:17 pm »
Yes, I think that is a good summary. For long-term performance (6 months to one year), especially at 0.001% (10ppm) to 0.0001% (1ppm), a reference chip generally cannot match the performance of a used Fluke voltage calibration box (in good condition) or a high-end DMM with an aged reference.

A single scaling resistor used in most Fluke voltage calibration products costs more than $35.

Also, as noted earlier in the thread, the LTZ1000 http://www.linear.com/pc/productDetail.jsp?navId=H0,C1,C1154,C1002,C1223,P1204 (a modern ovenized reference chip) can be used to make a truly long-term stable reference. However, the lowest grade costs ~$35 each in quantity, and then it needs some support electronics (board, op-amp, gain resistors that cost ~$20 to $40 for ppm tracking (Vishay or Caddock), etc.).
« Last Edit: July 12, 2010, 03:45:10 pm by Joe Geller »
 

Offline Rhythmtech

  • Regular Contributor
  • *
  • Posts: 189
Re: interested in hobbyist voltage reference standards?
« Reply #38 on: July 12, 2010, 07:32:22 pm »
A single scaling resistor used in most Fluke voltage calibration products costs more than $35.

Not to mention what you would pay to have someone who owns a reference standard calibrate your voltage reference. You could not build a voltage reference and expect a local calibration lab to calibrate it for less than $75, unless it were charity.

 


Offline Rhythmtech

  • Regular Contributor
  • *
  • Posts: 189
Re: interested in hobbyist voltage reference standards?
« Reply #40 on: August 16, 2010, 08:21:58 pm »
Interesting indeed. Brings to mind the physics-vs-engineering conundrum that is always present: when is close enough close enough?

An engineer, a mathematician, and a physicist are each presented with a beautiful woman and the stipulation that at each time interval, they may move half of the remaining distance towards her.

The mathematician concludes that after N iterations there will be 8 divided by 2^N feet remaining, which will never equal zero, so he gives up on the spot.

The physicist opines that if each iteration requires a finite amount of energy then the energy expended in the approach will be inversely proportional to the distance remaining and gives up on the spot.

The engineer says "8 feet, 4 feet, 2 feet, 1 foot, 6 inches, good enough for practical purposes".
 

Offline saturation

  • Super Contributor
  • ***
  • Posts: 4788
  • Country: us
  • Doveryai, no proveryai
    • NIST
Re: interested in hobbyist voltage reference standards?
« Reply #41 on: August 17, 2010, 09:06:00 pm »
Yes, I see your analogy: how good is good enough, with engineers being the most flexible.

But I was seeing it in the light of this thread: having a voltage reference for a hobbyist or limited-budget lab that is as near as possible to the quality a reference lab would use.

Rather than paying for a standard NIST-level voltage calibration, can we combine multiple off-the-shelf voltage references, preferably the best quality, so that the averaged effect of all of them working together reduces their noise and drift over time, thus maintaining a reference voltage of high quality?

This is like how, in the late 1990s, instead of using a single large supercomputer, linking many off-the-shelf desktop PCs together could mimic supercomputer performance.

I'm still working on this idea, but it seems promising as a way to take voltage references to a higher level of accuracy using off-the-shelf parts (unless someone somewhere has done this already; if so, please share).
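A quick Monte Carlo sketch of the averaging idea shows the square-root advantage that Pease's experiments found. The numbers here are illustrative (10 V references with an arbitrary 20 uV of uncorrelated RMS noise), not any specific part's spec:

```python
import random
import statistics

# N independent references, each nominally 10 V with uncorrelated Gaussian
# noise, averaged through equal resistors. The averaged output's noise
# should shrink by roughly sqrt(N).
random.seed(1)

def noisy_ref(sigma_uv=20.0):
    """One sample from a 10 V reference with sigma_uv of RMS noise."""
    return 10.0 + random.gauss(0.0, sigma_uv * 1e-6)

def averaged_output(n):
    """Average of n independent references (ideal resistor averaging)."""
    return sum(noisy_ref() for _ in range(n)) / n

samples_1 = [averaged_output(1) for _ in range(5000)]
samples_16 = [averaged_output(16) for _ in range(5000)]
sd1 = statistics.stdev(samples_1)
sd16 = statistics.stdev(samples_16)
print(sd1 / sd16)  # roughly 4, i.e. sqrt(16)
```

The important caveat, which Pease's four-groups-of-four scheme addresses, is that averaging only helps with uncorrelated noise and independent drift; a systematic effect hitting all parts at once (temperature, supply, the measuring DVM's own reference) is not reduced at all.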


Specifically from Pease's article:

"The LM199AH super-reference
When the new LM199AH came out about 38 years ago, it was designed as a new circuit using a new process to cancel out all probable causes for long-term drift. Its output tolerance was ±3%, but the long-term stability per 1000 hours was 0.0020% typical—and that’s 20 ppm.

We put in lots of preliminary tests to screen out bad ones and then put in comparison circuits so we could use an excellent six-digit digital voltmeter (DVM) to compare several reference sources, such as ovenized standard cells, an ovenized band-gap reference, and several other fairly good zeners.

By using multiple references, we could avoid problems in case all of the devices under test (DUTs) seemed to drift at the same time. Was that caused by all the DUTs drifting? No, because the other references showed the same dip at the same time, meaning that the DVM’s reference was to blame. And that effect could be “deducted,” or at least ignored.

One day I got mad and grabbed a big double handful of these LM199AHs, soldered in a group of four, and averaged their outputs with small resistors (499 Ω). This output seemed quieter and less drifty. Well, let’s do it again. Soon I had four groups of four.

I compared the averaged output from eight LM199s to the other set of eight, and that was really good! Some tests showed less than 2 µV p-p for a limited bandwidth (4 Hz?). If I had averaged all 16, the output noise would have been even smaller! Most people don’t need to make such low noise as that, but by averaging several circuits, you can get a square-root advantage. Until you run out of steam, space, and power."

In the 2007 article:

"But if you have four groups of four, the chances that one will start drifting and won't be apprehended are quite small. Longterm drift can be fairly dependable. We sent some LM399s to the NBS/NIST, which found a long-term drift rate of about 1 ppm per 1000 hours when the die was self-heated to 88°C. "




From the IEEE proceedings, the ROC standards lab compared their lab's voltage reference against other countries' by hand-carrying two recently calibrated zener references through a month-long comparison run:

http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=278554





« Last Edit: August 17, 2010, 09:20:08 pm by saturation »
Best Wishes,

 Saturation
 


alm

  • Guest
Re: interested in hobbyist voltage reference standards?
« Reply #43 on: August 22, 2010, 06:03:26 pm »
Cool stuff. But eight LTZ1000 boards are not exactly hobbyist-level; I believe the LTZ1000 costs something like $50 each, and that's without any supporting circuitry (I think you need some precision components there, too). The noise of the LTZ1000 is a mixed bag: some are pretty good, some are pretty bad, so you might also have to select them for noise.
 

Offline rf-loop

  • Super Contributor
  • ***
  • Posts: 3143
  • Country: cn
  • Born with DLL21 in hand
Re: interested in hobbyist voltage reference standards?
« Reply #44 on: August 23, 2010, 05:13:12 am »
The noise of the LTZ1000 is a mixed bag, some are pretty good, some are pretty bad, so you might also have to select them for noise.

Do you have some data about this? And what part of the noise spectrum are you talking about? Manufacturer data: 1.2uV typical, 2uVp-p maximum at 0.1-10Hz. Many other voltage references' datasheets give only a "typical" figure; Linear also gives a maximum. If the manufacturer specifies a maximum, you can return a component that is out of spec (though it is not very easy to prove with measurements). For example, the very good (but not extremely good) REF102C reference has a "typical" 5uVp-p at 0.1-10Hz. This is also called 1/f or "flicker" noise, the most difficult noise type; and as f approaches zero we are really talking about voltage stability.

Compare this to the LTZ: 0.7 x 5 = 3.5 uV (scaling for a 7 V vs. 10 V output), so the LTZ1000 has about 1/3 of that noise. The RMS value for this kind and level of noise is around 0.3 uV if you take the 2 uVp-p maximum. In terms of peak values that means around +-0.15 ppm (0.3 ppm p-p).
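The arithmetic above, from peak-to-peak to RMS and then to ppm, can be sketched in a few lines. This is a rough sketch only: the 6.6 crest-factor divisor for converting Gaussian peak-to-peak noise to RMS, and the 7.1 V nominal LTZ1000 output, are my assumptions, not stated in the post.

```python
# Rough 0.1-10 Hz noise arithmetic for the LTZ1000, following the post above.
# Assumptions: ~7.1 V nominal output, and the common Gaussian rule of thumb
# that observed peak-to-peak noise is about 6.6x the RMS value.
V_OUT = 7.1        # volts, LTZ1000 nominal output (assumed)
PP_MAX = 2.0e-6    # volts p-p, datasheet maximum at 0.1-10 Hz

# REF102C: 5 uVp-p typical at 10 V, scaled to a ~7 V output for comparison
ref102_scaled = 5.0e-6 * 0.7                 # = 3.5 uVp-p

rms = PP_MAX / 6.6                           # ~0.3 uV RMS
ppm_pp = PP_MAX / V_OUT * 1e6                # ~0.28 ppm peak-to-peak
print(f"REF102C scaled: {ref102_scaled*1e6:.1f} uVp-p vs LTZ max {PP_MAX*1e6:.1f} uVp-p")
print(f"RMS ~ {rms*1e6:.2f} uV, ~{ppm_pp:.2f} ppm p-p (about +-{ppm_pp/2:.2f} ppm)")
```

The numbers come out as in the post: roughly 0.3 uV RMS and about 0.3 ppm peak-to-peak.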

Of course this needs heavy and difficult filtering, because you add many variables to the system, and it must stay clean and keep the absolute voltage level extremely accurate.

But what does "pretty bad" mean; how bad is pretty bad?
Is it worse than the datasheet?

Broadband noise is not such a difficult case. A Zener is always a heavy noise source; sometimes it is even used for that purpose.

How are the LTZ's long-term stability and noise if the "oven" is set to different temperatures? (HP uses a very high temperature for these... and that 30 nHz "noise" is a little bad, and it affects other things too.)

I have no data, because my measurement system's resolution is not enough for numbers I can trust, but I have some indirect signs that, for example, 8 parallel REF102Cs are nearly "good" (8 in parallel for lower 1/f noise); better than just one. I have no system to measure this "flicker" noise accurately, but with a low-pass filter whose corner frequency is 0.1 Hz I can get a 10 V reference whose noise goes clearly below my system's resolution.
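The sqrt(N) improvement from paralleling independent references can be illustrated with a quick simulation. This is a sketch only: it models the per-reference noise as uncorrelated Gaussian samples, which is a fair picture for averaging across separate devices but does not capture the 1/f behavior within a single device.

```python
import numpy as np

rng = np.random.default_rng(0)
N_REFS, SAMPLES = 8, 200_000
SIGMA = 1.0  # per-reference noise, arbitrary units

# Each row is one reference's (uncorrelated) noise; averaging the outputs
# of N references should cut the noise by about sqrt(N).
noise = rng.normal(0.0, SIGMA, size=(N_REFS, SAMPLES))
averaged = noise.mean(axis=0)

print(f"single ref: {noise[0].std():.3f}, 8 in parallel: {averaged.std():.3f}")
print(f"expected factor 1/sqrt(8) = {1/np.sqrt(N_REFS):.3f}")
```

With 8 references the simulated noise drops to about 0.35 of a single reference, matching 1/sqrt(8).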

(BTW, this Chinese forum is very interesting...)

« Last Edit: August 23, 2010, 05:21:14 am by rf-loop »
If practice and theory is not equal it tells that used application of theory is wrong or the theory itself is wrong.
-
Harmony OS
 

alm

  • Guest
Re: interested in hobbyist voltage reference standards?
« Reply #45 on: August 23, 2010, 07:45:30 am »
Can't find the reference right now, sorry, but when I was searching for experiences with the LTZ1000 I found some reports claiming that. Probably not worse than the max spec from the datasheet, but significantly worse than some other samples. I'm sure companies like Agilent and Fluke either select them or pay Linear lots of money for selected references. My conclusion was that to get a really good reference, I would have to buy multiple LTZ1000s ($50 x n) and find a method of selecting them. Plus some precision resistors.

High temperature is not great for noise and stability, but is obviously good for tempco.
 

Offline rf-loop

  • Super Contributor
  • ***
  • Posts: 3143
  • Country: cn
  • Born with DLL21 in hand
Re: interested in hobbyist voltage reference standards?
« Reply #46 on: August 23, 2010, 02:21:59 pm »

High temperature is not great for noise and stability, but is obviously good for tempco.

I do not believe this. (Yes, it is a kind of "easy road", but... not good.) Why think about tempco if there are no temperature variations? OCXOs, for example, have a long, long history of very accurate and easy temperature stabilization. (Why do they run at a quite high temperature? Because the crystal's tempco curve has its turnover point there. The Zener's tempco turnover point is not a problem, because the tempco is quite low, and the oven temperature control accuracy this needs is easy to achieve.) Fluke put everything "inside" the oven, ran the LTZ at a lower temperature... and what do we get? A surprisingly low-drift transfer reference. (Fluke also made a modification to the HP3458A, which is nearly alone in the highest high-end class of DVMs. The Fluke modification gives better long-term stability, AFAIK.)

Long-term drift is a worse problem than tempco. Tempco is not a problem, because you can get around it. The solution is simple: do not change the temperature. A good oven (always on, with backup power). There the temperature does not change, and you no longer need to think about tempco. But of course the temperatures and temperature differences need careful design, because thermal EMF is a BIG problem if the design is not perfect. I think one secret of some Flukes' stability is just the oven, and especially the quite low oven temperature. Compared with 80-90 °C, an oven at 60 or even 40 °C gives a huge advantage in aging drift, and maybe also in noise. A lower oven temperature also means less thermal hysteresis. (Maybe an old, aged LTZ is quite good regarding thermal hysteresis, but it is still there.) Of course the LTZ itself is "simple" with its own small oven, but... read the tempco in the datasheet; why is it there? Alone it is a good reference, good enough for the HP3458A for example, but if you need better... ovenize it, together with all the surrounding electronics, in an accurately adjusted oven, or use a couple of LTZs...

Noise: yes, I am sure there are differences, and most individual LTZs have no specified max noise. Also, "typical" noise is just some kind of average. This means that with good selection you can find individual parts whose noise is extremely low. Such differences may exist between manufacturing lots and, of course, also within a single lot. The distribution within a lot is maybe Gaussian. If you can select, say, the best 1% from some big lot, then maybe you have extremely low-noise LTZs (i.e., pick the best 1-2% of the cake). Yes, the manufacturer can maybe do this for you, but it needs... yes, money and/or other value, for example a "name".
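The selection idea can be made concrete with a toy simulation. All the numbers here are invented purely for illustration: the 1.2 uVp-p mean matches the "typical" spec, but the 0.25 uV spread and lot size are assumptions, since the real lot statistics are unknown.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical lot of 10,000 parts with Gaussian-distributed 0.1-10 Hz noise.
# Mean matches the "typical" 1.2 uVp-p; the 0.25 uV sigma is an assumption.
lot = rng.normal(1.2, 0.25, size=10_000)    # uVp-p per part

best = np.sort(lot)[: len(lot) // 100]      # keep the lowest-noise 1%
print(f"lot mean ~ {lot.mean():.2f} uVp-p, best 1% all under {best.max():.2f} uVp-p")
```

Under these assumed statistics the best 1% of the lot sits far below the typical figure, which is the whole point of screening.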

If practice and theory is not equal it tells that used application of theory is wrong or the theory itself is wrong.
-
Harmony OS
 

alm

  • Guest
Re: interested in hobbyist voltage reference standards?
« Reply #47 on: August 23, 2010, 07:01:09 pm »
I do not believe this. (Yes, it is a kind of "easy road", but... not good.) Why think about tempco if there are no temperature variations? OCXOs, for example, have a long, long history of very accurate and easy temperature stabilization. (Why do they run at a quite high temperature? Because the crystal's tempco curve has its turnover point there. The Zener's tempco turnover point is not a problem, because the tempco is quite low, and the oven temperature control accuracy this needs is easy to achieve.) Fluke put everything "inside" the oven, ran the LTZ at a lower temperature... and what do we get? A surprisingly low-drift transfer reference.
Not sure what your point is. I agree that putting it in an oven is the easy way out, but it's a definite improvement. Even if you have a device with a very low tempco, it will drift even less at an almost constant temperature. Are there any extreme-precision references that are not kept at a constant temperature? It seems like a cheap improvement for any reference. As for the high temperature, you want to be significantly above the highest operating temperature, since heating alone is much simpler than both heating and cooling.

(Fluke also made a modification to the HP3458A, which is nearly alone in the highest high-end class of DVMs. The Fluke modification gives better long-term stability, AFAIK.)
The Fluke modification indeed improved stability, and was just a single resistor change as far as I know (plus new calibration). Did that resistor have anything to do with the LTZ temperature?

Long-term drift is a worse problem than tempco. Tempco is not a problem, because you can get around it. The solution is simple: do not change the temperature. A good oven (always on, with backup power). There the temperature does not change, and you no longer need to think about tempco.
That's what the LTZ1000 does.

But of course the temperatures and temperature differences need careful design, because thermal EMF is a BIG problem if the design is not perfect. I think one secret of some Flukes' stability is just the oven, and especially the quite low oven temperature. Compared with 80-90 °C, an oven at 60 or even 40 °C gives a huge advantage in aging drift, and maybe also in noise. A lower oven temperature also means less thermal hysteresis. (Maybe an old, aged LTZ is quite good regarding thermal hysteresis, but it is still there.)
Agreed about thermal EMF. The operating temperature spec is probably quite low for the Fluke reference. The Agilent 3458A is specified for 0-55 °C; an oven temperature of 40 or 60 °C is not going to work well in a 55 °C environment. That alone may be a reason for the high oven temperature. But I agree that a lower temperature would probably be better for noise and stability.

Noise: yes, I am sure there are differences, and most individual LTZs have no specified max noise. Also, "typical" noise is just some kind of average.
Typical may not be the average; it has no guaranteed meaning. It might be the mean, or it might be that they once saw a part that was that good. Bob Pease considers typical specs worthless (and he probably wrote his fair share of datasheets) and only looks at min/max.
 

Offline rf-loop

  • Super Contributor
  • ***
  • Posts: 3143
  • Country: cn
  • Born with DLL21 in hand
Re: interested in hobbyist voltage reference standards?
« Reply #48 on: August 24, 2010, 05:29:24 am »
I do not understand what the problem is with any component's tempco (including the LTZ1000's) if the temperature outside the component does not change. Yes, OK... after that we have the total system tempco: a temperature change outside the oven affects the inside of the oven a little. This is not a problem for voltage references, because the oven temperature is not so critical. The best OCXOs are mostly DOCXOs (double oven), but only because the design is poor and they want a cheap solution. (Yes, there are single-oven OCXOs that are better than most double-ovenized ones; btw, the manufacturer was HP. It was an extremely cleverly designed oven.)

These ideas from good OCXOs can also be used for a hobbyist voltage reference, because there is no need to think about manufacturing costs. Of course some good instruments use this for the Vref... even with a very poor method like the LM399 (just some insulation on the chip; if the accuracy is OK, no problem).

I am not sure about the Fluke mod for the HP3458A, but I have read opinions that this resistor just affects the temperature.

If the spec's max temperature is +55 °C, I can ask whether a hobbyist or anyone else really needs an extremely good Vref at that temperature. Either way, 80 °C could drop to 60 °C, or by only ten degrees to 70 °C. I have heard that a 10 °C drop may bring a big advantage in aging; maybe even halving it.

If practice and theory is not equal it tells that used application of theory is wrong or the theory itself is wrong.
-
Harmony OS
 

alm

  • Guest
Re: interested in hobbyist voltage reference standards?
« Reply #49 on: August 24, 2010, 05:58:49 am »
I do not understand what the problem is with any component's tempco (including the LTZ1000's) if the temperature outside the component does not change. Yes, OK... after that we have the total system tempco: a temperature change outside the oven affects the inside of the oven a little. This is not a problem for voltage references, because the oven temperature is not so critical.
Sure, putting the reference in an oven would mostly negate the tempco (apart from the tempco of the oven itself ;)). The LTZ1000 or LM399 is basically an ovenized reference, so there is not much point in putting these in an oven. An oven is larger, probably more expensive (at least in a commercial setting), and uses more power (larger volume). Does it really matter whether the manufacturer put it in an oven or someone else did?

These ideas from good OCXOs can also be used for a hobbyist voltage reference, because there is no need to think about manufacturing costs.
True, you could build a custom oven. But are there references with good stability but bad tempco?

Of course some good instruments use this for the Vref... even with a very poor method like the LM399 (just some insulation on the chip; if the accuracy is OK, no problem).
What's wrong with the LM399?

If the spec's max temperature is +55 °C, I can ask whether a hobbyist or anyone else really needs an extremely good Vref at that temperature. Either way, 80 °C could drop to 60 °C, or by only ten degrees to 70 °C. I have heard that a 10 °C drop may bring a big advantage in aging; maybe even halving it.
Sure, for hobbyist purposes you could use a much lower temperature, just trying to explain HP's rationale.
 

