Calibration of an ANENG AN870
bdunham7:
--- Quote from: fant on October 25, 2020, 04:37:11 pm ---My request was about how to calibrate the unit; whether it then turns out to be close, far off, or completely out of order is my problem.
And only one guy (whom I thank) helped me; the others just used bits to move air.
--- End quote ---
For someone with three HP 3458s, you are asking a rather imprecise question.
You appear to be actually asking "how do I change the gain constants in the calibration memory" or perhaps "what is the manufacturer's calibration procedure". The second might be the better question, were it not for the fact that the product is an unsupported basement-bargain POS. In either case, user floobydust apparently has some experience or knowledge, and it looks like it would be a challenge given the complete lack of support you generally find at this level of product.
However, if you are motivated by anything other than idle curiosity as to how to enter new gain constants, and you actually care whether the meter is accurate or not, you would want additional data, such as what it reads at 5.00000 V and 15.00000 V. You certainly have the means to check that. And even if you only want to measure 10 volts for some reason, how consistent is that 10.0008 V, and under what conditions? Absent a specific manufacturer procedure, unless you take the time to characterize the meter for linearity, tempco and stability, you aren't 'calibrating' anything; you are just doing the modern version of twiddling the knobs without a plan.
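To make that concrete, here is a minimal sketch (Python, with made-up readings that are a uniform 800 ppm high) of what 'characterizing' means at the simplest level; substitute readings you actually take against one of your reference meters:

--- Code: ---
# Quick check of whether the AN870's error is a plain gain error.
# Hypothetical readings only; replace with values taken against a reference meter.
readings = {
    5.00000: 5.004,     # applied volts : AN870 reading (example values)
    10.00000: 10.008,
    15.00000: 15.012,
}

for applied, indicated in readings.items():
    error_ppm = (indicated - applied) / applied * 1e6
    print(f"{applied:9.5f} V -> {indicated:7.3f} V  ({error_ppm:+.0f} ppm)")

# If the ppm error is roughly the same at every point, it is a simple gain
# (slope) error; if it varies a lot, re-writing one gain constant won't fix it.
--- End code ---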
fant:
Good morning;
I use all the meters in the lab at a temperature of 20 to 23 °C; look here: https://www.eevblog.com/forum/metrology/vintage-weston-element-is-it-any-good/msg3101976/#msg3101976, first picture.
Sorry if I left out the words "calibration memory", but anyone who has already done that work knows there is an EEPROM inside.
Anyway, this is the workhorse we have in the lab (actually 5 units), and all of them read a little higher than the correct value.
For this reason I wanted to test, on one of them, the possibility of correcting the reading.
For official use in front of customers we use the Agilent 34401, which is periodically calibrated by an external company.
best regards
2N3055:
--- Quote from: fant on October 25, 2020, 04:37:11 pm ---I think I know how they are made, having three 3458s in the lab (one of them, the one in the picture, powered 24/7).
My request was about how to calibrate the unit; whether it then turns out to be close, far off, or completely out of order is my problem.
And only one guy (whom I thank) helped me; the others just used bits to move air.
Best regards
--- End quote ---
LOL.
No, the others didn't just move air. They went out of their way to politely explain to you that your question is, well, nonsense, and wanted to save you the grief of wasted time for no particular result.
It makes no sense to adjust that very cheap instrument to better than 500 ppm accuracy, because it is not stable enough to hold that across even a few °C around 23 °C. It will also drift with humidity changes, and will drift in general because of the components used.
It is a miracle that they are as accurate and stable as they are.
They merely wanted to point out that it is not useful to do so. If you want to do it for fun, without the expectation that it will still hold the adjustment two weeks later, then by all means go ahead.
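To put that 500 ppm in perspective, a rough back-of-the-envelope sketch (my assumed figures, taking the AN870 as a 19999-count meter, so 1 mV per count on the 20 V range):

--- Code: ---
# What 500 ppm means on a 10 V reading, assuming a 19999-count meter
# (1 mV per count on the 20 V range). Illustrative numbers only.
applied = 10.0        # volts applied
ppm = 500             # accuracy the meter can realistically hold
count_size = 0.001    # volts per count on the 20 V range (assumed)

error_v = applied * ppm / 1e6
print(f"{ppm} ppm of {applied} V = {error_v * 1000:.1f} mV "
      f"= {error_v / count_size:.0f} counts")
# -> 5 mV, about 5 counts; drift with temperature and humidity in this class
#    of meter can easily be that large, so a tighter adjustment won't hold.
--- End code ---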
bdunham7:
--- Quote from: fant on October 25, 2020, 06:01:51 pm ---Anyway, this is the workhorse we have (actually 5 units) in the lab and all are with a reading little bigger than the correct one.
For this reason I wanted to test on one of them the possibility to correct the reading.
For the official use in front of the customers we use the Agilent 34401 that is periodically calibrated by an external company.
--- End quote ---
Fascinating that you have not one but three reference meters, when I can't possibly justify even one, yet your everyday 'workhorse', in a use where precision appears to matter, literally costs less than my test leads. You have to appreciate that many will find that difficult to understand!
So, as for getting your meters to read accurately: is the high reading consistent over time and scale? Do you get 10.008 volts from all of them every time you try? If not, how much variance? And how about 5.00000 and 15.00000? Is it 5.004 and 15.012? If not, what is changing the gain constant really going to get you? If so, and assuming you don't solve the issue of changing the calibration memory, perhaps you could simply try adding 8K of resistance to the test leads and see if the results are acceptable. If that works, find the 10M input resistor and add 8K to it, or remove and measure it and substitute a low-tempco resistor with about 8K more resistance. From another thread, I think the 10M input is R29 + R30, but I'm not at all sure since I don't have an example here.
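A quick sketch of the arithmetic behind that suggestion, assuming the nominal 10M input resistance and a reading that is 800 ppm high (these are my assumed figures, not a verified procedure for this meter):

--- Code: ---
# How much series resistance would cancel a fixed gain error, assuming the
# AN870 presents a 10 Mohm input resistance. Example figures only.
R_INPUT = 10e6                    # nominal input resistance (assumed)
error = 10.008 / 10.000 - 1       # reading 800 ppm high (example)

# A series resistor R_s divides the input: V_seen = V_in * R_INPUT / (R_INPUT + R_s).
# To cancel the gain error we need R_INPUT / (R_INPUT + R_s) = 1 / (1 + error),
# which gives R_s = R_INPUT * error.
R_series = R_INPUT * error
print(f"error = {error * 1e6:+.0f} ppm -> add about {R_series / 1e3:.1f} k in series")
# -> roughly 8k, which is where the 'add 8K to the 10M input' figure comes from.
--- End code ---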
fant:
Nothing fascinating.
In normal work, when you need to check the voltage of a power supply you use the 870; when you need to check the reference for an 18-bit A/D converter that reads altitude, you use the 3458.
And during tests with customers, only the 34401 is allowed.
Anyway, the discussion is not going in the direction of solving the problem, so for me it is closed.
Best regards