1689 LCR Meter Specifications
zrq:
Here I have attached some photos of the main board of my GR 1693, taken during battery replacement. I didn't find any difference from the 1689 except the sticker on the ROM. The board also looks very similar to the GR 1659's: https://www.eevblog.com/forum/testgear/quadtech-1659-rlc-digibridge-project/msg5534243/#msg5534243 I didn't dare take off the ROM chip to make a copy, as it is too close to the calibration RAM and I'm afraid of losing the calibration constants if I bend the board too much and break the battery contact. Hopefully someone in the future with a newer unit that uses a Dallas BBRAM can make such a copy.
TimFox:
--- Quote from: rodpp on October 15, 2021, 02:19:25 pm ---One interesting thing about the 1689 is that the calibration is performed using only a set of four known resistors of specific values: 95.3 kΩ, 5.97 kΩ, 374 Ω and 24.9 Ω. So, if you have a resistance decade box and a means of measuring these values with high precision (10 ppm), you can "easily" calibrate the 1689. IET Labs sells this set of resistors as a kit, NIST traceable, but expensive: https://www.ietlabs.com/1689-9604-digibridge-calibration-kit.html
--- End quote ---
The inherent calibration of the GR Digibridges and later similar LCR meters relies on precision calibrated resistors and an accurate 90° phase difference between two AC waveforms at the test frequency. This includes later low-cost units such as the DE-5000. Earlier bridges relied on precision reference capacitors. The electronics measure the voltage across the DUT and the current through it with phase-sensitive detectors driven by the two quadrature reference waveforms; the precision resistor is used in the current measurement. The “calibrate” function in these meters (open and short termination) compensates for the parasitic components of the test fixture, while the impedance calibration itself relies on the reference resistor.
zrq:
I have noticed several (likely) bugs in the firmware of my GR 1693 (or some very subtle malfunctions, bit rot?), not to mention the general annoyance of the terribly designed UI and GPIB interface.

The first, minor one is that the measurements sometimes have higher noise. Although still within spec, the noise can be much higher than the "2 ppm" variance touted in the application notes for slow mode; when it happens, it is in the range of 4 ppm to 7 ppm. When the instrument boots up, it defaults to this noisy state, and interestingly, when I switch the frequency from the default 1 kHz to 1.22 kHz (or many other arbitrary values) and back to 1 kHz, the noise gets much closer to the "2 ppm" figure, around 2.8 to 3 ppm. Rebooting the Digibridge reproduces this consistently.

The second, major bug is that when you set the frequency to 0.234375 kHz (= 60 kHz/256), the reading jumps around all over the place, while setting the frequency to any nearby value gives perfectly correct and stable readings.
zrq:
Here I have attached the IH test voltage waveforms for 0.234375 kHz and the nearby 0.23622 kHz. There is something very wrong with the primitive DDS generation of the sine test signal.
zrq:
Given that 11.719 Hz is precisely 1/20 of the set frequency and equal to 3 kHz/256, I suspect that in this case the pre-divider U19 for the test frequency is incorrectly set, producing 0.96 MHz instead of 3.84 MHz, while the correct division factor of 256 is applied...

Update: It gets a bit more mysterious. IET support replied that they cannot reproduce the 0.23437 kHz problem on production 1689 and 1693 units, and they explicitly said they cannot fix my unit given its age.