Continuity test
Results:
Continuity triggering is stable up to 15 Hz (0-4 V pk square wave, 50% duty).
Minimum pulse width reliably triggered is 15 ms (at a 1 Hz rate).
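For what it's worth, these two results are consistent with each other: at the 15 Hz limit, a 50% duty square wave has a half-period of about 33 ms, comfortably above the 15 ms minimum pulse width. A quick check, assuming an ideal square wave:

```python
# Quick consistency check of the continuity test results above.
# At 15 Hz with 50% duty cycle, each "closed" pulse lasts half a period.
freq_hz = 15.0
duty = 0.5
pulse_width_ms = duty * (1.0 / freq_hz) * 1000.0  # about 33.3 ms

min_pulse_ms = 15.0  # minimum pulse width reliably triggered (measured)

print(f"pulse width at 15 Hz: {pulse_width_ms:.1f} ms")
print(f"above the 15 ms minimum: {pulse_width_ms > min_pulse_ms}")
```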
I noticed that on my unit (with the 15R1 firmware) the continuity test was very slow. However, when I switched off the filter in VDC mode, the continuity mode reacts quickly again. So it seems the filter setting in VDC mode influences the speed in other modes; I assume that during your tests the filter was off.
I would prefer that in a future firmware version the filter is automatically switched off in continuity mode.
Correct.
Yes, I've noticed some filter on/off dependencies too. You are quite right; why would any filter be needed in a continuity test?
Page 96 of the SDM3055 User Manual talks about Application of the Analog Filter and specifies the introduced error.
Is the reading of 0.007mV within spec?
I think you are right; the analog filter described on page 96 explains the effect I'm seeing! So it is probably not a digital software filter that is used, but an analog filter in hardware that is activated.
I checked the manual for the other effect I mentioned (the jump in measured temperature; see my earlier post). The manual states that the DMM measures the cold-junction temperature. Could it be that this is measured only once, when entering temperature mode? If so, when you measure for a longer time and the cold-junction temperature changes, the DMM doesn't compensate for it while it stays in temperature mode; only when you leave the mode and re-enter it is the cold-junction compensation done again.
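To illustrate the compensation I suspect is involved, here is a rough sketch of how cold-junction compensation generally works. The linear 41 uV/deg C sensitivity is a type-K approximation I chose for illustration; it is not Siglent's actual algorithm, which would use proper thermocouple tables:

```python
# Simplified cold-junction compensation (CJC) sketch.
# Real meters use polynomial thermocouple tables; a linear
# approximation (~41 uV/degC for type K near room temperature)
# is enough to show the idea.
SEEBECK_UV_PER_C = 41.0  # assumed type-K sensitivity (illustrative)

def compensated_temp(v_measured_uv, t_cold_junction_c):
    """Hot-junction temperature from measured voltage plus CJC."""
    # Add the equivalent voltage of the cold junction, then convert back.
    v_total_uv = v_measured_uv + t_cold_junction_c * SEEBECK_UV_PER_C
    return v_total_uv / SEEBECK_UV_PER_C

# If the firmware sampled the cold junction only once (say at 20.0 degC)
# and the room later warms to 23.0 degC, the reading drifts even though
# the thermocouple input itself has not changed:
v_uv = 0.0  # hot junction at the same temperature as the terminals
print(compensated_temp(v_uv, 20.0))  # stale CJC value
print(compensated_temp(v_uv, 23.0))  # freshly measured CJC value
```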
Yes, a reading of 0.007 mV is within spec.
The lowest VDC range is 200 mV.
Specifications: accuracy is ±0.015% of reading + 0.004% of range.
So even if the reading (input voltage) is zero, the meter can still display ±8 µV (0.008 mV) and be within spec.
(There are also specifications for temperature range and temperature coefficient.)
If the range is 200 mV DC and the input voltage is exactly 100 mV, it may display anywhere between 99.977 and 100.023 mV and still be as specified (provided the calibration temperature was +23 deg C, the ambient is now between +18 and +28 deg C, and the meter has been on for at least half an hour).
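The arithmetic above can be written out explicitly; a small sketch using the numbers quoted from the datasheet:

```python
# DCV accuracy check: +/-(0.015% of reading + 0.004% of range).
def dcv_limits_mv(reading_mv, range_mv,
                  pct_reading=0.015, pct_range=0.004):
    """Return (low, high) acceptance limits in mV for a given reading."""
    err_mv = reading_mv * pct_reading / 100.0 + range_mv * pct_range / 100.0
    return reading_mv - err_mv, reading_mv + err_mv

# Zero input on the 200 mV range: +/-8 uV (0.008 mV) is still in spec.
print(dcv_limits_mv(0.0, 200.0))
# 100 mV input on the 200 mV range: limits are 99.977 .. 100.023 mV.
print(dcv_limits_mv(100.0, 200.0))
```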
Note also that measurements need to follow rules of thumb such as those in this handbook on the basic fundamentals of measurement:
http://www.keithley.com/knowledgecenter/knowledgecenter_pdf/LowLevMsHandbk.pdf
See, for example, page 3-3 on avoiding thermoelectric EMFs as an error source in low-voltage measurements.
Hi Eric.
Your assumptions were indeed correct.
The SDM3055 does use an analog filter that engages when the unit is powered up. The filter can improve measurement accuracy when measuring a DC voltage that has some ripple on it. We calibrate the DMM with the filter turned off, so there is a small error when the filter is turned on. You can, of course, use the Relative function to null this small offset out of your measurement.
On your thermocouple question, we do use a TI IC to measure the ambient (cold junction) temperature and use this value to compensate the measured temperature value. The 3055 continues to look up the cold-junction value throughout the Temperature measurement. We are still looking into this problem to find the reason for the offset but it is proving to be difficult for us to reproduce in our lab. We will continue to work at it in order to solve this offset problem.
If I can provide any information to help you to reproduce the problem please let me know.
I just measure the temperature for 10+ minutes, then switch to VDC mode and immediately back to temperature mode. If I measure the temperature for a few hours, I've seen the temperature jump by as much as 3 deg C. The ambient room temperature may have changed an equal amount in that time (but I'm not sure of that).
I've checked some values for the temperature jump effect. I recorded the temperature together with the voltage.
Five minutes after powering up the unit, I get a temperature value of 15.8 deg C (-5.8 uV); after switching to VDC and back to temperature I get 16.4 deg C (-5.1 uV).
After another 10 minutes, I get a temperature value of 14.4 deg C (-81.6 uV); after switching to VDC and back to temperature I get 16.0 deg C (-78.4 uV).
So it seems the measured voltage is stable (and correct), but the corresponding temperature has an offset. During the measurement the room temperature was stable at about 16 deg C (checked with another thermometer).
I hope this might help in diagnosing the problem...
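A quick sanity check of the numbers above supports that conclusion. Assuming a type-K thermocouple (roughly 41 uV/deg C near room temperature; the thermocouple type is my assumption, it is not stated in the thread), the voltage change across the mode switch is far too small to explain the temperature jump:

```python
# Sanity check: does the input-voltage change explain the jump?
SENS_UV_PER_C = 41.0  # assumed type-K sensitivity near room temperature

# Second measurement pair from the post above:
v_before_uv, v_after_uv = -81.6, -78.4   # measured voltage
t_before_c,  t_after_c  = 14.4, 16.0     # displayed temperature

dv_uv = v_after_uv - v_before_uv          # ~3.2 uV change at the input
dt_from_input_c = dv_uv / SENS_UV_PER_C   # ~0.08 degC equivalent
dt_displayed_c = t_after_c - t_before_c   # 1.6 degC displayed jump

print(f"input accounts for about {dt_from_input_c:.2f} degC")
print(f"display jumped by       {dt_displayed_c:.1f} degC")
# The remaining ~1.5 degC discrepancy must come from the compensation,
# not from the thermocouple input itself.
```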
Our dear friend Martin Lorton now has a third video posted on the Siglent SDM-3055 with the new faster firmware.
I'm off to watch it.....
Hi Everyone,
I have been comparing the SDM-3055 to my Agilent 34461A and noticed that the trend chart is not updating the display correctly when viewing ALL trend data. Attached are some photos of both units. I noticed this first with the R13 firmware. I also updated to the latest firmware version (15R1) and the bug is still present. The firmware is getting more stable, but it looks like there are still a few more major bugs to be squashed.
Has anyone else observed this issue?
I can confirm that each individual sample agrees to within approximately 0.1 mV; it just appears that the routine updating the display is not working correctly.
Cheers,
Al
Welcome to the forum.
Thanks for the feedback, I'll make sure Siglent is informed.
Thanks tautech ... I have been lurking on the forums for years, but never got around to registering or posting stuff.
Following up on the trend chart issue, I also saved the captured data to a USB stick and verified that the data collected is correct, so the bug is isolated to the graph updating routine.
I did notice another issue, this time with the acquisition delay. If you change it to manual mode and try to set the delay to 10 s by changing the unit from us to ms and then to s, it does not work: the value is not altered or applied (you can watch the trigger indicator at the top of the display). However, if you stay on the us scale and keep increasing the number, to 1000 us and then 100 000 us, it automatically changes units and applies the delay correctly. The next problem is that if you then try to set the unit prefix back to, say, ms, it does not work; the range is stuck in seconds and a reboot is necessary to reset the state.
Sorry if the description is a bit convoluted, but even taking photos does not really help.
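In case it helps, what I would expect a unit-prefix change to do is just a pure rescale of the displayed number, keeping the physical delay constant. A sketch of that expected behaviour (a hypothetical helper, not the actual firmware):

```python
# Expected trigger-delay unit handling: changing the unit prefix
# should rescale the displayed number so the delay itself never changes.
PREFIX_SCALE = {"us": 1e-6, "ms": 1e-3, "s": 1.0}

def change_unit(value, old_unit, new_unit):
    """Re-express the same physical delay in a new unit."""
    seconds = value * PREFIX_SCALE[old_unit]
    return seconds / PREFIX_SCALE[new_unit]

# 100 000 us -> 100 ms -> 0.1 s: same delay, three representations.
v = 100_000.0
v = change_unit(v, "us", "ms")
print(v)
print(change_unit(v, "ms", "s"))
```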
Cheers,
Al
Yep, I see the same issue.
I checked using a function generator that outputs a ramp with a period of 15 minutes. On the trend chart I see, after a few hours, that the display "squeezes" the data. The oldest data (on the left) is stretched, while the newest data (on the right) is compressed.
Hello Al.
Thank you for reporting this.
Engineering is aware of the issue, and I am told they are testing a solution now. If I hear anything soon I will let you know; otherwise, check for new FW versions from time to time.
Steve
I'll report the acquisition delay issue as well.
Thanks
For those that haven't seen it yet, Martin Lorton has a follow-up video regarding the SDM-3055 firmware versions P15R1 and P15R2.
First, a big thanks to Siglent, as they fixed my reported problem where the measured temperature jumped when switching between DCV and temperature. In the new release 15R2 this problem is fixed.
It did not change another effect I see during temperature measurement; see the attached screenshot. The measured value jumps in 0.5 deg steps up and down before settling on the new value. I suspect it might not be a problem but an effect of the cold-junction compensation, which might be done in 0.5 deg steps. It is within spec, but it looks strange.
What do you think?
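To illustrate my suspicion: if the cold-junction value is read or applied with only 0.5 deg C resolution, the displayed temperature will step in 0.5 deg C increments even with a perfectly smooth input. A toy model of that mechanism (my guess, not confirmed by Siglent):

```python
# Toy model: cold-junction value quantized to 0.5 degC steps.
def quantize(t_c, step=0.5):
    """Round a temperature to the nearest multiple of `step`."""
    return round(t_c / step) * step

# A slowly drifting ambient temperature...
ambient = [22.10, 22.20, 22.30, 22.40, 22.60, 22.80]
# ...produces a compensation value that jumps in 0.5 degC steps.
print([quantize(t) for t in ambient])  # [22.0, 22.0, 22.5, 22.5, 22.5, 23.0]
```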
Hmmm, as I interpret Martin Lortons video when the filter is activated a capacitor is connected to the input terminals? Thus loading the measured voltage with a substantial capacitance? That sounds really crusty, I hope I got this wrong.
Just received my Siglent SDM3055 yesterday. I immediately updated the firmware to the latest (1.01.01.15R1). I left the unit in continuity mode when I shut it down last night. This morning when I turned it on, it started beeping like crazy with random low-resistance readings on the screen. I assumed the leads just happened to be touching, but they were not. I unplugged the leads and it still kept beeping. I shut it off and then back on. It came up in DC volts and everything seemed normal. I tried to duplicate it and it didn't have the issue. Tried it again and sure enough, the problem is back. As I type it is going crazy. I will record a short video. Has anyone else seen this? Can anyone else check their unit for this behaviour?
Elrod
Most unusual.
Can you confirm FW 15R1 is shown in the UI as the installed FW?
I'll also point Siglent to your post.
I checked my meter and saw the same behaviour. I used firmware 15R2, but also saw it with older versions.
Siglent informs me this may be another bug and they are fixing it ATM.
Eric, Where did you get 15R2, I could not find a download link for it?
I got it directly from Siglent as a pre-release test of the fix they made for the temperature measurement. They said it will be available from their site...
Tautech, I can confirm my version is 15R1.
Anyone have any news on when an updated firmware might be released for the SDM3055? The last update was in September, I believe.