Re: the Keithley....
I'm honestly not sure why that exists. Say I'm looking at PA transistor current: as the transistor gets hotter, the current creeps up, and it's difficult to read a trend like that over a five-second period with any kind of digital metering. The real-time trend graph is good for sure, but you can see the same thing watching a needle creep up, and a needle doesn't cost $1750, which makes it a bad ROI unless you spend all day looking at it. If I want to log something over time, I'd use a generic bench meter with a PC interface and do it there. You're very limited in what you can do with the data on the device itself when Excel and Mathematica are available on something larger and more powerful.
Some analogue action from me to demonstrate the current metering response...
Not sure I get the use cases for it. I built similar test platforms years ago out of GPIB instruments, by composition. Composing instruments is always more powerful than having a black box: you can reuse the instruments for other jobs later, once your production run is over.
Also, as always with these things, they tout 6.5 digits and a 1 MS/s sample rate, but look at the datasheet. To scan all those channels you need a multiplexer card with a roughly 3 ms switching time, so if you have 80 channels, then with settling time included you're down to about one reading per second per channel, and those relay channel switches have a pretty finite lifespan! It also only has 512 memory slots for readings. Once you start adding options, it makes some of the competitors from Keysight look cheap.
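To see where the "one reading per second" figure comes from, here's a quick back-of-the-envelope sketch. The ~3 ms switch time comes from the discussion above; the 9 ms settling time is my assumption purely for illustration:

```python
# Back-of-the-envelope scan-rate estimate for a multiplexed DMM.
# switch_s is the relay switching time, settle_s the settling time
# allowed per channel before taking a reading.

def scan_rate_per_channel(n_channels, switch_s, settle_s):
    """Return how often each channel gets read (readings/second)."""
    time_per_full_scan = n_channels * (switch_s + settle_s)
    return 1.0 / time_per_full_scan

rate = scan_rate_per_channel(n_channels=80, switch_s=3e-3, settle_s=9e-3)
print(f"{rate:.2f} readings/s per channel")  # ~1 reading/s per channel
```

The headline 1 MS/s only applies to a single channel with no switching; the moment you multiplex, the relay and settling times dominate.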
On top of that you get to program it via a shitty touch-screen UI made by the lowest bidder, or some weird-ass proprietary language and development toolchain. And I know this from experience: test gear companies can't write software.
We had similar stuff back in the late 1990s: two 34401As and two channel multiplexers driven from a rack-mount PC running Visual Basic, and it could do this sort of thing.
Composition > monolithic devices.
You are referring to this, I assume?
http://thesignalpath.com/blogs/2019/08/04/keithley-daq6510-6-5-digit-data-acquisition-multimeter-review-teardown-experiments/
On DSOs, I guess I don't get much of what they are about. Oscilloscopes are for looking at waveforms, right?
But in what cases do you need to store waveforms rather than just look at them in real time?
Meters are for accurately checking voltage and current. Spectrum analyzers are for looking at the frequency domain. Logic analyzers are for looking at digital signals. So why is so much piled into these low-cost DSOs?
There are no people saying "we should be using old analog ammeters, those are better". That would be just silly.
Ooh, a challenge.
Try measuring charge with a modern meter.
At school we measured charge using a ballistic galvanometer. Principle: dissipate the charge through the meter in a time that is much shorter than the meter's response time. The needle is kicked and the maximum deflection is proportional to the current*time, i.e. charge.
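The ballistic principle is easy to sketch numerically: drive an underdamped second-order meter movement with a current pulse much shorter than its natural period, and the peak deflection comes out proportional to the total charge. A toy simulation (all movement constants are invented for illustration):

```python
def peak_deflection(charge, pulse_s=1e-3, dt=1e-5, t_end=1.0):
    """Euler-integrate J*th'' + c*th' + k*th = kt*i(t) for a short
    rectangular current pulse carrying `charge` coulombs; return the
    peak deflection of the needle."""
    J, c, k, kt = 1e-4, 2e-4, 1e-2, 1.0   # made-up movement constants
    # natural period ~0.63 s >> 1 ms pulse, so the pulse acts as an impulse
    i_pulse = charge / pulse_s
    th, om, peak, t = 0.0, 0.0, 0.0, 0.0
    while t < t_end:
        i = i_pulse if t < pulse_s else 0.0
        alpha = (kt * i - c * om - k * th) / J  # angular acceleration
        om += alpha * dt
        th += om * dt
        peak = max(peak, th)
        t += dt
    return peak

p1, p2 = peak_deflection(1e-6), peak_deflection(2e-6)
print(p2 / p1)  # ~2.0: peak deflection scales linearly with charge
```

Doubling the charge doubles the kick, which is exactly why the maximum deflection reads out charge directly.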
Well, I'm sure you'd have trouble measuring liters or kilos with a digital multimeter too... A ballistic galvanometer is a specialized instrument; if it were something commonly used, multimeters would have a mode for it.
You could use a meter with high input impedance and a known capacitor, and measure the peak voltage.
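For the capacitor approach the arithmetic is just Q = C·V. A trivial worked example (the component values are made up):

```python
# Estimate charge by dumping it into a known capacitor and reading
# the peak voltage with a high-impedance meter: Q = C * V_peak.

def charge_from_cap(c_farads, v_peak):
    return c_farads * v_peak

q = charge_from_cap(10e-6, 2.5)   # 10 uF charged to 2.5 V
print(f"{q * 1e6:.1f} uC")        # 25.0 uC
```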
Never to forget: the digital domain is a simplification of the analog domain and carries intrinsic limitations with it.
This would make a great tee-shirt. Sure would burst a lot of bubbles
Nonetheless, analogue ammeters are better for measuring charge, and using them for that purpose isn't silly.
That meter+capacitor technique won't work in many interesting cases.
"This would make a great tee-shirt. Sure would burst a lot of bubbles"
What bubbles would it burst?
CatalinaWOW, I assume you are looking for glitches in digital signals. I don't know if that will ever be a practical use case for me, and if I do need that in the future, I'll have a better idea then of what I need. As tech marches on, there will likely be better tools for the task by that point anyway.
"Never to forget: the digital domain is a simplification of the analog domain and carries intrinsic limitations with it."
"This would make a great tee-shirt. Sure would burst a lot of bubbles"
"What bubbles would it burst?"
For starters, anyone who looks at a digital display and assumes it more accurate than an analog one. I could get into recorded music, but I think you get my drift.
I would argue that in most cases a digital display is more precise; whether it is also more accurate depends on a lot of things.
[..]
Disadvantages
1) DSOs are untruthful by their nature. Unless you know beforehand what you are looking for, what is on the screen may have nothing to do with reality, which creates a particular problem when you are troubleshooting... You have to check and cross-check what you see to be sure.
[..]
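A classic example of the screen "lying" is aliasing: sample too slowly and a fast signal shows up as a slow one, with nothing on screen to warn you. A quick sketch (the frequencies are chosen purely for illustration):

```python
import math

fs = 10_000       # sample rate: 10 kS/s, so Nyquist is 5 kHz
f_real = 9_000    # actual signal: 9 kHz, well above Nyquist
f_alias = 1_000   # what shows on screen: |9 kHz - 10 kHz| = 1 kHz

for n in range(50):
    t = n / fs
    sampled = math.sin(2 * math.pi * f_real * t)
    alias = -math.sin(2 * math.pi * f_alias * t)  # phase-inverted 1 kHz sine
    assert abs(sampled - alias) < 1e-9  # sample sets are indistinguishable
print("a 9 kHz sine sampled at 10 kS/s looks exactly like a 1 kHz sine")
```

Unless you already suspect the real signal is 9 kHz, nothing in the captured data can tell you the pretty 1 kHz trace is fiction.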
james_s, I think touchscreens are a very good fit for specific purposes that do not require excessive precision: tracing or highlighting regions (for a mask, for example), or quickly accessing menus or functions that float around the screen. Feedback mechanisms such as button highlighting or haptics also help a lot with touchscreen operation; haptics are usually quite responsive, but visual aids depend heavily on the CPU load of the HMI and are rarely well implemented.
Bringing touchscreens to a precision fight is a very bad idea IMHO... Setting values on a dial or slider interface, or even trying to highlight or move a cursor within a line of text (so common when posting on EEVBlog from the cellphone, for example), is an exercise in patience with the occasional frustration. For those tasks, bring me a mechanical/physical interface any day.
Oh well... I wonder if our line of thought will become extinct as our generation ages...
That's why many modern cars stick up an annoying disclaimer on the centre screen at startup, pointing out the risk and asking you to confirm.
"Heads down" syndrome is implicated in many aircraft crashes, as can be determined from the two flight recorders. Unfortunately cars don't have those, so it will be difficult to prove.
"That's why many modern cars stick up an annoying disclaimer on the centre screen at startup, pointing out the risk and asking you to confirm."
I wonder if that is sufficient.