Stuff like measuring in the sample data.
Gating for measurements.
The Agilent DSOX3000 has no button to switch between auto and normal trigger mode, even though this is something you need all the time.
Generally, I'm underwhelmed by the displayed precision for cursor positions and measurements. With 1GSa/s and above, it should somehow be possible to measure whether a pulse is 100µs or 100.01µs. That difference is 10ns, which at 1GSa/s is 10 points in the sample data.
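To illustrate the point: at 1GSa/s the sample period is 1ns, so the 10ns difference between a 100µs and a 100.01µs pulse spans ten sample points. A minimal sketch (plain Python, with an assumed mid-level threshold; not any vendor's actual measurement code) of measuring pulse width directly from the sample record:

```python
# Sketch: measure pulse width from raw sample data, not the display.
# At 1 GSa/s each sample is 1 ns apart, so a 10 ns difference between
# two pulse widths is 10 samples -- resolvable if the scope measures
# on the acquisition record rather than the decimated screen data.

SAMPLE_RATE = 1e9   # 1 GSa/s (assumed)
THRESHOLD = 0.5     # mid-level crossing threshold (assumed)

def pulse_width_seconds(samples, threshold=THRESHOLD, rate=SAMPLE_RATE):
    """Width of the first positive pulse, from threshold crossings."""
    rising = falling = None
    for i in range(1, len(samples)):
        if rising is None and samples[i - 1] < threshold <= samples[i]:
            rising = i
        elif rising is not None and samples[i - 1] >= threshold > samples[i]:
            falling = i
            break
    if rising is None or falling is None:
        raise ValueError("no complete pulse found")
    return (falling - rising) / rate

# A 100.01 us pulse at 1 ns per sample is 100010 high samples.
record = [0.0] * 50 + [1.0] * 100010 + [0.0] * 50
print(f"{pulse_width_seconds(record) * 1e6:.2f} us")  # -> 100.01 us
```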
I REALLY wish there was competition. I mean, okay, 1 gigasample... what's the speed of a standard ADC in an off-the-shelf notebook you find now?
They have 24-bit, 5-gigasample ADCs in there.
AND quad-core 1.4GHz 35W chips... I mean, what's stopping Agilent from making cheaper stuff? They have HP by their side and a fab of their own (they make the MegaZoom ASICs, so I assume they have a fab of their own).
And only 40k record length. This kinda destroys all possible benefits. A 2ch scope with such a tiny sample memory buffer doesn't look like a real bargain for this price.
Apart from this, the problem remains that USB-based devices with vendor-specific drivers will quickly become outdated, as will the control software. You can pretty much bet that even big brands won't support you for ten years with USB drivers and software updates, not to mention small companies that might not even exist any more in ten years. This is an acceptable risk for a logic analyzer costing a few hundred €, but it's just a bad idea for a scope that costs a few thousand €.
So honestly, I don't believe that USB scopes in their current form will ever take a big share of the market and USB3 won't change this in any way.
It may be usable at a lower sampling rate, just like the long-memory option on Rigol scopes, but at that point you're saving the relatively cheap DRAM, not the expensive RAM that can run at full speed. I'm not sure how much BOM cost this saves, especially if you factor in the premium for USB 3.0 transceivers.
Well, it seems pointless to argue this. You don't seem to feel that USB 3.0 scopes will make any difference in the market - I disagree. We'll just have to wait and see.

Quote: "Sure, it has more features, though I'm not sure the Agilent is a fair comparison, since its banner spec is the fast update rate. My point was that this is the cheapest scope made by Picoscope (and I can't think of too many cheaper competitors) that exceeds the entry-level scope that has been dominated by the Rigol DS1052E/1102E for the past five or so years. And you pay $600 extra for the lack of a screen and controls."
I'm not saying that USB scopes couldn't be priced lower (I think Velleman has made inroads in this direction), but I think you're a bit out of touch with the market. The Picoscope 2207 - which has 100MHz bandwidth and a 1GS/s rate, plus a built-in AWG and function generator - lists for $740 on the Picoscope website.
Also, I don't personally feel the lack of a tiny, old-fashioned 320x240 screen and some cheap controls matters if you get a more powerful tool with greater capabilities. It seems that at least some of the disdain for USB devices on this blog comes from 'traditionalist' engineers (and the like) who seem to believe that any piece of test equipment that isn't a standalone device is not 'serious' equipment. This whole attitude strikes me as rather hilarious. I no longer own any standalone media devices (television, DVR, stereo, etc.) - everything in my home is driven from HTPCs. I expect my lab to keep moving in that direction as well.
Agilent measures in the sample data, not on the screen.
"Measurements and math functions will be recalculated as you pan and zoom and turn channels on and off.
As you zoom in and out on a signal using the horizontal scale knob and vertical volts/division knob, you affect the resolution of the display. Because measurements and math functions are performed on displayed data, you affect the resolution of functions and measurements."
Explain 'gating', please... Do you mean triggering, or the capability to stop recording when nothing is going on and then continue? Agilent calls the latter 'segmented memory'.
Some of their low-end scopes have that, but it is an option you need to pay for; it is not standard.
Hitting a button repeatedly cycles through modes on some Agilent scopes. So if you press the 'Trigger' button, the trigger menu comes up; hit the same button again and it changes the most logical setting (in this case, trigger mode).
The way you do this is by making a split view and zooming in on both edges: rising edge in window 1, falling edge in window 2. Then you get the required number of digits. If you zoom all the way out, they fast-step the cursor and turn off the trailing digits, since they never change anyway... there is no point in showing 1.1000000000. As you zoom in, you get more trailing digits.
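The trailing-digit behaviour described above can be sketched roughly like this (an illustration of the idea only, not Agilent's actual algorithm; the cursor step is assumed to be a decade value such as 0.1 or 1e-9):

```python
import math

# Rough illustration: only show the cursor digits that can actually
# change at the current zoom level, i.e. digits finer than the cursor
# step size are suppressed when zoomed out.

def cursor_readout(seconds, step):
    """Format a cursor time, dropping digits finer than the step size."""
    decimals = max(0, -int(round(math.log10(step))))
    return f"{seconds:.{decimals}f} s"

print(cursor_readout(1.1, 0.1))    # zoomed out -> "1.1 s"
print(cursor_readout(1.1, 1e-9))   # zoomed in  -> "1.100000000 s"
```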
Quote: "I REALLY wish there was competition, i mean like okay 1Gigsamples... What's the speed of a standard ADC converter in an off the shelf notebook you find now? They have 24 bits 5gigsamples ADC's in there. AND quad-core 1.4GHz 35W chips... i mean what's stopping agilent to make cheaper stuff? They have HP by their side and a fab of their own (they make the megazoom asics so i assume they have a fab of theirs)"

What have you been smoking?
Notebooks with 5-gigasample ADCs? And 24-bit? You have no idea what you are babbling about, do you?
Agilent has nothing to do with HP anymore; they went their separate ways long ago. Neither Agilent nor HP has a wafer fab (for digital monsters) at their disposal. The fabs went to Avago.
Besides, those fabs are for optical things.
I know who fabs the Agilent ASICs (can't tell you though... NDA). Those chips are extremely expensive due to the large amount of on-board SRAM, which has to be able to run at full speed. The second problem is that they use gigantic internal pathways and have hardware-assisted processing. The development costs of the Lynx (Lynx is one of the MegaZoom ASICs) are staggering. With the relatively small volumes they run, a single chip costs well over a thousand dollars.
The chipset in a $35,000 Agilent scope easily costs them $15K just to fab. An InfiniiVision chip with 64 or 128 MB of RAM can handle one channel (so you need four of these beasts in that $35K scope); add the multistage pipelined ADC needed per channel and you hit that $15K mark without problems...
The slower/cheaper machines use external RAM modules, but the ASICs are still very expensive.
I'd have to agree with marmad that USB 3.0 will allow attractive low price designs. It should allow streaming a couple hundred MBytes/sec.
I actually think PCIe-based scopes are also ripe for massive improvements in price/performance. A PCIe 2.0 x16 slot can stream a few gigabytes/sec (my bland video card can transfer 5+ GBytes/sec from RAM). It should be possible to stream the sample data directly into GPU memory and then have something like 300 GPU cores processing it. A cheap GPU has a gigabyte of RAM, and 300 GPU cores might be enough to process ALL the samples (i.e. essentially zero dead time), in real time, for an intensity-graded display.

This will take a bunch of high-performance software though, and the track record of oscilloscope companies writing good software has been mixed. I could actually imagine an open-source, GPU-accelerated oscilloscope, with the ADC card coming from somebody else, or multiple ADC card brands that all work with one piece of software. Notice that companies that make video cameras often don't make video editing software, and I could imagine a similar hardware/software split happening in test equipment.

I'm probably biased though; I'm a server system software engineer, not an oscilloscope hardware designer. Still, for $150, I can now buy a TFLOP of computing power that pops into a $500 computer. A lot of GPU power is being built into the CPU now too. It would certainly be worthwhile for somebody to figure out what a modern GPU could do in terms of processing a sample stream. Maybe 300 cores is enough to do software-based triggering (i.e. finding cyclic patterns in the samples), making the ADC card even simpler.

Being heavily software-based, you could change the features by writing code. If it were open source, YOU could change the features, or you could pay for commercial software and let software folks change the features. So you need an analog front end, 1 to 4 ADC chips, an FPGA for PCIe bus interfacing, software, and an appropriate computer with an appropriate GPU.
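As a taste of the software-based triggering idea, here is a minimal sketch: scanning a raw sample stream for rising-edge crossings of a trigger level. It's a plain-Python stand-in for what would be a per-block GPU kernel; the trigger level and holdoff values are arbitrary assumptions:

```python
# Sketch of 'software triggering': find rising-edge crossings of a
# trigger level in a raw sample stream. A GPU version would run this
# search in parallel over many chunks of the stream at once.

def find_triggers(samples, level, holdoff):
    """Indices where the signal crosses `level` upward, spaced >= holdoff."""
    hits, last = [], -holdoff
    for i in range(1, len(samples)):
        if samples[i - 1] < level <= samples[i] and i - last >= holdoff:
            hits.append(i)
            last = i
    return hits

# A repetitive waveform: 0,0,1,1 repeated -- a rising edge every 4 samples.
stream = [0, 0, 1, 1] * 5
print(find_triggers(stream, 0.5, 3))  # -> [2, 6, 10, 14, 18]
```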
What do you expect your average PC to do with these hundreds of MB/s?
...GPU...GPU...GPU...GPU...
Quote: "If it let you do gigabit transfers, then I could see companies jumping on it because then all the scope would need to do is digitize the signal and send it to the computer."
Um... I think you mean gigaBYTE transfers, don't you? Because it DOES let you do gigabit transfers - 5Gb/s max. In terms of transferring captured data, probably something closer to half that (2.5Gb/s) when you consider overhead. That means it's theoretically feasible to build a USB 3.0 scope with a 300MS/s rate (30MHz bandwidth?) that sends the data directly to the PC.

Quote: "There's no other benefit to building one as a company, and most users see the lack of a screen as a negative."
This is belied by the fact that there are already dozens of USB scopes - introduced, for the most part, after the 2.0 spec was adopted. And again, I personally don't see the lack of a tiny 6" screen as a negative if I'm looking at my real-time waveforms on a 24" monitor. But, hey, that's just me.
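For what it's worth, the bandwidth arithmetic above (5Gb/s link, roughly half usable, ~300MS/s) holds up as a back-of-the-envelope calculation. The 50% overhead figure and 8 bits per sample are assumptions:

```python
# Back-of-the-envelope check of USB 3.0 streaming capacity.
# Assumptions: roughly 50% of the raw 5 Gb/s link is usable after
# 8b/10b encoding and protocol overhead, and samples are 8 bits each.

RAW_LINK_GBPS = 5.0        # USB 3.0 SuperSpeed signalling rate
EFFECTIVE_FRACTION = 0.5   # assumed usable fraction after overhead
BITS_PER_SAMPLE = 8        # assumed 8-bit ADC samples

effective_gbps = RAW_LINK_GBPS * EFFECTIVE_FRACTION       # 2.5 Gb/s
samples_per_sec = effective_gbps * 1e9 / BITS_PER_SAMPLE  # ~3.1e8
print(f"~{samples_per_sec / 1e6:.0f} MS/s sustainable")   # ~300 MS/s class
```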
I recently got a 7102V and I'm satisfied. So far I haven't seen any of the previously reported problems, like the backlight issue or channel-one noise, but I'll keep checking. Firmware version is 2.6.1.
Could someone please tell me whether there is a latest release/model/version of the SDS1702 to look for when buying? I would like to avoid buying old stock, if at all possible.
The newer ones come with LAN, right?
When we say 'LAN option', does this mean an Ethernet port, like on a printer, so that a PC can connect to it?