Data acquisition and USB 3.0

Even with current DSOs it is no problem to use your PC for additional processing and analysis. Even the cheap Chinese DSOs these days have USB interfaces, so you can get the data onto the PC.

Mid-range and high-end DSO vendors have offered PC interfaces for ages. I regularly import data from my DSO into GNU Octave for further processing. High-end vendors even supply their own MATLAB oscilloscope interfaces, e.g. Tek and LeCroy. AFAIR Agilent even has some instruments with a built-in PC board and MATLAB running inside the instrument.
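The typical workflow is just exporting a waveform from the scope and loading it on the PC. As a minimal sketch (in Python rather than Octave; the two-column time/voltage CSV layout is an assumption, as every vendor's export format differs):

```python
import csv

def load_waveform(path):
    """Load a hypothetical (time, voltage) CSV exported from a DSO.

    Header rows and malformed lines are skipped; real exports often
    carry vendor-specific preamble lines that this silently ignores.
    """
    times, volts = [], []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            try:
                t, v = float(row[0]), float(row[1])
            except (ValueError, IndexError):
                continue  # header line or malformed row
            times.append(t)
            volts.append(v)
    return times, volts
```

From there, FFTs, filtering and measurements are a few library calls away, which is exactly the appeal of pulling the data off the instrument.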

--- Quote from: OLSA on January 17, 2010, 01:43:05 pm --- (where is only analog part
--- End quote ---

Only? ONLY? The analog part is the hard part! The (digital) processing stuff is child's play compared to building a good analog frontend.

And "good" means not only 200 MHz or 500 MHz bandwidth for your 2 GHz sampler (good luck with that), but also 300 V CAT II safety (as opposite to a 5 V input range and a mayor fire incident at 10 V), linearity, defined input impedance, temperature stability, low aging, etc.

What people just don't understand when looking at the cost of a DSO is that you don't pay for the parts. It is not the (color) LCD screen, the knobs on the front panel or the CPUs and FPGAs inside that make them expensive. Oscilloscope makers can get that stuff rather cheap (not what you would have to pay at qty 1). It is the design effort that went into it.

Every now and then you see some company coming up with a DIY DSO kit. All I have seen were junk as DSOs. Nice if you want some soldering practice, but utter crap as a measurement instrument: not enough bandwidth, a cheap op-amp as the analog front-end, and no or barely any input protection.

Oh, and good luck with calibrating your 2 GHz sampler. How many k$ (kilo-Dollars) are you willing to invest in measurement and calibration equipment?

Hi all,

I would never use a scope based on a PC or Mac, etc. You risk losing too much data due to the OS interfering with the timing. Even if you set every other event in the PC to a lower priority and made the data capture the only highest-priority task, you would still have problems.

The BEST way to do this is to have a DEDICATED unit that captures the data and THEN transfers it to a PC.

There was a thread a couple of days ago in which we discussed isolation issues. What happens with a device that has, say, 240 V applied to the input? Your external "adapter" would need very good isolation, and preferably its own grounding separate from the host computer, to stop your expensive PC / laptop going up in smoke.

5 Gb/s of bandwidth is not that much in oscilloscope terms. Even assuming 100% payload efficiency, 5 Gb/s translates to 625 megasamples per second at 8 bits of resolution. In practice it will be much less.

Usually it is the physical signalling rate that is touted, which in reality turns out to be quite a bit less in payload terms. I have not studied how the data is actually encoded on SuperSpeed USB, but I suspect they use some kind of 8b/10b encoding, where 8 payload bits are encoded as 10 line bits. That encoding is needed to remove the DC content and make the receiver's job of recovering the data clock easier. That alone reduces the bandwidth by 20%. So 5 gigabits per second of line signalling rate becomes 4 gigabits per second of actual data, which is equivalent to 500 megasamples per second at 8 bits.
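The arithmetic above can be checked in a few lines (assuming 8b/10b encoding as the post does, and ignoring packet/protocol overhead on top of it):

```python
# Rough USB 3.0 throughput estimate: line rate -> usable sample rate.
LINE_RATE_BPS = 5e9      # SuperSpeed signalling rate
PAYLOAD_BITS = 8         # 8b/10b: 8 payload bits ...
LINE_BITS = 10           # ... per 10 bits on the wire
BITS_PER_SAMPLE = 8      # 8-bit ADC samples

payload_bps = LINE_RATE_BPS * PAYLOAD_BITS / LINE_BITS  # 4 Gb/s
samples_per_s = payload_bps / BITS_PER_SAMPLE           # 500 MS/s

print(f"payload: {payload_bps / 1e9:.1f} Gb/s")
print(f"sample rate: {samples_per_s / 1e6:.0f} MS/s")
```

This confirms the 500 MS/s single-channel ceiling; real framing and flow-control overhead would push it lower still.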

And remember you want to use more than one channel. And you would want the computer to sustain that data rate. The problem with a general-purpose computer is that although it has pretty good computation throughput, the bottleneck is still I/O and latency. A general-purpose processor is just not very fast at switching tasks. It is pretty much like an aircraft carrier: it can do a lot of things, but it surely does not maneuver very fast. You would also need some kind of software real-time oversampling to avoid trigger jitter.
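Splitting the budget across channels makes the picture worse. A toy calculation, starting from the 4 Gb/s payload figure in the post (protocol overhead beyond 8b/10b still ignored):

```python
# Per-channel sustained sample rate when a 4 Gb/s USB 3.0 payload
# is shared across scope channels at 8 bits per sample.
PAYLOAD_BPS = 4e9
BITS_PER_SAMPLE = 8

for channels in (1, 2, 4):
    ms_per_s = PAYLOAD_BPS / BITS_PER_SAMPLE / channels / 1e6
    print(f"{channels} ch: {ms_per_s:.0f} MS/s each")
```

So a four-channel instrument streaming continuously over USB 3.0 would be limited to roughly 125 MS/s per channel, before any real-world overhead.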

Getting a decent frequency response at the ADC is one problem, but I think it is not the biggest one if you are familiar with microwave design methods. With only near-DC experience, it can seem quite mysterious.

However, all the really high-end scopes are PC-based (mostly Windows XP) computers with specialized data acquisition hardware. I think they still use specialized hardware for the acquisition to handle a total of tens of gigasamples per second (say, 4 channels @ 20 gigasamples per second @ 8 bits is 640 gigabits per second, which is what our 6 GHz Agilent DSO at work is capable of).
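The aggregate figure quoted above checks out, and makes the gap to any general-purpose bus obvious:

```python
# Aggregate acquisition rate of a 4-channel, 20 GS/s, 8-bit scope,
# matching the figure quoted in the post.
channels = 4
samples_per_s = 20e9
bits_per_sample = 8

total_bps = channels * samples_per_s * bits_per_sample
print(f"total: {total_bps / 1e9:.0f} Gb/s")  # 640 Gb/s
```

That is over a hundred times the raw USB 3.0 line rate, which is why such scopes capture into dedicated deep memory first and only ship processed or decimated data to the PC side.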


You can imagine all you want. In engineering, building is what counts. So go ahead, build it.

What, you want us to build your imagination? Nice try.

