...but that said, the update rate on most of the ranges beats the Hantek; the Hantek is appalling most of the time.
While this again is a bit off-topic, I recently looked into scopes in the 1k€-3k€ price range. And I'm kinda depressed right now, since all of them lack at least one of the features that I'd like to have.
Stuff like measuring in the sample data (only for Hameg and LeCroy am I really sure that they use the sample data), statistics (the Tektronix DPO2000 simply has none, others have strange implementations), gating for measurements (no Agilent scope seems to have sensible gating, as they obviously think the zoom window is good enough, and the Hamegs only support a kind of gating for semi-manual measurements), and counting pulses/edges (surprisingly, the Rigol DS4000 seems to miss this completely). Even basic features are surprisingly badly designed. E.g. like the Owon, the Agilent DSOX3000 has no button to switch between auto/normal trigger mode, though this is something you need all the time.
Generally, I'm underwhelmed by the displayed precision of cursor positions and measurements. With 1 GSa/s and above, it should be possible to tell whether a pulse is 100µs or 100.01µs. That's a 10ns difference, and at 1 GSa/s that's 10 points in the sample data. Still, scopes in the <8k€ range seem to cripple their measurements by either measuring in the display data or showing only three or four digits. The best result you can expect is something like 100.0µs, which reduces the resolution to 100ns. Better than nothing, you could say, but still not enough for certain purposes, and I still think a 1 GSa/s scope should let you measure a period down to the sample period (1ns), even if there will be some jitter, of course.
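To illustrate the point about sample-level resolution, here's a small Python sketch. The function names and the four-digit readout model are my own assumptions for illustration, not any scope's actual firmware:

```python
# Sketch: why measuring in the sample data matters.
# A 100.01 us pulse at 1 GSa/s spans 100,010 samples, so counting
# samples between edges resolves the 10 ns difference, while a
# 4-significant-digit readout rounds both pulses to the same value.

SAMPLE_RATE = 1e9  # 1 GSa/s -> 1 ns per sample

def width_from_samples(n_samples):
    """Pulse width measured by counting samples between edges."""
    return n_samples / SAMPLE_RATE

def four_digit_readout(seconds):
    """Mimic a display limited to 4 significant digits (in us)."""
    us = seconds * 1e6
    return float(f"{us:.4g}")

a = width_from_samples(100_000)   # 100.00 us pulse
b = width_from_samples(100_010)   # 100.01 us pulse

print(b - a)                                          # ~10 ns, resolved in sample data
print(four_digit_readout(a), four_digit_readout(b))   # identical on a 4-digit display
```

The sample data distinguishes the two pulses easily; the truncated readout cannot.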
Then again, the SDS lacks all of these features. It's nice for observing the shape of waveforms and you can do rough measurements with it, but for precise measurements it's completely useless. I'm not convinced, though, that any other scope in this price range is really much better in this regard.
I am very, very new to the idea of digital scopes and sampling being used to display a trace, but:
Some rough calcs: with a 1 billion samples per second rate, I'd want, say, 20 samples per period to be able to show a decent curve, and that would mean a max input frequency of 50 MHz. If the curve is more complex, I'd want even more samples per period, meaning less max frequency capability. In that respect, if the scope is being sold as a two-channel 50 MHz scope, I'd think it's about right. At 100 MHz that would leave 10 samples per period on one trace and 5 samples per period for dual trace. Can you really show a trace with 5 or 10 samples over its period? I leave that to the experts to answer.
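The rough calcs above can be sketched like this (the helper name and the even split of the sample rate across channels are my assumptions):

```python
# Samples-per-period arithmetic for an interleaved-ADC scope.

def max_frequency(sample_rate, samples_per_period, channels=1):
    """Highest input frequency that still gets the desired number of
    samples per period, assuming the sample rate is shared evenly
    across the active channels."""
    return sample_rate / channels / samples_per_period

rate = 1e9  # 1 GSa/s
print(max_frequency(rate, 20))              # 50 MHz with 20 samples/period
print(max_frequency(rate, 10))              # 100 MHz with 10 samples/period
print(max_frequency(rate, 10, channels=2))  # 50 MHz when two channels share the ADC
```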
Then there is also the sampling depth, or whatever else we call this: "bits per sample". That means if the screen has 8 vertical divisions, how many bits do we allocate per division? I'd hope we would allocate at least 3 bits per division (0 to 7, i.e. eight distinct levels within each division), making this 3 * 8 = 24 bits per sample. Modern VGA cards work at 32 bits per pixel and at GHz levels, so I do not see a problem here. But something tells me that all these digital scopes do not have 24-bit sampling depth... am I right?
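For what it's worth, the bits-per-division arithmetic works out differently than sketched above: levels multiply across divisions, so the bit counts combine logarithmically rather than adding per division. A quick sanity check (helper names mine):

```python
# Vertical resolution arithmetic: levels multiply, bits add as log2.
# 8 levels/div * 8 divisions = 64 total levels = 6 bits, not 3*8 = 24 bits.
import math

def total_bits(levels_per_div, divisions):
    return math.log2(levels_per_div * divisions)

print(total_bits(8, 8))  # 6.0 bits covers 8 levels in each of 8 divisions

# A typical DSO uses an 8-bit ADC: 256 levels spread over 8 divisions,
# i.e. 32 distinct levels per division.
print(2**8 / 8)          # 32.0 levels per division
```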
I REALLY wish there was competition. I mean, okay, 1 gigasample ... what's the speed of a standard ADC in an off-the-shelf notebook you can find now?
They have 24-bit, 5-gigasample ADCs in there.
AND quad-core 1.4GHz 35W chips ... I mean, what's stopping Agilent from making cheaper stuff? They have HP by their side and a fab of their own (they make the MegaZoom ASICs, so I assume they have their own fab).
I think that DSO prices in general are rather depressing. They certainly seem overpriced compared to other electronic goods, and while you can argue that they are 'test equipment' and point to development costs, to me it seems to have at least as much to do with history and a captive market as anything else: the analog scopes cost X amount, and since these are similar (or better) in capabilities (though cheaper in terms of manufacture, materials and calibration), they should still cost X amount too.
Yep. Modern VGA cards work at 32 bits @ 1680x1050 as I'm typing away ... I'm not too sure if they run at GHz levels or not; at least I know the pixel clock on mine is 550 MHz, but I agree it's GHz levels because of the 2.2 gigapixels/sec fill rate (it's a Mobility HD 5430, anyway).
"They have 24-bit, 5-gigasample ADCs in there."
Where? The only analog input signals in a notebook that come to mind are audio and WiFi, and neither has anywhere near 5 Gb/s of bandwidth.
"AND quad-core 1.4GHz 35W chips ... I mean, what's stopping Agilent from making cheaper stuff?"
As was already answered by 0xdeadbeef: scale. Look at the number of CPUs Intel sells compared to the number of scopes Agilent sells. I can't be bothered to do the research now, but I expect many orders of magnitude of difference.
"I think that DSO prices in general are rather depressing. ... the analog scopes cost X amount - and since these are similar (or better) in capabilities, they should still cost X amount too."
Analog scopes were much more expensive in their heyday. In 1959, the 30 MHz Tektronix 545A cost $1550 without the essential plug-in amplifiers, about $12,000 in 2011 money. In 1977, a 100 MHz Tektronix 465 cost $2145, which would be something like $8000 in 2011 money. In 1993, when DSOs were already available, the 100 MHz Tektronix 2245A (cheaper construction, fewer parts) cost $2595, about $4000 in 2011 money.
"Yep. Modern VGA cards work at 32 bits @ 1680x1050 ... the pixel clock on mine is 550 MHz ... 2.2 gigapixels/sec fill rate."
You're confusing a lot of specs. The pixel clock of a 1680x1050 VGA signal is most likely something like 150 MHz, and it is produced by a DAC, not an ADC. Of those 32 bits per pixel, 8 bits are the alpha channel and are never transmitted to the monitor; the other 24 bits are split across three color channels. Also note that VGA connectors on computers are usually outputs, not inputs. Unless you have a fairly old monitor, chances are you're using a digital (DVI/HDMI/DisplayPort) connection, which does not involve any ADC/DAC at all. 550 MHz is most likely the clock rate of the (digital) GPU, and 2.2 gigapixels (texels?) per second refers to the number of pixels the GPU/graphics memory can process per second.
Yes, but only on the 1K.
You are a hobbyist/semi-professional: if you want a large screen OR hackability, AND PC ctrl/comm is NOT important, get the Hantek.
PC ctrl/comm is luckily not a black hole anymore; we have all the information on how to control these DSOs and how to interpret the data.
I don't think USB 2.0 is what limits the current USB scopes. USB 3.0 is not fast enough for real-time sampling, so you still need the sampling memory.
The Picoscope 3200 series comes close, but costs something like $1000 for 100 MHz BW.
You'll always need buffer memory - but memory is cheap, so what? And it depends what you think 'fast enough' is. USB 3.0 could support real-time sampling rates of 300 MS/s (8-bit) uncompressed - and with aggregation you could likely push it over 1 GS/s. Of course, it depends on clever design and good drivers and software.
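A rough back-of-the-envelope for those USB 3.0 numbers (the protocol-overhead factor is my own guess, not a measured figure):

```python
# USB 3.0 payload bandwidth estimate.

SIGNAL_RATE = 5e9        # USB 3.0 SuperSpeed line rate, bits/s
ENCODING = 8 / 10        # 8b/10b line coding: 8 payload bits per 10 line bits
PROTO_EFFICIENCY = 0.75  # assumed protocol/driver overhead, rough guess

payload_bytes_per_s = SIGNAL_RATE * ENCODING / 8 * PROTO_EFFICIENCY
print(payload_bytes_per_s / 1e6)  # ~375 MB/s -> ~375 MS/s at 8 bits/sample
```

Under those assumptions, streaming around 300 MS/s of uncompressed 8-bit samples fits within the link budget, which is consistent with the figure quoted above.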
One bad pixel?
Not meaning to sound rude, but I am curious why you are even wasting your time trying to chase that down?
.....and you have two good probes now.
What more do you expect?
A new replacement scope? If so, just come out and tell the vendor so, and pack it all back up for a refund. Why drag it out?
ZAP! Why beat around the bush about it with post after post, documenting the perpetrator-dot with very detailed pictures? Get a new scope and be done with it.
Dear pullin-gs,
I am so sorry you had to read my message.
I really don't know why you took the time to respond to it, though, if it wasn't good enough for you!
Oh, I see, you might be "the best message content estimator of this site" ...
For other people:
I reposted cleaner pictures because I received 4 private messages asking about them.
It seems that at least some people care about this kind of silly problem...
I routinely buy 1920x1200 LCDs without ANY bad pixels, and I don't see why they sent me 2 bad probes out of 4; maybe quality management failed somewhere.
So, end of story:
I gave a neutral review with comments to my eBay seller (who is a member here, I think).
I didn't get any response from Owon to my 2nd message.
For now the scope is OK for my needs, and I will buy something better when I have more money.
Yes, but to be fair, you should be comparing it to something like the Agilent DSOX2002A, not the Rigol. Don't just look at its sampling rate; look at its collection of features, triggers, software, etc. It's miles above the Rigol, in my opinion.
Has anyone so far got an opinion on which of these scopes comes with the best probes (I presume I mean lowest capacitance), or with differential probes out of the box, or with the best ability to use a pseudo-differential mode with two channels?
I ask because I have been working with a "low" frequency device recently, 200 kHz, and am finding all sorts of things about my ageing scope and my probes that I had not thought of before.
It may be usable at a lower sampling rate, just like the long-memory option on Rigol scopes, but at that point you're saving on the relatively cheap DRAM, not the expensive RAM that can run at full speed. I'm not sure how much BOM cost this saves, especially if you factor in the premium for USB 3.0 transceivers.
Sure, it has more features, though I'm not sure the Agilent is a fair comparison, since its banner spec is the fast update rate. My point was that this is the cheapest scope made by Picoscope (and I can't think of many cheaper competitors) that exceeds the entry-level scopes, a segment that has been dominated by the Rigol DS1052E/1102E for the past five or so years. And you pay $600 extra for the lack of a screen and controls.