I'd like someone knowledgeable about Picos to explain this... I mean, what good does the 1 GSa/s do if the signal is processed and displayed by the software, which only has access to 15-20 MSa/s?
Of course it can't work that way, so I basically don't understand how a USB scope actually works.
You may not understand how a DSO in general actually works, at least in practice. The scope can acquire and store samples at its advertised rate up to the stated memory depth, and that typically takes a very short time: memory depth divided by sample rate. So for 10 Mpoints at 1 GSa/s, your acquisition time is 10 ms. After the acquisition, the scope gets to work processing and displaying the samples, and depending on the mode it is in, it typically won't start another acquisition until all of that is done. The time it is busy like this is called 'blind time', and it is often very significant. Let's say it takes the scope 90 ms to do all the processing, giving you a blind-time ratio of 90%. You can see that the long-run throughput of samples is actually only 100 MSa/s.
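To make the arithmetic concrete, here is a rough Python sketch using the hypothetical figures above (10 Mpoints, 1 GSa/s, 90 ms of processing); the numbers are illustrative, not specs for any particular scope:

```python
# Back-of-the-envelope DSO throughput, using the example figures above.
sample_rate = 1e9        # 1 GSa/s advertised real-time rate
memory_depth = 10e6      # 10 Mpoints of capture memory
processing_time = 90e-3  # assumed 90 ms to process and display one capture

acquisition_time = memory_depth / sample_rate     # 10 ms to fill the memory
cycle_time = acquisition_time + processing_time   # 100 ms per capture cycle
blind_ratio = processing_time / cycle_time        # 0.9 -> 90% blind
effective_rate = memory_depth / cycle_time        # long-run samples per second

print(f"acquisition time: {acquisition_time * 1e3:.0f} ms")
print(f"blind time:       {blind_ratio * 100:.0f} %")
print(f"effective rate:   {effective_rate / 1e6:.0f} MSa/s")
```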
So in the case of the PicoScope, it normally works the same way: it takes a capture up to its internal memory depth, then gets to work processing it. In this case, the sample memory is transmitted over the USB cable to the PC for processing, and the time that takes is simply added to the blind time. Of course, the faster processing in the PC and the ability of the PicoScope to retrigger right after the data has been sent may offset that somewhat.
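As a rough illustration of how the USB transfer inflates the blind time, here is a sketch with assumed figures (8-bit samples, ~35 MB/s sustained over USB 2.0, and a guess at the PC-side processing time); none of these are PicoScope specifications:

```python
# Estimate the extra blind time contributed by shipping a capture over USB.
memory_depth = 10e6         # samples per capture (as above)
bytes_per_sample = 1        # 8-bit ADC assumed
usb_throughput = 35e6       # bytes/s, assumed sustained USB 2.0 bulk rate
pc_processing_time = 30e-3  # assumed: the PC processes faster than a scope CPU

usb_transfer_time = memory_depth * bytes_per_sample / usb_throughput  # ~286 ms
blind_time = usb_transfer_time + pc_processing_time

print(f"USB transfer:     {usb_transfer_time * 1e3:.0f} ms per capture")
print(f"total blind time: {blind_time * 1e3:.0f} ms per capture")
```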
The much lower continuous sample rate also has an analogue in regular DSOs. They typically have a continuous-sampling (roll mode) ability that displays a moving chart, but at a much lower maximum sample rate than the advertised maximum. Some may also have a digitizing or data-recording function that lets you export continuously, but again always at a much lower sample rate.
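That ceiling follows from the link bandwidth alone; a sketch with an assumed sustained USB 2.0 rate (again an assumption, not a quoted spec) lands in the same ballpark as the 15-20 MSa/s mentioned in the question:

```python
# Sustained streaming rate is capped by the slowest link, assumed here to be USB.
bytes_per_sample = 1       # 8-bit samples assumed
sustained_usb_rate = 20e6  # bytes/s, assumed pessimistic USB 2.0 figure

max_streaming_rate = sustained_usb_rate / bytes_per_sample
print(f"max continuous rate: ~{max_streaming_rate / 1e6:.0f} MSa/s")
```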
The key here is real-time memory access. For a scope to do 1 GSa/s, it has to write at least 1 GB/s (for 8-bit samples) to memory in real time. That just isn't going to happen over a USB cable, or even internally in a PC, without a hardware buffer.
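Comparing the required write bandwidth with what the USB bus can carry in theory (8-bit samples assumed) makes the gap obvious:

```python
# Real-time capture bandwidth versus USB bus limits (theoretical maxima).
sample_rate = 1e9       # 1 GSa/s
bytes_per_sample = 1    # 8-bit samples assumed
required = sample_rate * bytes_per_sample  # bytes/s into capture RAM

usb2_max = 480e6 / 8    # USB 2.0: 480 Mbit/s -> 60 MB/s theoretical
usb3_max = 5e9 / 8      # USB 3.0: 5 Gbit/s -> 625 MB/s theoretical

print(f"needed:  {required / 1e6:.0f} MB/s")
print(f"USB 2.0: {usb2_max / 1e6:.0f} MB/s max")
print(f"USB 3.0: {usb3_max / 1e6:.0f} MB/s max")
```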