Just how much scope bandwidth is "enough" for working with microcontroller projects? I mainly use ARMs running at around 100 MHz. Most of the peripherals are a lot slower (I2C, SPI, CAN, etc.) however. Sure, I'd love to have a 1 GHz scope, but my hobby budget won't allow it.
Obviously, "enough" will vary from user to user. But if your ARM processors are anything like the ones I work with, that 100 MHz is all internal; these are primarily controllers, after all. I think the fastest signal I've seen on the outside of such a chip has been 50 MHz SPI, which turned out not to actually work (25 MHz was the real max, and the vendor had to issue an erratum for it).
Then it depends on what it is you want to be able to see. If you just want to verify the presence of a signal and its frequency, you may be able to get away with a bandwidth twice as high as the signal itself. If it's just a sine, 1x may be enough, since at the bandwidth limit the amplitude will only have dropped by 3 dB. But if it's a square or pulse wave of some type (which is what you'll be working with 99% of the time), you probably want the edges to be visible, not smeared into something that looks like a sine wave.
In that case, you want at least 5x the signal frequency for a good idea of the shape, but 10x to really be able to evaluate where the edges truly are.
That may be important when tracking down timing issues, where the transition point depends on where something happens on a rising or falling edge. With less than 10x, you'll only be guessing (which sometimes may be good enough, and sometimes, given financial constraints, has to be good enough).
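The reason 5x looks "okay" and 10x looks "right" is that a square wave is built from odd harmonics of the fundamental: a 5x scope passes only the fundamental plus the 3rd and 5th harmonics, while 10x adds the 7th and 9th, which sharpen the edges noticeably. Here's a rough sketch of that effect (my own illustration, not from the discussion above) that rebuilds a square wave from its first few odd harmonics and measures the apparent 10-90% rise time; it treats the scope as an ideal brick-wall filter, which a real front end is not:

```python
import math

def square_wave_partial(t, f0, n_harmonics):
    """Fourier partial sum of an ideal unit square wave:
    (4/pi) * sum of sin(2*pi*k*f0*t)/k over the first n odd k."""
    total = 0.0
    k = 1
    for _ in range(n_harmonics):
        total += math.sin(2 * math.pi * k * f0 * t) / k
        k += 2
    return 4.0 / math.pi * total

def rise_time(f0, n_harmonics, steps=20000):
    """Apparent 10-90% rise time of the reconstructed edge at t=0,
    sampled over one period. Ideal flat-top amplitude is 1.0."""
    period = 1.0 / f0
    ts = [i * period / steps for i in range(steps)]
    ys = [square_wave_partial(t, f0, n_harmonics) for t in ts]
    t10 = next(t for t, y in zip(ts, ys) if y >= 0.1)
    t90 = next(t for t, y in zip(ts, ys) if y >= 0.9)
    return t90 - t10

# For a 1 MHz square wave: 3 harmonics (~5x bandwidth) give a visibly
# slower-looking edge than 5 harmonics (~10x bandwidth).
print(rise_time(1e6, 3), rise_time(1e6, 5))
```

More harmonics always means a shorter apparent rise time, which is exactly why edge-timing measurements want the 10x headroom.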
So the short answer: 10x if you can afford it, and 5x if you can't. Base it on the signals you'll actually be examining, NOT the internal clock speed of the chip.
(Unless it brings out address and data lines to external memories... then we're talking about a different $$ context, though the same rules apply.)
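To put numbers on the rule of thumb, here's a tiny calculator (my own sketch; the 0.35 rise-time constant assumes a single-pole, Gaussian-ish scope response, which is typical but not universal):

```python
def required_bandwidth_mhz(signal_mhz, multiple=5):
    """Minimum scope bandwidth for a square-ish signal, using the
    5x ("see the shape") or 10x ("trust the edges") rule of thumb."""
    return signal_mhz * multiple

def bandwidth_from_rise_time_ns(t_rise_ns):
    """Classic single-pole approximation: BW ~= 0.35 / t_r,
    i.e. BW in MHz ~= 350 / (rise time in ns)."""
    return 350.0 / t_rise_ns

# The 25 MHz SPI case from above:
print(required_bandwidth_mhz(25, 5))    # 125 MHz to see the wave shape
print(required_bandwidth_mhz(25, 10))   # 250 MHz to judge the edges
# A 2 ns edge needs roughly:
print(bandwidth_from_rise_time_ns(2))   # 175.0 MHz
```

Note that the rise-time formula can matter more than the clock rate: a slow clock driven by fast edge-rate outputs still needs the bandwidth to show those edges faithfully.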