Bear in mind that, when considering signal integrity, it's always the edge rate that matters, not the clock frequency. This point isn't taught particularly well, nor emphasized often or clearly enough.
It may well be the case that an output driving a 10 MHz signal has slower edges than one driving 100 MHz, but not necessarily. If an I/O pin has a rise/fall time of, say, 2 ns, it can easily require the same SI treatment whether it's carrying a 100 MHz clock or a reset signal that only toggles once.
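To put a number on that, a common rule of thumb is that the bandwidth implied by an edge is roughly 0.35 divided by the 10–90% rise time. Here's a minimal sketch of that estimate (the function name and figures are illustrative, not from the original answer):

```python
def edge_bandwidth_hz(rise_time_s: float) -> float:
    """Rough bandwidth implied by a 10-90% rise time (f ~ 0.35 / t_r)."""
    return 0.35 / rise_time_s

# A 2 ns edge implies on the order of 175 MHz of signal bandwidth,
# regardless of whether the pin toggles at 100 MHz or fires a reset once.
bw = edge_bandwidth_hz(2e-9)
print(f"{bw / 1e6:.0f} MHz")  # → 175 MHz
```

This is why the reset line and the clock line can demand the same layout care: the spectrum your traces have to carry cleanly is set by the edge, not by how often it repeats.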
On the other hand, if a device runs at 10 MHz, chances are its I/O pin drivers will be designed with slower rise/fall times, so it'll be easier to breadboard with and generally more forgiving of a poor PCB layout for that reason.
Check the edge rate, not the switching frequency.