Do you believe that an 8 bit PIC microcontroller that runs with a faster clock rate would be more susceptible to noise problems than one running at a lower clock speed?
Yes, but only significantly when running close to the upper clock limit (where voltage and timing tolerances are tighter).
Our thinking is that at a higher clock rate there are more clock edges per second, and therefore more chances for a narrow noise pulse to corrupt an instruction and crash the software. Is that right?
The chance of a temporary glitch affecting operation at a critical time (e.g. when data is sampled at the end of a bus cycle) is lower if the noise pulses are infrequent and short relative to the clock period. But a slower clock does nothing to reduce permanent corruptions (e.g. bits flipped in RAM) at other times. And if the MCU has a particular job to do that takes a fixed number of clock cycles, and sleeps the rest of the time, there is no advantage in lowering the clock speed: the lower speed just exposes it to noise for a longer time.
Is it true that all microcontrollers should be run at the lowest possible clock rate, in order to reduce the chances of noise susceptibility?
No. Noise should be eliminated before it gets into the chip, and merely reducing glitches isn't enough. The device should be designed to never malfunction, and to recover gracefully in the (very unlikely) event that it does. If actual glitching is detected then you have a problem that needs to be addressed; lowering the clock speed until the glitches (hopefully) disappear is not a solution.
An MCU with internal ROM and RAM should be practically immune to outside interference, provided that the power supply is stable and pin voltages are kept within spec. This seems to be true in practice: MCU manufacturers do not recommend lower clock speeds for noisy environments, and they would if it were a problem, because reliable operation is a top priority for embedded systems (much more so than for, e.g., a gaming PC, which is run at the ragged edge to get the best possible performance).