Using the diff probe as an example:
Say you have two voltage sources, one 1000v and one 1500v. Both have their negative terminals connected together (in common).
Connect one probe lead to each of the positive terminals of the voltage sources.
Then you would read a 500v difference between the probe leads. This is the differential voltage.
If there were no connection whatsoever between the common point of the voltage sources and anything else (especially your measuring tools), then the differential voltage would tell the whole story.
If the common point of those voltage sources is shared with the same ground as the scope (mains ground plug, and thus the shell of your scope's BNC connector), then the scope side of the diff probe would see this. It would see 1000v between one of its leads and ground, and 1500v between the other lead and ground. The common mode voltage would be 1250v (the average voltage to ground for the two leads).
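The arithmetic can be sketched in a few lines (the function names are my own, purely for illustration): the differential voltage is the difference between the two lead voltages, and the common mode voltage is their average relative to ground.

```python
# Illustrative sketch of the probe arithmetic; each lead voltage
# is measured relative to the shared ground.

def differential(v_a, v_b):
    """Voltage between the two probe leads."""
    return v_b - v_a

def common_mode(v_a, v_b):
    """Average of the two lead voltages relative to ground."""
    return (v_a + v_b) / 2

# The example above: leads at 1000 V and 1500 V relative to ground.
print(differential(1000, 1500))  # 500 V differential
print(common_mode(1000, 1500))   # 1250 V common mode
```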
So, why does it matter?
You could have a modest differential voltage of 10v, between one 10000v source and another 10010v source. The differential voltage itself would cause no problem for your diff probe. But if the scope side of the diff probe shares a ground with the common point of the voltage sources, then the diff probe now sees roughly 10000 volts between each lead and ground. This might destroy the probe.
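This is why probe datasheets quote separate differential and common-mode ratings, and why both need checking before you connect anything. A minimal sketch of that check, with made-up rating numbers (not from any real probe):

```python
# Hedged sketch: checking a hypothetical probe's ratings before connecting.
# The two limits below are invented for illustration only.
MAX_DIFFERENTIAL = 1500  # V, hypothetical probe spec
MAX_COMMON_MODE = 600    # V, hypothetical probe spec

def probe_safe(v_a, v_b):
    """True only if both the differential and common-mode ratings hold."""
    v_diff = abs(v_b - v_a)
    v_cm = (v_a + v_b) / 2
    return v_diff <= MAX_DIFFERENTIAL and abs(v_cm) <= MAX_COMMON_MODE

# Only 10 V differential, but ~10005 V common mode: far past the rating.
print(probe_safe(10000, 10010))  # False
```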
Besides safety, you could be dealing with a low-voltage differential sensor (like a bridge-type strain gauge) and a differential amplifier. You read the sensor by measuring the (small) voltage difference between two outputs that both sit about 5 volts above ground (e.g. 5.01 volts vs 5.02 volts). Your differential amplifier shares that ground reference, and it may be less accurate at telling the difference between 5.01 and 5.02 than between 1.01 and 1.02. The amplifier's specifications will tell you how it performs at various common mode voltages.
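The spec in question is usually the common-mode rejection ratio (CMRR). A rough sketch of how it turns common-mode voltage into measurement error; the 80 dB figure is an assumed example, not taken from any real part:

```python
# Sketch: how common-mode voltage degrades accuracy via CMRR
# (common-mode rejection ratio, quoted in dB on datasheets).
# The 80 dB value below is an assumption for illustration.

def input_referred_error(v_cm, cmrr_db):
    """Apparent differential error caused by the common-mode voltage."""
    return v_cm / (10 ** (cmrr_db / 20))

# Bridge outputs at 5.01 V and 5.02 V:
# 10 mV differential signal, ~5.015 V common mode.
v_cm = (5.01 + 5.02) / 2
err = input_referred_error(v_cm, 80)
print(f"{err * 1000:.2f} mV error on a 10 mV signal")
```

With the common mode down at ~1.015 V (the 1.01 vs 1.02 case), the same formula gives roughly a fifth of the error, which is why the lower common-mode voltage reads more accurately.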
edit: minor correction based on Tim's answer