A given chip will give consistent operation (low jitter); the spread in the spec reflects manufacturing variation from part to part.
I don't know what statistics are used; I might guess about a 3-sigma cutoff, so you're very likely to get a typical part. (Note that the statistics of this particular parameter aren't going to be Gaussian, because the time certainly can't be negative. Likely it's a Gaussian warped by some physical process, so it has a minimum cutoff of so-and-so, a typical value somewhat higher, and a long tail that is truncated at the max during production testing.)
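To make that picture concrete, here's a quick sketch of that sort of distribution. All the numbers are made up for illustration -- they're not from any datasheet -- and the model (a Gaussian clipped at a physical floor and truncated at the spec max by test rejects) is just the guess described above:

```python
import random

# Hypothetical numbers for illustration only -- not from any datasheet.
PHYSICAL_MIN_NS = 2.0   # the delay can never be below this (physics)
TYPICAL_NS      = 5.0   # center of the manufacturing distribution
SIGMA_NS        = 1.0   # spread over process variation
SPEC_MAX_NS     = 8.0   # parts slower than this are rejected at test (~3 sigma)

def sample_shipped_part():
    """Draw delays until one passes the production test.

    The raw process spread is roughly Gaussian, but it is warped by a
    hard physical floor and truncated on the high side by test rejects,
    so what actually ships is not Gaussian.
    """
    while True:
        delay = max(PHYSICAL_MIN_NS, random.gauss(TYPICAL_NS, SIGMA_NS))
        if delay <= SPEC_MAX_NS:
            return delay

delays = [sample_shipped_part() for _ in range(10_000)]
assert all(PHYSICAL_MIN_NS <= d <= SPEC_MAX_NS for d in delays)
```

The point is just that every part you receive sits somewhere in [min, max], with most of them clustered near typical.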
Ideally, you should design your system so that even if a single chip malfunctioned in a way that gave it extreme cycle-to-cycle jitter, yet still met its timing spec, the system would still work. Clocked state-machine logic has this property, which is part of the reason it's so popular. (The other being that it's easy to synthesize!) Such designs aren't always possible (many systems require controlled jitter, high frequency purity, that sort of thing -- especially cutting-edge signal processing systems), but it helps whenever possible.
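You can see that jitter-immunity property in a toy simulation. The sketch below (my own illustration, with a made-up settling time) clocks a simple counter with two clocks: one clean, one with wild cycle-to-cycle jitter. As long as every period exceeds the logic's worst-case settling time, the state sequence depends only on the number of edges, not on when they arrive:

```python
import random

SETTLE_NS = 4.0  # worst-case logic settling time (hypothetical number)

def run_counter(periods_ns):
    """Clock a 3-bit counter once per period; fail if a period is too short."""
    state = 0
    for p in periods_ns:
        if p < SETTLE_NS:          # timing spec violated -> all bets are off
            raise RuntimeError("setup violation")
        state = (state + 1) % 8    # next-state logic, sampled on the edge
    return state

# Two clocks: one clean, one with heavy cycle-to-cycle jitter,
# but both always meeting the settling-time spec.
clean  = [10.0] * 100
jitter = [random.uniform(SETTLE_NS, 20.0) for _ in range(100)]

assert run_counter(clean) == run_counter(jitter)  # same final state
```

That's the whole trick: synchronous design turns an analog timing question into a yes/no question ("did every period meet the spec?"), and if yes, the jitter is invisible to the logic.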
Tim