The value you see in a datasheet is not the value your particular, single op-amp has. If it were, it would mean the manufacturer intentionally makes your op-amp exactly that bad.

Instead, the datasheet gives you statistical properties of the manufacturing process. Conceptually it's the same as with resistor tolerance, except that instead of a ±percentage of ohms you get ±volts. You treat it the same way in design, including accepting that the value may change further with temperature or drift as the component ages.
If you want a deeper explanation: production is imprecise. The actual parameters land all over a range. The manufacturer samples the process and gets a probability distribution of values; the datasheet then reports the shape of that distribution. Usually a normal distribution with µ = 0 mV is assumed, so that part is implied, and the datasheet reports the spread. The "TYP(ical)" ±value is the range that 68.27% of units fall into, i.e. ±1σ. The "MAX(imum)" ±value is unfortunately manufacturer-specific (1), but the interpretation is "being outside that range is rare enough that you should count it the same as buying a broken part".
If a normal distribution doesn't approximate reality well enough, which may matter for precision parts, you also get a chart depicting the actual sample distribution, so you can judge for yourself what to expect.
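The TYP/MAX interpretation above is easy to sanity-check numerically. A minimal sketch, assuming a hypothetical part whose datasheet lists V_os TYP = ±1 mV and treating TYP as ±1σ of a zero-mean normal distribution (the numbers are illustrative, not from any real datasheet):

```python
import random

TYP_MV = 1.0  # hypothetical datasheet TYP, taken as one standard deviation (mV)

# Simulate a production run: each part's offset is one draw from N(0 mV, TYP_MV)
random.seed(42)
offsets = [random.gauss(0.0, TYP_MV) for _ in range(100_000)]

# Fraction of parts whose offset lands inside the +/-TYP window: ~68.27%
within_typ = sum(abs(v) <= TYP_MV for v in offsets) / len(offsets)
print(f"within +/-{TYP_MV} mV: {within_typ:.1%}")

# If this manufacturer's MAX happened to mean +/-3 sigma, it would cover ~99.7%
within_max = sum(abs(v) <= 3 * TYP_MV for v in offsets) / len(offsets)
print(f"within +/-{3 * TYP_MV} mV: {within_max:.1%}")
```

Running it shows why a single bought part can sit almost anywhere inside the MAX window while most parts cluster near zero.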
To sum it up, if a datasheet reports 1 mV input offset, it means your particular part has its individual input offset somewhere between -1 mV and 1 mV. It may be -0.8 mV, may be 0.4 mV, may happen to be 0.007 mV.
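In a design, "treat it like resistor tolerance" means budgeting with the MAX value, since the offset is amplified along with the signal. A minimal sketch, with a hypothetical non-inverting amplifier and a hypothetical ±3 mV MAX spec (both numbers made up for illustration):

```python
# Non-inverting amplifier: the input offset voltage appears at the output
# multiplied by the noise gain, 1 + Rf/Rg.
RF = 99_000.0   # hypothetical feedback resistor (ohms)
RG = 1_000.0    # hypothetical gain-setting resistor (ohms)
noise_gain = 1.0 + RF / RG          # = 100

VOS_MAX_MV = 3.0                    # hypothetical datasheet MAX offset (mV)

# Worst-case DC error at the output: budget with MAX, not TYP
worst_case_output_error_mv = noise_gain * VOS_MAX_MV
print(worst_case_output_error_mv)   # 300.0 (mV)
```

A part you actually buy will usually do much better than that, but the worst case is what the design has to survive.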
(1) May be ±3σ (99.7%), may be wider, may be the threshold at which binning/QA discards parts, may be "not seen in a sample".