Johenix confused the explanation of the standard resistance ranges and the tolerance. They are two different things.
Just to be clear, a 1% resistor is usually better than 1% accurate. Typically most of the resistors will be within 0.5% of the nominal value.
Accuracy doesn't always take long-term drift into account, so you can have a 1% resistor that is within 1% at the time of manufacture, but after 2000 hours of use it may no longer be within 1%. So it is important to look at the long-term stability spec as well as the accuracy spec.
It is true that there is not much point in making the full E96 range of 20% resistors, but the selection of the resistance values is a choice for the manufacturer, and is not dictated by the tolerance.
For example, if you look at the very expensive, very accurate resistors (0.005%, say), they are often made in relatively few standard values. The reason is that the customers often need very precise values (like 3,459.3 K), and so they often order the exact custom value they require rather than values from a standard range.
Johenix is correct in that the standard values of the E12, E24, E96, etc. ranges are approximately logarithmically spaced, but if you do the calculations you will see that some of the values are a bit out - probably because someone thought the calculated value wasn't as useful as a nearby value.
Theoretically, the E12 range should be 1, 1.2, 1.5, 1.8, 2.2, 2.6, 3.2, 3.8, 4.6, 5.6, 6.8, and 8.3.
However, the actual values are 1, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, and 8.2.
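If you want to check this yourself, here is a quick sketch (my own illustration, not from the original discussion) that computes the ideal logarithmic E12 values as 10^(k/12) rounded to two significant figures, and compares them with the standardized series:

```python
# Compare the mathematically ideal E12 values with the actual
# standardized E12 series. The E-series spaces 12 values per decade,
# so the ideal k-th value is 10**(k/12).

ideal = [round(10 ** (k / 12), 1) for k in range(12)]
actual = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

for i, a in zip(ideal, actual):
    note = "" if i == a else "  <-- standardized value differs"
    print(f"ideal {i:4}   actual {a:4}{note}")
```

Running this shows exactly the five spots where the standardized series departs from the pure logarithmic spacing (2.7 vs 2.6, 3.3 vs 3.2, 3.9 vs 3.8, 4.7 vs 4.6, and 8.2 vs 8.3).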
Richard