What tolerances do you get with a good sine bar and gauge blocks?
A person who asks this question is not familiar with either the tools or the work practices required to get the best performance from them. I see no benefit in suggesting that the tolerances achievable by a master toolmaker using the best tooling made are relevant to such a question.
My response was based on what I thought I could reasonably expect to achieve with the $500 or so of Chinese tooling I have for doing such a setup. I might well do better, but I'd have to have another way of making the same measurement to have any confidence.
How accurately spaced are the rolls on the sine bar? I don't know.
How accurately round are the rolls? I don't know.
How straight and flat is the sine bar? I don't know.
How parallel are the rolls? I don't know.
How flat is that part of my surface plate? I don't know.
How accurate are these particular gauge blocks? I don't know.
How uniform are the temperatures? I don't know.
Most of the tolerances for the factors I mentioned are 0.0001". Moreover, a sine bar is limited to setting small angles: for a given height error, the angle error grows as the angle gets larger. IIRC my angle blocks are supposed to be good to 5 minutes. Things get very difficult if 1 °C causes significant expansion. They are extremely difficult if 0.01 °C causes significant expansion, as is the case for a ruling engine.
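For a sense of scale, the sine-bar relation h = L sin(θ) can be turned into a rough error budget. The sketch below assumes a 5" roll spacing and a combined 0.0001" height uncertainty; both numbers are assumptions for illustration, not measured values for any particular bar:

```python
import math

SINE_BAR_LENGTH = 5.0   # inches, roll center distance (assumed 5" bar)
HEIGHT_ERROR = 0.0001   # inches, assumed combined bar/block/plate uncertainty

def stack_height(angle_deg):
    """Gauge block stack height needed to set angle_deg on the sine bar."""
    return SINE_BAR_LENGTH * math.sin(math.radians(angle_deg))

def angle_error_arcsec(angle_deg, dh=HEIGHT_ERROR):
    """Angle uncertainty (arc-seconds) caused by a height error dh.
    From h = L*sin(theta): d(theta) = dh / (L*cos(theta)),
    so the same height error costs more angle as theta grows."""
    dtheta_rad = dh / (SINE_BAR_LENGTH * math.cos(math.radians(angle_deg)))
    return math.degrees(dtheta_rad) * 3600

for a in (5, 30, 60, 80):
    print(f"{a:2d} deg: stack {stack_height(a):.4f} in, "
          f"error +/- {angle_error_arcsec(a):.1f} arcsec")
```

The cos(θ) in the denominator is the "additional error term" mentioned above: the same 0.0001" stack error that costs about 4 arc-seconds at 5 degrees costs several times that near 80 degrees.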
As for the topic of this thread, it occurs to me that the hump in the 0.001" increment measurements is likely error in the screw pitch of the micrometer. So I shall investigate that with a more robust setup by changing my initial reference position. I also have a few hundred nanoamperes of noise to investigate and hopefully eliminate.
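The logic of the reference-shift test can be sketched with a toy model: a periodic pitch error is phase-locked to the absolute screw position, so if the starting point is moved, the hump should move with it, whereas an error in the thing being measured would stay put. The 0.025"/rev pitch and the once-per-revolution 0.0001" error amplitude below are both assumptions for illustration:

```python
import math

PITCH = 0.025            # in/rev, typical inch-micrometer thread (assumed)
PITCH_ERR_AMPL = 0.0001  # inches, hypothetical once-per-rev pitch error

def reading(true_pos, ref=0.0):
    """Simulated micrometer reading with a once-per-revolution pitch error.
    The error's phase follows the absolute screw position (ref + true_pos),
    not the displacement being measured from the reference."""
    screw_pos = ref + true_pos
    return true_pos + PITCH_ERR_AMPL * math.sin(2 * math.pi * screw_pos / PITCH)

# Measure the same 0 .. 0.010" displacement from two reference positions.
# If the error pattern shifts with the reference, it lives in the screw.
for ref in (0.0, 0.0125):
    errs = [reading(i * 0.001, ref) - i * 0.001 for i in range(11)]
    print(f"ref {ref:.4f}:", " ".join(f"{e * 1e4:+.1f}" for e in errs))
```

With a half-revolution shift of the reference the simulated hump inverts, which is the signature that would point at the screw rather than at the sensor under test.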
The optical windows appear to be clear epoxy fill and are anything but flat. So another experiment will be to lap the face of a sensor flat and test it. Figuring out a way to do that will take quite a bit of thought; the sensors are quite difficult to handle because of their small size.