No, this is totally false, and the point you are making is actually backwards. You may or may not have better clarity of vision than average; that is a different measure. Having 20/15 vision just denotes that you are a bit farsighted, which makes you worse at appreciating the resolution of a cell phone or computer screen than someone with 20/20 vision.
I have never taken the American-style test, so maybe I made a wrong translation. The Chinese standard visual acuity tests are designed to test angular resolution directly, and a higher angular resolution corresponds directly to a higher score. My raw score is 1.5 in both eyes; for reference, 0.8-1.2 is usually considered "good" and 1.5+ "excellent."
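If I'm reading the two scales right, the decimal score is just the Snellen fraction evaluated as a number (20/20 = 1.0, 20/15 ≈ 1.33), so a rough conversion looks like this (a quick sketch, not an authoritative chart):

```python
# Rough conversion from decimal acuity (as used on Chinese charts) to the
# American-style Snellen notation, assuming decimal score = 20 / denominator.
def decimal_to_snellen(decimal_score: float) -> str:
    return f"20/{20 / decimal_score:.0f}"

print(decimal_to_snellen(1.0))  # 20/20
print(decimal_to_snellen(1.2))  # 20/17
print(decimal_to_snellen(1.5))  # 20/13
```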
In practice, jitter is caused by unexpected delays in scene generation, typically because of storage I/O. The frame will eventually be displayed, but much later than the point in time it was supposed to represent. The study I linked to shows that even 50ms (one twentieth of a second), the smallest jitter they tested, measurably affects player results.
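To put 50ms in perspective, that is several whole refresh intervals arriving late. A quick back-of-the-envelope (my numbers, not the study's):

```python
# How many refresh intervals a storage stall covers at a given refresh rate.
# Illustrative arithmetic only; the 50 ms figure comes from the study above.
def frames_late(stall_ms: float, refresh_hz: float) -> float:
    return stall_ms / (1000.0 / refresh_hz)

print(frames_late(50, 60))   # 3.0 -> three full frames late at 60 Hz
print(frames_late(50, 144))  # 7.2 -> over seven frames late at 144 Hz
```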
Some game developers actually do handle that, but by eating system memory. When I load up a scene in Cities: Skylines on my workstation, I usually expect ~80GB of memory usage out of my 128GB, since the game is programmed to use up to a certain percentage of system memory; because I have that much, it just dumps a whole pile of decompressed assets into main memory to combat jitter. Too bad most games are built to run on 16GB or even 8GB of system memory, so jitter becomes unavoidable when storage cannot keep up.
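The pattern is basically a decompressed-asset cache capped at a fraction of physical RAM. A minimal sketch of the idea (this is not the game's actual code; load_and_decompress and the 60% budget are made-up placeholders, and the RAM query is POSIX-only):

```python
import os
from collections import OrderedDict

# Decompressed-asset cache capped at a fraction of physical RAM (POSIX-only query).
TOTAL_RAM = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")
CACHE_BUDGET = int(TOTAL_RAM * 0.6)  # hypothetical: use up to 60% of system memory

def load_and_decompress(path: str) -> bytes:
    # Stand-in for the real slow path: a cold read from storage
    # (actual decompression omitted for brevity).
    with open(path, "rb") as f:
        return f.read()

class AssetCache:
    def __init__(self, budget: int = CACHE_BUDGET):
        self.budget = budget
        self.used = 0
        self.entries = OrderedDict()  # path -> bytes, kept in LRU order

    def get(self, path: str) -> bytes:
        if path in self.entries:
            self.entries.move_to_end(path)  # mark as recently used
            return self.entries[path]
        data = load_and_decompress(path)    # the jitter-prone path: storage I/O
        # Evict least-recently-used assets until the new one fits the budget.
        while self.entries and self.used + len(data) > self.budget:
            _, evicted = self.entries.popitem(last=False)
            self.used -= len(evicted)
        self.entries[path] = data
        self.used += len(data)
        return data
```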
This can be tested, though, by comparing a mechanical hard drive under slight load (a lot of seek and load time) against an idle NVMe SSD (virtually no seek time and extremely fast loads), and the comparison can even be carried out on the same computer.
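Something as crude as timing the same cold read from each drive already gives a rough per-device number. A sketch with placeholder paths; for a fair comparison the files should be identical copies and the OS page cache dropped between runs (on Linux: echo 3 > /proc/sys/vm/drop_caches):

```python
import time

def time_read(path: str) -> float:
    # Time a single sequential read of the whole file.
    start = time.perf_counter()
    with open(path, "rb") as f:
        f.read()
    return time.perf_counter() - start

print("HDD :", time_read("/mnt/hdd/scene_assets.bin"))   # placeholder path
print("NVMe:", time_read("/mnt/nvme/scene_assets.bin"))  # placeholder path
```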
No, it's not a limit, it's average vision. It was something I was told a long time ago, and it was about the ability to distinguish the dots, not necessarily to focus on them. I suppose another way of expressing it is the shortest break in a line that one can clearly see. So at 4 PPmm I think we are fairly safe to say that this is good resolution and anything much beyond it is just an overuse of technology.
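For reference, 4 PPmm is a 0.25mm pixel pitch (about 102 PPI), and whether that is "enough" depends on viewing distance. Taking the common one-arcminute rule of thumb for average acuity, a quick calculation gives the distance at which one pixel drops below that threshold:

```python
import math

# At what viewing distance does a 0.25 mm pixel (4 px/mm) subtend one
# arcminute, the usual rule of thumb for average (20/20) acuity?
pixel_pitch_mm = 1 / 4
one_arcmin_rad = math.radians(1 / 60)

distance_mm = pixel_pitch_mm / one_arcmin_rad
print(f"{distance_mm:.0f} mm")  # ~859 mm, i.e. roughly 86 cm away
```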
Then explain to me why I can distinguish two dots 0.08mm apart from 15cm away, with perfect focus and all. And if focus is not needed, the dots could be even tighter?
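For the record, here is what that separation works out to as an angle (assuming the 0.08mm is centre-to-centre spacing):

```python
import math

# Angular separation of two dots 0.08 mm apart, viewed from 150 mm,
# assuming 0.08 mm is the centre-to-centre spacing.
theta_rad = math.atan2(0.08, 150)
print(f"{math.degrees(theta_rad) * 60:.2f} arcmin")  # ~1.83 arcminutes
```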
I used to develop iOS apps, and it was common for me to work at the pixel level on a Retina screen. Even in those conditions I can still see the pixels perfectly. Call me a trained eye if you want, but since cutting-edge display technologies are for trained eyes anyway, there is a reason to go above and beyond. You may say that type of monitor is excessive, but for me it can be the bare minimum to work with, so I can see the app UI 100% pixel-perfect and still get about the same physical size.