No, this is totally false, and if anything the point you are making is backwards. You may or may not have better clarity of vision than average; that is a different measure. Having 20/15 vision just denotes that you are a bit farsighted, which makes you worse at appreciating the resolution of a cell phone or computer screen than someone with 20/20 vision.
In practice, jitter is caused by unexpected delay in scene generation, typically because of storage I/O. That scene will eventually be displayed, but much later than the point in time it was supposed to represent. The study I linked to shows that even 50ms (one twentieth of a second), the smallest jitter they tested, affects player results.
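To make that concrete, here is a minimal sketch (the frame times are made up for illustration, not taken from the study; only the 50ms threshold echoes their smallest tested jitter) of measuring how late each frame is presented relative to the moment it was supposed to represent:

```python
# Minimal sketch of frame-time jitter: how late each frame is presented
# relative to the instant it was supposed to represent. The timestamps are
# made up; only the 50 ms threshold echoes the study's smallest tested jitter.
EXPECTED_FRAME_MS = 1000.0 / 60.0          # 60 Hz target, ~16.7 ms per frame

# Simulated presentation timestamps (ms); the fourth frame stalls on storage I/O.
presented_at = [0.0, 16.7, 33.4, 110.0, 126.7, 143.4]

for i, t in enumerate(presented_at):
    expected = i * EXPECTED_FRAME_MS        # when the frame should have appeared
    delay = t - expected
    flag = "  <-- jitter at or above the 50 ms tested level" if delay >= 50.0 else ""
    print(f"frame {i}: expected {expected:6.1f} ms, shown {t:6.1f} ms, late by {delay:5.1f} ms{flag}")
```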
No, it's not a limit, it's average vision. It was something I was told a long time ago, and it was about the ability to distinguish the dots, not necessarily focus on the dots. I suppose the other way of expressing it is the shortest break in a line that one can clearly see. So at 4 PPmm (pixels per mm) I think we are fairly safe to say that that is a good resolution, and much more is just an overuse of technology.
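For a rough sanity check of that figure (the viewing distance below is my assumption, not something from the thread), here is what 4 pixels per mm works out to against the commonly quoted ~1 arcminute acuity:

```python
import math

# Back-of-the-envelope check: what does 4 pixels/mm subtend at a desktop
# viewing distance? The 500 mm distance is an assumption for illustration.
viewing_distance_mm = 500.0
pixels_per_mm = 4.0

pixel_pitch_mm = 1.0 / pixels_per_mm                       # 0.25 mm per pixel
pixel_angle_arcmin = math.degrees(math.atan(pixel_pitch_mm / viewing_distance_mm)) * 60.0

print(f"pixel pitch: {pixel_pitch_mm:.2f} mm")
print(f"one pixel subtends ~{pixel_angle_arcmin:.2f} arcmin at {viewing_distance_mm:.0f} mm")
# ~1.7 arcmin at 50 cm; the commonly quoted acuity figure is ~1 arcmin,
# which a 0.25 mm pixel only drops below at roughly 86 cm or further.
```

So whether 4 px/mm is "plenty" depends quite a bit on how far back you sit.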
Yes, you can gawp at a single pixel if you like. But if you are reading text, 4K is plenty to give enough smoothness to trick your eye, as you are not dwelling on individual pixels but on the overall image. For moving imagery you stand no chance of pinpointing a single pixel even at full HD.
What are you on about? What pixel errors? So you go through every possible frame of a video game to check for "bad pixels"? Give it a rest.
Dead or always lit pixels on a display are noticeable/irksome because they cause a continuity error, something that the perception part of our brain is wired to auto-highlight.
Similarly, being able to detect single-pixel continuity errors is a different thing from actually perceiving detail at that resolution. Different parts of the brain are involved.
Quote: but one mouse has a 20ms input lag
We're not talking input lag (and I chopped some discussion of why not from here - happy to post it if you're bored). But if we were, I think you've missed that these would be separate machines and, hence, not synced to each other. As we saw from the previously posted paper, a typically useful (for testing) game has a 10ms tick, and a 60Hz refresh gives less jitter than that (I assume - anyone disagree?). That's ignoring stuff like propagation delay and round-trip times, etc.
A 10Hz display, on average, will take longer to display the appropriate action on screen compared to a 100Hz display, no?
I'm trying to make the comparison as simple as possible.
Don't make it so simple that important factors are left out. For instance, the action tick is assumed to be much faster than the display refresh, but that's not necessarily the case. If you can use a silly 10Hz value for display refresh, we can have a silly 100ms value for action tick to match. In that case, the 100Hz monitor doesn't give you much (kind of - action at the start of a scan might appear to occur sooner than a parallel action at the end, but that also assumes a linear display scan that lasts for the refresh period).
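To put some rough numbers behind that (this is my own back-of-the-envelope simulation, not anything from the paper; all values are illustrative), here is a sketch of the average delay from an action happening to the first refresh that can show it, for the silly 10Hz display versus 100Hz, with and without the matching silly 100ms action tick:

```python
import random

# Rough sketch: average delay from "action happens" to "next display refresh
# that can show it", optionally quantised first by the game's action tick.
# All numbers are illustrative, not measurements.
def avg_action_to_display_ms(refresh_hz, tick_ms=0.0, trials=100_000, seed=1):
    rng = random.Random(seed)
    frame_ms = 1000.0 / refresh_hz
    total = 0.0
    for _ in range(trials):
        t = rng.uniform(0.0, 1000.0)                  # action happens at a random instant
        # the action is only registered by the game on its next tick (if any)
        t_seen = (t // tick_ms + 1) * tick_ms if tick_ms else t
        # the display scans out on its own clock, not phase-locked to the game
        phase = rng.uniform(0.0, frame_ms)
        next_refresh = ((t_seen - phase) // frame_ms + 1) * frame_ms + phase
        total += next_refresh - t
    return total / trials

for hz in (10, 100):
    print(f"{hz:>4} Hz display, instant tick : {avg_action_to_display_ms(hz):6.1f} ms average")
    print(f"{hz:>4} Hz display, 100 ms tick  : {avg_action_to_display_ms(hz, tick_ms=100.0):6.1f} ms average")
```

In this toy model the absolute gap between the two displays stays around 45ms either way, but once the 100ms tick dominates the total delay, that gap matters a lot less as a fraction of what the player actually waits.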
How did this thread on Linus Tech Tips Video Production youtube quality turn into a thread on monitor refresh rates and delays?
will on average still show up faster on the 100Hz monitor
Quote: How did this thread on Linus Tech Tips Video Production youtube quality turn into a thread on monitor refresh rates and delays?
the writing/management for production is an art/social-science (and maybe psychology)
the equipping is tech and science
the viewing is biology and psychology (I tried a bit of this part)
the business and marketing ...
how youtube processes the video is another pot of ...
plenty to go around
For what it is worth, BrianHG, I've read every message in this thread from the "what is significant in video/display" angle. Everything seems to flow pretty well if considering each message from that viewpoint, even if there has been considerable disagreement.
Quote: will on average still show up faster on the 100Hz monitor
Sure.
And even faster on a 1000Hz monitor, so why are we not all lusting after those? There is a point where it doesn't actually matter any more. Clearly, the silly 10Hz is not there and presumably your 100Hz is. Where is the point between those two markers where the gain isn't worth better kit?
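Just to put rough numbers on the diminishing returns (same simplification as above: on average you wait about half a refresh period for the next scan-out):

```python
# Rough illustration only: the average wait for the next refresh is about
# half the refresh period, so each step up in Hz buys less than the last.
for hz in (10, 60, 100, 500, 1000):
    print(f"{hz:>5} Hz -> ~{1000.0 / hz / 2:5.2f} ms average wait for the next refresh")
```

Going from 100Hz to 1000Hz only shaves the average from about 5ms to about 0.5ms, which is the sort of margin being argued about.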
When they do at a reasonable price, people will buy them
QuoteWhen they do at a reasonable price, people will buy them
Undoubtedly. People will buy anything if it has a bigger number than the old model. But are you seriously saying that a 1000Hz monitor will improve your experience over, let's say, a 500Hz monitor?
Quote: at 1000hz, a frame to frame dithering pattern
That's an interesting (not to say legitimate) take on it. Kind of sidesteps the point being argued, though.
Quote: When they do at a reasonable price, people will buy them
Quote: Undoubtedly. People will buy anything if it has a bigger number than the old model. But are you seriously saying that a 1000Hz monitor will improve your experience over, let's say, a 500Hz monitor?
Maybe not in a frame rate sense, but if the display has a limited number of shades, or if it is only 4-6 bit per color, at 1000Hz a frame-to-frame dithering pattern will allow a reproduction of any missing luminance values. This is particularly useful with superbright LED displays, as the darker shades are just too huge a step: the output is a linear step while our eyes work logarithmically. This is usually needed for huge LED wall type displays, where you want an above-240Hz refresh so that outdoor signs don't flutter in direct sunlight as people drive by, but that refresh speed lowers the number of available PWM shades.