A several-frame latency would of course make things even more obvious.
But I can assure you that you can perceive a difference, depending on the individual, down to a couple of milliseconds when the image is moving so fast that each frame differs noticeably from the last. Again, that's because of our ability to sense changes much faster, though much less accurately, than static detail. Of course we can't "see" individual frames, but we can perceive a change happening pretty quickly. The same goes for many other species, probably to optimize the detection of danger rather than an accurate analysis of it.
Of course, that only helps if the underlying video stream actually generates new frames as fast as the higher refresh rate. If you're feeding a 144Hz display with only 60fps or so, it's not going to look any better. It will probably look worse, because 60 doesn't divide evenly into 144, so frames end up shown for uneven durations.
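To illustrate that uneven-cadence point with a small sketch (plain arithmetic, assuming a simple fixed-rate display with no variable refresh): at 60fps on a 144Hz panel, each source frame ends up staying on screen for either 2 or 3 refresh cycles, which is visible as judder.

```python
# Sketch: each 60 fps source frame is scanned out on the first 144 Hz
# refresh at or after it arrives, so some frames stay on screen for 2
# refresh cycles and others for 3 -> uneven motion (judder).
def refreshes_per_frame(source_fps, refresh_hz, n_frames=10):
    """Number of refresh cycles each source frame stays on screen."""
    # frame i is first scanned out on refresh ceil(i * refresh_hz / source_fps);
    # -(-a // b) is exact integer ceiling division
    scanout = [-(-i * refresh_hz // source_fps) for i in range(n_frames + 1)]
    return [scanout[i + 1] - scanout[i] for i in range(n_frames)]

print(refreshes_per_frame(60, 144))   # uneven mix of 2s and 3s
print(refreshes_per_frame(72, 144))   # even: every frame held exactly 2 refreshes
```

Variable-refresh-rate displays (FreeSync/G-Sync) sidestep this by timing each refresh to the frame instead of the other way around.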
Except for the decreased latency and the usually shorter pixel persistence that come with higher-refresh-rate panels, which you can notice, a fast display obviously won't make a difference if the computer only feeds it frames that update more slowly than the refresh rate.
So yeah, pretty much the only applications where you're bound to be exposed to this difference are games. Keep the following in mind though, IMO:
First, the game in question must be able to generate frames at the elevated refresh rate. With modern games at high resolutions, this often requires a monster graphics card and CPU, and a very well-written game engine.
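Put in numbers (just frame-period arithmetic, nothing specific to any engine), the time budget the engine and GPU have to produce each frame shrinks quickly as the refresh rate climbs:

```python
# Per-frame time budget at various refresh rates: to saturate the
# display, the whole pipeline (game logic + render) must fit in this window.
for hz in (60, 120, 144, 240):
    print(f"{hz:3d} Hz -> {1000.0 / hz:5.2f} ms per frame")
```

At 144Hz the budget is about 6.9ms, well under half of the roughly 16.7ms available at 60fps, which is why the hardware demands grow so fast.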
Second, and probably most important for the average user: even though many people, as I said above, can perceive a difference at higher frame rates when watching fast-moving images, playing a game usually involves pretty intense focus. They won't necessarily notice any difference because their brain is busy with something else entirely. So whether it makes a difference in real use cases is, I admit, questionable.
Apart from how we perceive fast-moving images (movement possibly appearing smoother at higher frame rates, etc.), latency itself makes a difference.
Many people can definitely perceive even a 16ms latency (one frame at 60Hz) between a physical press of a button and the result of that action on screen. Well-written games use rich graphical effects that help make this latency less noticeable, though. Tricks!
And on a typical computer, you can expect the overall latency between an action on a gamepad and the reaction on screen to be significantly higher than the display latency alone. So again: yes, this makes an objective difference. But will you notice it?
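As a rough illustration of that latency chain, here's a sketch with made-up but plausible per-stage numbers; every figure below is an assumption for the sake of the example, not a measurement of any particular hardware:

```python
# Rough, illustrative end-to-end latency budget for a gamepad press.
# All per-stage numbers are assumptions chosen for the sketch, not
# measurements of any specific setup.
stages_ms = {
    "USB polling (gamepad)":   8.0,   # e.g. 125 Hz polling -> up to 8 ms
    "game loop / simulation": 16.7,   # one tick at 60 fps
    "render + GPU queue":     16.7,   # another frame in flight
    "display scanout + panel": 9.0,   # refresh wait + pixel response
}

total = sum(stages_ms.values())
for stage, ms in stages_ms.items():
    print(f"{stage:26s} {ms:5.1f} ms")
print(f"{'total':26s} {total:5.1f} ms")
```

The point is only that the display's own contribution is one slice among several, so a faster panel shaves off part of the total but doesn't eliminate it.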