Unfortunately, since I moved back to the US about 5 years ago, the one thing I have become convinced of is this: people can convince themselves of literally anything. I'm a skeptic because I'm forced to be. I also totally believe that you feel like you see a difference. What I'm trying to ascertain is at what actual rate that difference is seen. So far, I'm still the only one who has provided data. I don't think my personal opinion really counts, and nobody in this thread has convinced me theirs does either.
Nctnico related his experience with an improper experimental setup, and I can completely see the validity of that. I also realize that there is not going to be a single perfect number, and there are going to be some outliers (I would expect that the younger the person, the faster they may be able to process visual data).
Regardless, there is going to be an average limit for our eye/brain, beyond which buying a faster monitor truly buys you nothing and it is just a marketing ploy. Is that limit at 77Hz, 90Hz, 120Hz, 144Hz, etc.? Your experience with FPS games, versus scientific research, just isn't going to do it for me. If 80Hz were your hypothetical limit, then unless you had used monitors across multiple refresh rates between 60Hz and 144Hz, how would you actually know that 144Hz is better than, say, 81Hz? I'm not trying to condescend, and if you feel personally attacked by this "science", well, I'm not really sure what to say to that. You've obviously convinced yourself that your monitor is superior, so just go with it and be happy. I'm just trying to learn.
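For what it's worth, the kind of measurement I'm talking about doesn't even require exotic gear, just a proper procedure. Psychophysics has a standard tool for exactly this: an adaptive "staircase" that homes in on the rate where a person stops being able to tell the difference. Below is a minimal sketch in Python with a *simulated* observer standing in for a real test subject; the 80Hz limit and the logistic falloff are pure assumptions for illustration, not measured facts.

```python
import math
import random

random.seed(42)  # deterministic for the sake of the example

TRUE_LIMIT = 80.0  # hypothetical perceptual limit in Hz (assumption, not data)

def observer_detects(rate_hz, limit=TRUE_LIMIT, spread=3.0):
    # Simulated subject: probability of seeing a difference between a
    # display at rate_hz and a much faster reference falls off
    # logistically around their personal limit.
    p = 1.0 / (1.0 + math.exp((rate_hz - limit) / spread))
    return random.random() < p

def staircase(start=60.0, step=2.0, max_trials=200, reversals_needed=12):
    # Classic 1-up-1-down staircase: it converges on the rate where the
    # subject answers "I see a difference" 50% of the time.
    rate = start
    last_response = None
    reversals = []
    for _ in range(max_trials):
        seen = observer_detects(rate)
        if last_response is not None and seen != last_response:
            reversals.append(rate)  # direction changed: record a reversal
            if len(reversals) >= reversals_needed:
                break
        last_response = seen
        # Difference seen -> try a faster rate; not seen -> back off.
        rate += step if seen else -step
    # Discard the first few reversals (approach phase), average the rest.
    tail = reversals[4:]
    return sum(tail) / len(tail)

estimate = staircase()
print(f"estimated limit ~ {estimate:.1f} Hz")
```

Run a blinded version of this on real people with real monitors at intermediate rates, and you'd get an actual threshold per subject instead of an anecdote. That's the data I keep asking for.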
The research data already does show that 144Hz is better than 60Hz, because we can detect images faster than 60Hz. So maybe that is all that really matters here for most people. Personally, though, if 80Hz were fast enough, I would rather spend less GPU time. The point is probably moot if most monitors on the market faster than 60Hz are 144Hz anyway.