OK, you need to read the data sheet's worst-case drift figures for the I/O performance and do a little math.
In my example case, I have 16 tuning steps before my clock rotates a full 360 degrees.
MAX10-6 FPGA...
At 250 MHz, 7 of 16 steps give me true error-free data. (RAM underclocked)
At 300 MHz, 7 of 16 steps give me true error-free data.
At 350 MHz, 7 of 16 steps give me true error-free data. (FPGA CK/DQS/DQ I/O buffers overclocked)
At 400 MHz, 6 of 16 steps give me true error-free data.
At 450 MHz, 5 of 16 steps give me true error-free data. (FPGA DDR3 core and write-data DQ SERDES overclocked)
At 500 MHz, 5 of 16 steps give me true error-free data. (FPGA read-data DQ SERDES overclocked at this point)
With this, at 500 MHz / 1 GT/s, it is possible to calculate how many picoseconds of valid-data play I get when tuned to the middle of the window, and how many picoseconds each tuning step is worth; see the sketch below.
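Here is a minimal sketch of that math in Python. It assumes the step size is simply the full clock period divided by the 16 tuning steps, and that the valid window is roughly the number of error-free steps times the step size; the exact figures depend on how your PLL phase steps map onto the DQ/DQS relationship, so treat these as ballpark numbers only.

# Rough data-eye math for a 16-step clock phase tuner (ballpark only).
CLOCK_HZ   = 500e6   # memory clock
STEPS      = 16      # tuning steps per 360 degrees of clock rotation
GOOD_STEPS = 5       # steps that gave error-free data at this clock

clock_period_ps  = 1e12 / CLOCK_HZ           # 2000 ps at 500 MHz
step_ps          = clock_period_ps / STEPS   # 125 ps per tuning step
unit_interval_ps = clock_period_ps / 2       # 1000 ps per bit at 1 GT/s (DDR)

valid_window_ps = GOOD_STEPS * step_ps       # ~625 ps of error-free window
margin_ps       = valid_window_ps / 2        # ~312 ps of play either side of center

print(f"step size       : {step_ps:.0f} ps")
print(f"unit interval   : {unit_interval_ps:.0f} ps")
print(f"valid window    : {valid_window_ps:.0f} ps")
print(f"margin from mid : {margin_ps:.0f} ps")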
Also, there is one additional tuning point at one end of the window where around half of the 16 data bits are correct and the rest are jiggling.
It is this single transition point that should give you an idea of the timing skew between the 16 data bits.
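As a rough bound (my assumption, not a direct measurement): if half the bits flip within a single tuning step, the bit-to-bit skew is on the order of one step or less, e.g.:

# Rough upper bound on DQ bit-to-bit skew from the single 'jiggling' step.
CLOCK_HZ = 500e6
STEPS    = 16
step_ps  = 1e12 / CLOCK_HZ / STEPS   # 125 ps per tuning step at 500 MHz
print(f"bit-to-bit skew is roughly within one step, i.e. <= ~{step_ps:.0f} ps")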
Sorry, I cannot test above 500 MHz; there the MAX 10 completely fails to do anything. The DECA board I used has a single 800 MHz / 1600 MT/s, 16-bit DDR3 RAM chip.