I seem to be missing something here; maybe someone can explain.
I have a signal like in the picture.
I used the cursor feature to measure the period T, which is approximately 50 ns. The grid also confirms this: 2 divisions at 25 ns each is 50 ns.
Now, frequency F = 1/T, which is 1/0.000000025 = 40 MHz.
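In code form, this is the arithmetic I'm doing (just a quick sketch; the 0.000000025 s is the value from my calculation above):

```python
# Minimal sketch of the period-to-frequency conversion above.
T = 0.000000025      # the period value I plugged in, in seconds (25 ns)
f = 1 / T            # F = 1/T, in Hz
print(f / 1e6)       # prints 40.0, i.e. 40 MHz
```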
So why does the oscilloscope show 20 MHz, half of that? What is the real frequency in this example?