Thanks for the reply floobydust.
The input circuit is indeed recurring in many freq meter designs.
It can be traced back to the original Motorola App Note AN-581 for their ECL logic families, dating back to the late 1970s.
It is used in a 1GHz meter in Silicon Chip Australia of Nov 1987.
The 50MHz one you screencapped can be found in Silicon Chip Australia from Oct 2003.
As mentioned, it is also used in the 2.5GHz one in Silicon Chip Australia from Dec 2007.
And it was used again in the 55MHz meter in Silicon Chip Australia from Aug 2016.
The differences between all the above versions are mostly in the values of resistors and caps: the parallel input cap (between 18pF and 68pF), the resistors on both sides of the Offset Adjust pot (1k and 2k2), and the feedback resistors on the IC5c Schmitt trigger (100/100 ohm or 220/330 ohm). The Dec 2007 design deviates the most from the others: instead of feedback resistors in the hundreds-of-ohms range, it uses 180 ohm and 1k. There is also an erratum for the Dec 2007 design that changes the two 470 ohm resistors to ground on IC5c pins 2 and 3 to 360 ohms.
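Just to put rough numbers on why the Dec 2007 feedback values stand out, here's a quick back-of-the-envelope comparison. It assumes the usual positive-feedback divider topology (input-side resistor and feedback resistor forming a divider from the gate output back to its input), and the output-swing figure is a placeholder of my own, not something from any of the articles:

```python
# Rough comparison of the hysteresis set by the IC5c feedback resistors.
# Assumes the common positive-feedback divider, where the hysteresis window
# is roughly V_swing * R_in / (R_in + R_fb). V_SWING is a made-up example
# value, not taken from the Silicon Chip designs.

V_SWING = 0.8  # volts: assumed output swing of the gate

versions = {
    "100/100 ohm (older designs)": (100, 100),
    "220/330 ohm (older designs)": (220, 330),
    "180 ohm / 1k (Dec 2007)":     (180, 1000),
}

for name, (r_in, r_fb) in versions.items():
    v_hyst = V_SWING * r_in / (r_in + r_fb)
    print(f"{name}: hysteresis ~ {v_hyst * 1e3:.0f} mV")
```

If the topology really is the usual divider, the Dec 2007 values give a noticeably smaller hysteresis window than the earlier versions, which might be part of why that design behaves differently.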
As per my previous suspicion, and as noted by Emo/Eric, overdriving the input circuit definitely seems to be contributing to the problem. I used a resistor divider to bring the input amplitude down to <250 mV. The situation improves at lower frequencies (<100 kHz) but not between 1 and 10 MHz.
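For reference, the attenuation is just a plain two-resistor divider; the values below are placeholders to show the kind of ratio involved, not the exact parts on my bench:

```python
# Quick check of a divider that keeps the counter input under ~250 mV.
# Resistor and generator values are hypothetical examples only.

def divider_out(v_in, r_top, r_bottom):
    """Output amplitude of a simple two-resistor divider."""
    return v_in * r_bottom / (r_top + r_bottom)

v_gen = 2.0                    # volts peak-to-peak from the generator (example)
r_top, r_bottom = 3900, 470    # hypothetical divider values

v_out = divider_out(v_gen, r_top, r_bottom)
print(f"~{v_out * 1e3:.0f} mVpp into the counter input")  # ~215 mVpp
```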
(In the meantime, I put the PCBs in a box made from four grounded single-sided copper-clad boards to shield them from EMI ... but no improvement at all.)
So I need to keep digging.
The specs list the sensitivity for Channel A as <20mV below 20MHz, <75mV below 100MHz, and <250mV above 100MHz.
I was hoping someone who has also built the unit may have had similar experiences and was able to resolve them.
Or maybe someone who is using it can confirm that the meter needs relatively small signals to work properly.