This is a major, potentially show-stopping question about non-OEM FPGA software that I've never seen addressed.
FPGA manufacturers' software contains a complete timing model of the die. During fitting, every path is analysed to ensure it meets the worst-case timing specifications for the silicon, across all corners of voltage, temperature and process variation. Once fitting is complete, the timing analyser can report the achievable Fmax for each clock, compare it against the constraints you've supplied in the .SDC file, and give a yes/no indication of whether the device is guaranteed to function at the necessary speed.
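To make the role of the timing model concrete, here's a minimal sketch of the setup check a timing analyser performs for one register-to-register path. All delay and setup numbers here are hypothetical stand-ins; a real tool pulls per-cell, per-corner values from the vendor's characterised model, which is exactly the data a third-party tool lacks.

```python
def setup_slack_ns(clock_period_ns, clk_to_q_ns, logic_delay_ns,
                   routing_delay_ns, setup_time_ns):
    """Setup slack = clock period - (launch clk->Q + data path + capture setup)."""
    arrival = clk_to_q_ns + logic_delay_ns + routing_delay_ns
    return clock_period_ns - (arrival + setup_time_ns)

# Hypothetical per-corner delays for one path (ns). Real values come from
# the manufacturer's die characterisation, not from guesswork.
corners = {
    "slow, 85C, Vcc-min": dict(clk_to_q_ns=0.6, logic_delay_ns=3.2,
                               routing_delay_ns=1.9, setup_time_ns=0.4),
    "fast, 0C, Vcc-max":  dict(clk_to_q_ns=0.3, logic_delay_ns=1.6,
                               routing_delay_ns=0.9, setup_time_ns=0.2),
}

period_ns = 8.0  # e.g. a 125 MHz requirement from the .SDC file
worst = min(setup_slack_ns(period_ns, **c) for c in corners.values())
print(f"worst-case setup slack: {worst:.2f} ns ->",
      "PASS" if worst >= 0 else "FAIL")
```

The pass/fail verdict is only as trustworthy as the delay numbers fed in; with the corner values invented, so is the answer.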
Without that timing model, it's impossible to know for sure what clock frequency a given design might be able to run at. Even if it works in one device on the lab bench, there's no guarantee that another device from another batch will work without error, or that any device will continue to work when it gets hot, or cold, or if the core voltage varies within the limits of the spec.
Worse: there's no way to ensure that hold time requirements are met, and these don't depend on the clock frequency. A hold violation can't be fixed by slowing the clock down, so without the ability to check hold timing, it's impossible to know that a given design will work reliably at *any* frequency.
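The frequency-independence of the hold check is easy to see from the equation itself. The sketch below uses hypothetical fast-corner numbers (hold is checked at the fast corner, where data races the clock): note that the clock period appears nowhere in it.

```python
def hold_slack_ns(clk_to_q_ns, min_path_delay_ns, hold_time_ns,
                  clock_skew_ns=0.0):
    """Hold slack = earliest data arrival at capture flop minus the
    hold requirement (plus any skew). No clock-period term at all."""
    return (clk_to_q_ns + min_path_delay_ns) - (hold_time_ns + clock_skew_ns)

# Hypothetical fast-corner numbers for a short path (ns):
slack = hold_slack_ns(clk_to_q_ns=0.3, min_path_delay_ns=0.1,
                      hold_time_ns=0.5)
print(f"hold slack: {slack:.2f} ns ->", "PASS" if slack >= 0 else "FAIL")
# With these numbers the slack is -0.10 ns, and it stays -0.10 ns
# whether the clock runs at 1 MHz or 500 MHz.
```

That's the point: without the manufacturer's minimum-delay data, you can't even establish that a design is free of hold violations, never mind what its Fmax is.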
Timing specifications are such an integral part of FPGA design that it beggars belief anyone might consider reverse-engineering the configuration data format for an FPGA and producing third-party tools without first establishing how timing analysis will be done. Without knowledge of the manufacturer's guaranteed timing specs for the die, the best you could ever hope to do is demonstrate a device apparently working under lab conditions, with no way to know if it'll ever work when mass produced.
Ask the tool's author where they got the detailed timing model of the FPGA from, and how they use it. I'm curious to see if you get a satisfactory answer.