Seems like a trivial thing to have a fight over. If the original spec says the supply voltage is 5V, it will work at 5V or 5.17V or 4.9V unless the design is completely broken. And if it fails at 5.00V but seems to work at 5.17V, you have basically no margin at all, so expect it to work only slightly better, still not reliably.
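Just as a back-of-envelope sanity check, here is where those readings sit inside a generic ±5% tolerance band (4.75V to 5.25V); the ±5% figure is an assumption, typical for old 5V logic but not taken from any particular datasheet:

```python
# Rough sanity check: where does a measured supply voltage sit inside a
# nominal tolerance band?  The +/-5% band is an assumption, not a spec
# pulled from any particular datasheet.

def margin(v_measured, v_nominal=5.00, tolerance=0.05):
    """Headroom to the low and high spec limits, in volts."""
    v_min = v_nominal * (1 - tolerance)
    v_max = v_nominal * (1 + tolerance)
    return v_measured - v_min, v_max - v_measured

for v in (4.90, 5.00, 5.17):
    low, high = margin(v)
    print(f"{v:.2f}V: {low:.2f}V above minimum spec, {high:.2f}V below maximum spec")
```

A box that only behaves near the top of that band has essentially no headroom left in either direction.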
If some of the modules draw significant current, then it's entirely possible the internal power distribution (wire gauge, card edge connectors, backplanes, PCB trace widths, whatever...) is not up to it, and you certainly can compensate for the voltage drop by raising the voltage at the source. This is exactly what many USB power supplies do, outputting 5.2V so that you still have at least 4.8V or so left when you combine a crappy cheap cable with a device that charges at significant current.
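A quick Ohm's-law sketch of that USB case; the cable resistances below are illustrative guesses for cheap thin cables, not measurements:

```python
# Why USB supplies push the voltage up a bit: the IR drop in a cheap cable
# eats a few hundred millivolts at charging currents.  Cable resistances
# here are round-trip (both conductors) and purely illustrative.

def v_at_device(v_source, i_load, r_cable_roundtrip):
    """Voltage left at the device after the drop across the cable."""
    return v_source - i_load * r_cable_roundtrip

i_load = 1.0                           # amps, a plausible charging current
for r in (0.1, 0.25, 0.4):             # ohms, round trip
    print(f"R = {r:.2f} ohm at {i_load:.1f}A: "
          f"5.00V source -> {v_at_device(5.00, i_load, r):.2f}V at device, "
          f"5.20V source -> {v_at_device(5.20, i_load, r):.2f}V at device")
```

With a 0.4 ohm round-trip cable at 1A, a 5.00V source leaves 4.60V at the device while a 5.20V source leaves 4.80V, which is exactly the kind of trade-off described above.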
With increasing voltage comes the risk of overvoltage. 5.17V isn't too much for any sensible 5.00V rated system, but what about the accuracy of that regulator? You set it at 5.17V; how much will it drift, e.g. with temperature? Probably not much, but it's worth thinking about. Realistically you could start seeing problems somewhere beyond 5.5V, so there is still decent margin left, but sure, you have eaten into part of it.
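To put a rough number on the drift question; the 100 ppm/°C temperature coefficient below is only a placeholder, take the real figure from the regulator's datasheet:

```python
# Rough estimate of how far a 5.17V setpoint can wander with temperature,
# assuming the output drifts linearly with some temperature coefficient.
# 100 ppm/degC is a placeholder, not a datasheet value.

def drifted_output(v_set, tempco_ppm_per_c, delta_t_c):
    """Output voltage after a temperature change of delta_t_c degrees C."""
    return v_set * (1 + tempco_ppm_per_c * 1e-6 * delta_t_c)

v_set = 5.17
for dt in (10, 30, 50):                # degrees C above the trim temperature
    v = drifted_output(v_set, 100, dt)
    print(f"+{dt} degC: {v:.3f}V, still {5.50 - v:.2f}V below the 5.5V point")
```

Even with a fairly pessimistic tempco and a 50°C rise, the setpoint only creeps up by a few tens of millivolts, which is why the drift is probably not the thing to worry about here.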
If your practical experience is that setting the voltage to 5.17V instead of 5.00V reduces weird problems, it's quite possible you are right, but that says a lot about the design quality of the products you are working with.