The 0.35 rule of thumb is based on how some scopes' front ends interact with signals; that's all.
The 0.35 rule of thumb is calculated from the 10% to 90% transition time of a single-pole (or cascaded single-pole) response, which is not truly Gaussian but close enough. It is not based only on the front end of some oscilloscopes.
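The 0.35 falls straight out of the single-pole math. A quick sketch (assuming an ideal first-order RC low-pass) shows where the constant comes from:

```python
import math

# For a single-pole (first-order RC) low-pass with time constant tau:
#   step response:       v(t) = 1 - exp(-t / tau)
#   10% point:           t10 = tau * ln(1 / 0.9)
#   90% point:           t90 = tau * ln(1 / 0.1)
#   10%-90% rise time:   tr  = t90 - t10 = tau * ln(9)
#   -3 dB bandwidth:     BW  = 1 / (2 * pi * tau)
# The product BW * tr is therefore independent of tau:
k = math.log(9) / (2 * math.pi)
print(f"BW * tr = {k:.4f}")  # ~0.3497, i.e. the 0.35 rule
```

Cascading several poles pushes the effective constant slightly higher (closer to a Gaussian response), which is why some faster scopes are quoted nearer 0.40, but 0.35 is the single-pole figure.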
The problem with the simplistic rules of thumb based on period is that, while strictly true, they actively and repeatedly mislead beginners about important practical signals found in many systems.
The math is not misleading, but manufacturers are. If you are not marketing or sales, then you are overhead.
The 0.35 rule is still useful for estimating bandwidth from transition time in typical linear systems. It does not apply to non-linear systems, such as when slew-rate limits are reached in a Rigol DS1000Z series oscilloscope, or in any sampling oscilloscope. This distinction could be important here, but we can test it. Does the transition time of the square waves change with amplitude? If so, the 0.35 rule is not going to apply, and bandwidth measurements are going to change with amplitude.
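That amplitude test is easy to demonstrate numerically. A minimal sketch, assuming a first-order low-pass with an optional slew-rate clamp (the 100 V/µs limit and 1 ns time constant are illustrative values, not any particular scope's specs):

```python
import numpy as np

def rise_time_10_90(t, v):
    """10%-90% transition time of a rising step response."""
    vf = v[-1]
    t10 = t[np.argmax(v >= 0.1 * vf)]
    t90 = t[np.argmax(v >= 0.9 * vf)]
    return t90 - t10

def step_response(amplitude, tau=1e-9, slew_limit=None, n=20000, t_end=20e-9):
    """First-order low-pass step response, optionally slew-rate limited."""
    t = np.linspace(0, t_end, n)
    dt = t[1] - t[0]
    v = np.zeros(n)
    for i in range(1, n):
        dv = (amplitude - v[i - 1]) / tau * dt   # linear RC behavior
        if slew_limit is not None:
            dv = min(dv, slew_limit * dt)        # clamp the slope
        v[i] = v[i - 1] + dv
    return t, v

# Linear system: rise time (~2.2 ns here) is independent of amplitude
for a in (0.1, 1.0):
    t, v = step_response(a)
    print(f"linear, A={a}: tr = {rise_time_10_90(t, v):.2e} s")

# Slew-limited: large steps hit the clamp, so rise time grows with amplitude
for a in (0.1, 1.0):
    t, v = step_response(a, slew_limit=100e6)
    print(f"slewed, A={a}: tr = {rise_time_10_90(t, v):.2e} s")
```

In the linear case the 10%–90% time stays put as you scale the input; with the slew clamp engaged, the larger step takes several times longer, exactly the amplitude dependence that tells you the 0.35 rule no longer applies.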
Other than a sig-gen and scope, the OP hasn't specified a system, so how could we know what is or isn't important? But since he at least has a scope, think about why scope calibrator outputs are square waves. If I were using a typical 100MHz scope with a typical calibrator output whose rise time is 5µs, and sharpened that up to 50ns, would it matter?
It should not; the calibrator output is intended for low-frequency compensation of attenuating probes, and whether the transition time is 5 microseconds or 50 nanoseconds will not matter. Some calibrators produce fast edges and some produce slow edges, but in both cases they have low aberrations. Calibrating the high-frequency response requires a test signal that is considerably faster.