Transformers are very flexible.
A 120/60 volt transformer will work fine with 100 or with 140 volts of input. However, since the turns ratio is fixed, the output will change by the same percentage. (This holds around nominal voltage.)
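A minimal sketch of that proportionality, assuming an ideal 120/60 V transformer (function name and values are illustrative, not from any library):

```python
def secondary_voltage(v_in, v_primary_nominal=120.0, v_secondary_nominal=60.0):
    """Ideal output of a 120/60 V transformer: the fixed turns ratio
    simply scales whatever input voltage you apply."""
    turns_ratio = v_primary_nominal / v_secondary_nominal  # 2:1
    return v_in / turns_ratio

print(secondary_voltage(120.0))  # 60.0 V at nominal input
print(secondary_voltage(100.0))  # 50.0 V -- output sags by the same percentage
print(secondary_voltage(140.0))  # 70.0 V -- output rises by the same percentage
```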
The current is more important, since it determines the temperature, and the temperature determines whether the winding insulation resistance stays above several megaohms.
The frequency is also important: if you deviate from nominal frequency, derate properly. You can't run a 50/60 Hz transformer on 10 Hz, since the magnetizing impedance drops too low and the core saturates, leaving only the winding resistance to limit the current. You also can't run it on 1 kHz, since the leakage inductance presents too much reactance. Impedance stuff.
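Core flux scales roughly with V/f, so a common rule of thumb for derating is to keep volts-per-hertz at or below the nominal ratio. A hedged sketch of that arithmetic (the function and numbers are illustrative assumptions, not a standard formula from any datasheet):

```python
def max_voltage_at(freq_hz, v_nominal=120.0, f_nominal=60.0):
    """Maximum input voltage that keeps V/f at the nominal ratio,
    i.e. keeps peak core flux roughly at its design value."""
    return v_nominal * (freq_hz / f_nominal)

# A 120 V / 60 Hz unit run from 50 Hz mains should be derated to about:
print(max_voltage_at(50.0))  # 100.0 V
```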
You can use transformers for measurement. They will work fine on somewhat lower or higher voltages, within practical limits: you can't double the input voltage without saturating the core, but maybe ×1.2 is fine.
Transformers also have a noticeable voltage droop under load. The open-terminal voltage is always higher than nominal, and at nominal load the output is either a bit below or spot-on nominal voltage.
Sometimes the open-terminal voltage can be double the nominal voltage!
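This droop is usually quoted as "voltage regulation". A quick sketch of the calculation; the 13.8 V / 12.0 V readings below are made-up example numbers, not from the text above:

```python
def regulation_percent(v_no_load, v_full_load):
    """Voltage regulation: the no-load to full-load droop,
    expressed as a percentage of the full-load voltage."""
    return 100.0 * (v_no_load - v_full_load) / v_full_load

# A small "12 V" transformer might read 13.8 V with open terminals
# and sag to 12.0 V at rated load:
print(regulation_percent(13.8, 12.0))  # 15.0 %
```

Small transformers regulate worse than big ones, which is why an unloaded wall-wart can read far above its nameplate voltage.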
You cannot magically increase the VA rating, though.