IMHO this is one of those situations where things work counterintuitively.
The maximum undistorted sine amplitude for an amplifier (let's say a signal source) with a given slew rate and bandwidth is: slewrate/(2*pi*bandwidth) (*)
If we take a look at the rise time, we need a slew rate of 1V/10ns = 100MV/s for the 1V square wave and 1000V/10ns = 100GV/s for the 1000V square wave.
Rearranging the formula to calculate the bandwidth for a given slew rate and amplitude gives: bandwidth = slewrate/(2*pi*amplitude)
Fill in:
bw1 = 100M/(2*pi*1) = 15.9MHz
bw2 = 100G/(2*pi*1000) = 15.9MHz
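The arithmetic above can be checked with a few lines of Python; the function names are my own, and the 10 ns rise time is the one from the example:

```python
import math

def required_slew_rate(amplitude_v, rise_time_s):
    """Slew rate needed to traverse the full amplitude within the rise time (V/s)."""
    return amplitude_v / rise_time_s

def bandwidth(slew_rate_v_per_s, amplitude_v):
    """Bandwidth for a sine of the given amplitude: SR / (2*pi*A)."""
    return slew_rate_v_per_s / (2 * math.pi * amplitude_v)

for amp in (1.0, 1000.0):
    sr = required_slew_rate(amp, 10e-9)  # 10 ns rise time
    bw = bandwidth(sr, amp)
    print(f"{amp:6.0f} V: SR = {sr:.3g} V/s, BW = {bw / 1e6:.1f} MHz")
```

Both amplitudes come out at the same 15.9 MHz, because the amplitude cancels out of the two formulas.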
So the amplitude of a signal has no influence on the required bandwidth. I also ran some simulations with a 10MHz signal (which is closer to the frequencies of interest) to check the math, and there is no difference in the shape of the frequency spectrum.
Of course the amplitude of the 1000V square wave is much higher, so relative to the 1V square wave all the frequency components in the signal will be 1000 times stronger.
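That scaling follows directly from the linearity of the Fourier transform, and is easy to verify numerically. A minimal sketch, assuming ideal (instantaneous) edges and a hypothetical 1 MHz fundamental sampled at 1 GHz:

```python
import numpy as np

fs = 1e9                          # 1 GHz sample rate (hypothetical choice)
t = np.arange(0, 1e-5, 1 / fs)    # 10 us of signal
f0 = 1e6                          # 1 MHz square-wave fundamental
square = np.sign(np.sin(2 * np.pi * f0 * t))

# Magnitude spectra of the 1 V and 1000 V versions of the same waveform.
spec_1v = np.abs(np.fft.rfft(1.0 * square))
spec_1000v = np.abs(np.fft.rfft(1000.0 * square))

# Every non-empty bin is scaled by the same factor of 1000;
# the spectral shape (which frequencies are present) is identical.
mask = spec_1v > 1e-6
print(np.allclose(spec_1000v[mask], 1000.0 * spec_1v[mask]))
```

The same harmonics are present in both spectra; only their magnitudes differ by the amplitude ratio.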
(*) More math here:
http://www.ece.uprm.edu/~mtoledo/5207/2011/bw.pdf