The voltage capacity of a wire is limited only by its insulation. Most common wire is insulated to handle a few hundred volts, and it's sometimes possible to exceed this by spacing conductors apart and using air as the insulator (but consult local codes and safety regulations for details).
Suppose you had a pair of wires capable of carrying 5 amps. Let's say that, over the length of the run, they had a resistance of 0.1 ohm. The power dissipated in the wires would be I^2*R, or 2.5 Watts.
How much power could they deliver to a load? At 5V, they could carry 25 Watts. At 12V, they could carry 60 Watts. At 200V, they could carry 1000 Watts. In each case, the current would be 5 amps, and you'd have to put an additional 0.5V (2.5W) into the input side to deliver the full 5 amps at the rated voltage to the load.
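The arithmetic above can be sketched in a few lines of Python, using the same assumed figures (5 A of current and 0.1 ohm of total wire resistance over the run):

```python
# Wire-loss arithmetic from the example: 5 A current, 0.1 ohm run.
I = 5.0   # amps the wires carry
R = 0.1   # ohms, total resistance of the run

loss = I**2 * R          # power dissipated in the wires: 2.5 W
v_drop = I * R           # extra voltage the source must supply: 0.5 V
print(f"Wire loss: {loss} W, voltage drop: {v_drop} V")

for v_load in (5, 12, 200):
    p_load = v_load * I  # power delivered to the load at this voltage
    print(f"{v_load} V load: {p_load} W delivered; "
          f"source side must be {v_load + v_drop} V")
```

Note that the 2.5 W wire loss is the same in all three cases, because it depends only on the current and the wire resistance, not on the load voltage.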
The higher the voltage, the greater the power that can be delivered with a conductor of a given size, and the lower the transmission losses are when figured as a percentage of the power delivered to the load.
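That percentage is easy to compute for the same assumed 5 A / 0.1 ohm example: the fixed 2.5 W wire loss shrinks, relative to the delivered power, as the load voltage rises.

```python
# Wire loss as a percentage of delivered power (5 A, 0.1 ohm example).
I, R = 5.0, 0.1
loss = I**2 * R  # 2.5 W, independent of load voltage

for v_load in (5, 12, 200):
    p_load = v_load * I
    pct = 100 * loss / p_load
    print(f"{v_load} V: {p_load} W delivered, loss is {pct:.2f}% of that")
# 5 V -> 10%, 12 V -> ~4.17%, 200 V -> 0.25%
```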
If you play with the math and work through a few examples, you'll quickly see why relatively high voltage at relatively low current is favored for sending power over long distances.
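One such example, as a sketch: hold the delivered power fixed and vary the voltage. Since the current falls as 1/V, the I^2*R loss falls as 1/V^2 (the 1000 W load and 0.1 ohm run are assumed values for illustration):

```python
# Fixed delivered power, varying voltage: loss scales as 1/V^2.
R = 0.1          # ohms, assumed resistance of the run
P_load = 1000.0  # watts we want delivered

for v in (100, 1000, 10000):
    i = P_load / v       # current needed at this voltage
    loss = i**2 * R      # power wasted in the wires
    print(f"{v:>6} V: {i:7.3f} A, wire loss {loss:.4f} W")
```

Raising the voltage by a factor of ten cuts the current by ten and the wire loss by a hundred, which is exactly why long-distance transmission runs at high voltage.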