If it's a switching power supply, you most likely won't be able to alter it by that much. In most switching power supplies, components like the transformer are custom designed to work best at a particular output voltage and up to a particular current. The design may allow for roughly 20-30% of adjustment around the default voltage... for example, you might be able to adjust a 20v power supply between 16v and 24v, and maybe it could supply up to 6A.
Any bigger change would require altering the number of windings on the primary and secondary sides of the transformer, replacing other parts, and so on.
With classic transformers, it's just easier to design a transformer for a higher output voltage. A 20v AC 5A transformer may output 20v +/- 5%, while a 5v AC 20A transformer may only manage 5v +/- 10%, and at very light loads (let's say 0.1A) the output voltage will most likely be much higher, maybe 7v AC.
It is preferred to use higher voltages because of losses in transmission: for the same power, higher current means more loss, since the power dissipated in the wires grows with the square of the current (P = I^2 x R).
You have the standard Ohm's law V = IxR or Voltage equals Current x Resistance.
So let's say you have a power supply that can output either 20v 5A or 5v 20A and you have a 1 meter pair of AWG18 wires between the power supply and your device. AWG18 is the standard thickness of wires used in computer power supplies.
You know that AWG18 wire has a resistance of approximately 21 mOhm (0.021 ohm) per meter, according to Wikipedia:
https://en.wikipedia.org/wiki/American_wire_gauge#Tables_of_AWG_wire_sizes
Between the power supply and the device there are going to be two meters of wire (one meter out, one meter back), so the total resistance will be 42 mOhm.
So using that formula
at 20v and 5A, you're gonna lose V = 5A x 0.042 = 0.21v on the wires between psu and device, so your device actually "sees" 19.79v
at 5v and 20A, you're gonna lose V = 20A x 0.042 = 0.84v on the wires between psu and device, so your device actually "sees" 4.16v
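The two calculations above can be sketched as a small helper, using the AWG18 resistance figure from the answer (the function name and structure are just an illustration):

```python
# Approximate resistance of AWG18 wire, ohms per meter (figure quoted above)
WIRE_RES_PER_M = 0.021

def voltage_at_device(supply_v, current_a, wire_len_m):
    """Voltage the device actually 'sees' after the drop across the wires.

    wire_len_m is the total wire length (out + back), e.g. 2 m for a
    1 m pair between the power supply and the device.
    """
    drop = current_a * (wire_len_m * WIRE_RES_PER_M)  # V = I x R
    return supply_v - drop

print(voltage_at_device(20, 5, 2))   # 20v 5A case: drops 0.21v
print(voltage_at_device(5, 20, 2))   # 5v 20A case: drops 0.84v
```

Running it reproduces the numbers above: about 19.79v at the device in the first case, about 4.16v in the second.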
So you can see that at high currents there are big losses in the wires. Computer power supplies counteract that by using multiple pairs of wires... for example, a 6-pin PCI-e connector, standardized for 75w of power, uses 3 pairs of wires to transfer 12v at up to 6.25A to a video card (12v x 6.25A = 75w), so each pair of wires only has to carry around 2A of current, making the losses in the wires very small.
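As a quick sanity check of the multi-pair trick, here is the same drop calculation split across pairs (the numbers are the ones from the text; the 42 mOhm per pair reuses the 2 m AWG18 figure from the earlier example):

```python
def drop_per_pair(total_w, volts, pairs, pair_res_ohm):
    """Voltage drop across each wire pair when the load is split evenly."""
    current_per_pair = (total_w / volts) / pairs  # amps each pair carries
    return current_per_pair * pair_res_ohm        # V = I x R

# PCI-e 6-pin: 75 W at 12 V is 6.25 A total, ~2.08 A per pair.
# With 42 mOhm per pair, the drop is under 0.1 V per pair...
print(drop_per_pair(75, 12, 3, 0.042))
# ...versus ~0.26 V if a single pair carried all 6.25 A:
print(drop_per_pair(75, 12, 1, 0.042))
```

Tripling the number of pairs cuts the current per pair, and with it the drop, by a factor of three.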
Higher current also means you need beefier connectors. For example, you can't use a USB connector for 5v and 20A, and you wouldn't even be able to use a basic barrel jack connector, as most can only handle around 6-10A max.
You also have to be aware of how connectors handle insertion and removal: at 20A, plugging or unplugging something can cause sparks, and the metal of the contacts can pit from the arcing.
You also have to be aware of the current rating of connectors. For example, the Mini-Fit Jr. series from Molex (the connectors used for PCI-e power on video cards) is rated for 9A per pin. It would not be smart to transfer 20A through a 2x2 connector of that series, as 20A split across the 2 current-carrying pins exceeds the recommended maximum of 2 x 9A = 18A.
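That per-pin check is simple enough to write down. The 9A figure is the Mini-Fit Jr. rating quoted above; the function and the pin counts are just illustrative:

```python
def connector_ok(total_current_a, pins_per_rail, rating_per_pin_a):
    """True if the current, split evenly across the pins carrying it,
    stays within the per-pin rating."""
    return total_current_a / pins_per_rail <= rating_per_pin_a

print(connector_ok(20, 2, 9))  # 20A over 2 pins -> 10A/pin, over the 9A rating
print(connector_ok(20, 3, 9))  # 20A over 3 pins -> ~6.7A/pin, within the rating
```

So a 2x2 Mini-Fit Jr. fails the check for 20A, while a connector with one more current-carrying pin per rail would pass.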