This is the kind of question that cannot be answered without knowing your needs. But we can provide information to help you decide.
First, I would second the suggestion of several above: find out whether your power supply has remote sense capability. If it does, take time to understand what that is. Simply put, it measures the voltage at your output point and adjusts the supply output to compensate for any voltage drop in the lines.
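A toy numerical sketch of the idea, assuming a hypothetical 5 V setpoint, 5 A load, and 0.04 ohm of total lead resistance (none of these values come from your setup):

```python
# Hypothetical values for illustration only.
V_SET, I_LOAD, R_LEADS = 5.0, 5.0, 0.04

drop = I_LOAD * R_LEADS            # voltage lost in the leads (Ohm's law)
v_load_no_sense = V_SET - drop     # what the load sees without remote sense
v_out_with_sense = V_SET + drop    # supply raises its output to compensate

print(f"Lead drop: {drop:.2f} V")
print(f"Load voltage without remote sense: {v_load_no_sense:.2f} V")
print(f"Supply output with remote sense: {v_out_with_sense:.2f} V "
      f"(load still sees {V_SET:.2f} V)")
```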
Second, you can either do your own Google search or look at the attached table to find the resistance of the wire you propose using, then apply Ohm's law (V = I × R) to find how much voltage you will lose under your proposed operating conditions. Looking at the table, with your 5 amp maximum current, 2 meters of 18 gauge wire (approximately 1 mm diameter) would give about a 0.2 volt drop, which would be fine in many applications but cause problems in others. Going to 12 gauge wire (approximately 2 mm diameter) cuts this by a factor of roughly 4, to about 0.05 volts.
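If you want to play with the numbers, here is a minimal sketch of that calculation, assuming typical copper resistance figures in ohms per meter (check your own wire table for exact values for your wire):

```python
# Approximate resistance of copper wire in ohms per meter; assumed
# typical values, not taken from the attached table.
RESISTANCE_PER_M = {
    18: 0.0210,  # 18 AWG, ~1 mm diameter
    12: 0.0052,  # 12 AWG, ~2 mm diameter
}

def voltage_drop(awg, length_m, current_a):
    """Ohm's law: V = I * R, with R = (ohms/meter) * total wire length."""
    return current_a * RESISTANCE_PER_M[awg] * length_m

for awg in (18, 12):
    print(f"{awg} AWG, 2 m at 5 A: {voltage_drop(awg, 2.0, 5.0):.3f} V drop")
# 18 AWG, 2 m at 5 A: 0.210 V drop
# 12 AWG, 2 m at 5 A: 0.052 V drop
```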
Only you can decide how small a drop is small enough, but 0.05 V is small enough for almost all applications.
Finally, you should spend a moment worrying about the ampacity of the wires, that is, how much current they can safely carry. The wire sizes mentioned above are fine at 5 amps, but if you have reason to use smaller wire and can accept the voltage drops (or use remote sensing), you should do some googling on the proposed size to make sure you won't start any fires.
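As a rough sanity check, something like the following works, assuming conservative chassis-wiring ampacity limits for copper (these are ballpark figures I've assumed; real limits vary with insulation, bundling, and ambient temperature, so consult an actual ampacity table for anything marginal):

```python
# Approximate chassis-wiring ampacity limits for copper, in amps.
# Assumed ballpark values; verify against a real table before relying on them.
AMPACITY_A = {18: 16, 16: 22, 14: 32, 12: 41}

def is_safe(awg, current_a):
    """True if the proposed current is within the assumed ampacity limit."""
    return current_a <= AMPACITY_A[awg]

print(is_safe(18, 5.0))  # True: 5 A is well within 18 AWG's limit
```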