ok, you'll probably get lots of answers.
1) LEDs are, as their name says, diodes. Diodes only allow current to flow one way, unless the voltage is high enough to cause reverse breakdown, which can damage the diode unless it is specifically designed for it (zeners). Because an LED is a semiconductor junction, the voltage across it must exceed a minimum value before significant current flows; this is called the 'forward voltage'. For silicon diodes this is generally 0.6V - 0.7V, depending on current. For LEDs, this can be anywhere from ~1.5V to 3.7V depending on colour/material. So, if you were to provide a constant current through the LED, say 20mA, then the LED will have a voltage across it which again depends on colour/material.
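To give a rough feel for how forward voltage varies by colour, here is a little sketch; the numbers are only ballpark figures I'm assuming as typical, so check the datasheet for your actual part:

```python
# Rough, typical forward voltages by LED colour (assumed ballpark values only;
# the datasheet for your specific LED is the authority).
TYPICAL_VF = {
    "red":    1.8,
    "yellow": 2.0,
    "green":  2.2,   # older standard green; modern pure-green parts are nearer 3.0V
    "blue":   3.2,
    "white":  3.2,
}

for colour, vf in TYPICAL_VF.items():
    print(f"{colour:>6}: ~{vf} V forward drop")
```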
If you connect 9V across an LED, the LED will drop say 1.8V for a red one, but the power supply is 9V, so there will be 9 - 1.8V = 7.2V across the connections to it. Using Ohm's law, V = IR, rearranged to find current, I = V/R, we can work out how much current will flow. If the connections to the LED are just wires with no resistor, the wires still have resistance, but it is generally very low, say 0.5 ohms. Plug that into our formula and we get I = V/R = 7.2V/0.5 = 14.4 amps. Given that most 5mm LEDs are rated for 30mA max, that will clearly destroy the LED.
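As a quick sketch of that arithmetic (the 0.5 ohm wiring resistance is just the rough guess used above):

```python
# Sketch of the arithmetic above: 9V supply, red LED (~1.8V drop),
# connected with nothing but wire (~0.5 ohm is the rough guess used above).
V_SUPPLY = 9.0   # volts
VF_LED = 1.8     # volts, red LED forward voltage
R_WIRE = 0.5     # ohms, rough wiring resistance

i = (V_SUPPLY - VF_LED) / R_WIRE
print(f"Current with wire only: {i:.1f} A")  # ~14.4 A, vastly above a ~30 mA rating
```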
Also, if the power source is a 9V battery, you can't really get 14 amps out of it, so the current will be limited by the battery's internal resistance, but that will still mean >800mA can flow, again destroying the LED. We add a resistor for 'current limiting'. This means, if we want the LED to be bright but happy, it needs say 20mA. If it has a Vf (forward voltage) of 1.8V, then we can calculate the resistor required:
9 - 1.8V = 7.2V. That is the voltage across our resistor. We want 20mA = 0.02A, so again, plug the numbers in... R = V/I = 7.2 / 0.02 = 360 ohms. 360 ohms, whilst available, isn't one of the most common values, so rather than drop to 330, which would increase the current and possibly take it over the maximum, we pick the next highest common value, say 470 ohms, giving I = V/R = 7.2/470 ≈ 15.3mA.
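A tiny sketch of that trade-off, comparing rounding down to 330 ohms against rounding up to 470 ohms:

```python
def led_current(v_supply, vf, r):
    """Current through an LED + series resistor: Ohm's law applied to the resistor."""
    return (v_supply - vf) / r

r_exact = (9.0 - 1.8) / 0.020                                    # 360 ohms for exactly 20 mA
print(f"exact resistor: {r_exact:.0f} ohms")
print(f"330 ohms -> {led_current(9.0, 1.8, 330)*1000:.1f} mA")   # ~21.8 mA, over the 20 mA target
print(f"470 ohms -> {led_current(9.0, 1.8, 470)*1000:.1f} mA")   # ~15.3 mA, safely below it
```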
In your example, a 3V 20mA LED with 9V: the voltage across the resistor is 9 - 3 = 6V. For 300 ohms, the current flowing through the LED and resistor is I = V/R = 6/300 = 0.02A = 20mA.
If you were to increase your power supply voltage, this would increase the current, because it would increase the voltage across the resistor, whereas the '3V' LED will always drop 3V - although this does increase slightly with current and changes with temperature, but not by much. So say you now use 12V: 12V - 3V = 9V, and I = V/R = 9V/300 = 30mA. This is quite high for a standard '20mA' LED (note, the 20mA figure is common for 5mm LEDs, but not necessarily a 'standard'). So if your power supply is going to vary, you should calculate the maximum and minimum current that can flow to make sure your LED doesn't die.
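Here's a little worst-case check in that spirit, using your 3V/300 ohm numbers; the 20mA continuous rating is the figure from your example, not a universal limit:

```python
# Worst-case check for your example: 3V LED, 300 ohm resistor,
# with a supply that might be 9V or 12V. 20mA is the continuous
# rating from your example; check your LED's datasheet for its real limits.
VF = 3.0         # volts
R = 300.0        # ohms
I_RATED = 0.020  # amps

for v_supply in (9.0, 12.0):
    i = (v_supply - VF) / R
    note = "within rating" if i <= I_RATED else "above rating - risky"
    print(f"{v_supply:>4.0f} V supply -> {i*1000:.0f} mA ({note})")
# 9 V supply -> 20 mA (within rating)
# 12 V supply -> 30 mA (above rating - risky)
```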
2) 'Current consumption' is fairly easy to calculate for a purely resistive load, like... a resistor! But semiconductors have roughly fixed voltage drops that don't vary much with current, which complicates matters. It is true an LED can 'draw' lots of current, because by itself it does not limit how much current can flow; it does, however, limit the voltage across itself, leaving the rest of the supply voltage to appear across its power connections. As above, it is these connections that should limit the current.
The only difference between a 12V 1A DC power supply and a 12V 500mA (0.5A) power supply is how much current each is capable of providing whilst keeping its output voltage at 12V. Therefore, you should pick a power supply with the correct voltage, and a current capability that is higher than you require. If you have a circuit that draws, say, 250mA maximum, then using a 200mA supply could mean the power supply drops its voltage (as is the case with mains transformer supplies and batteries) or has a protection circuit that kicks in and temporarily shuts off the supply to protect itself. If it's the first kind, you might get 250mA out, but its voltage might drop to 10.5V... and that can change how much current your circuit will draw, and perhaps change how the circuit behaves or even stop it working altogether.
There are so many cheap switching 'wall-wart' power supplies these days (and USB is a handy way to power things) that you have a lot of choice. What is important is that it is regulated and has the correct voltage for your circuit. As long as its current capability is higher than you need, with a good margin for error (12V 500mA is a pretty good choice for small circuits that require up to 400mA), then it's fine.
But there is no point in lighting up a few 5mm LEDs that draw 60mA with a 12V supply capable of 10A. The LEDs (and their current-limiting resistors!) will still only draw ~60mA, but if you make a mistake and accidentally short the power supply, 10A can melt things.
Too long but I tried to explain everything.