USB 2.0 is intended to provide 5V +/- 0.25V at up to 0.5A (500mA), so a total of 2.5 watts.
USB 3.0 raises that limit to 0.9A, for a maximum of 4.5 watts.
Normally, a computer motherboard or a USB hub is supposed to limit the output to those values, but to make things cheaper there's often no circuit to limit the output current. On most motherboards there's just a basic resettable fuse shared by a group of 2-4 USB ports, configured for a 2A limit (so a device on any one of those ports could draw up to 2A).
Wires have resistance. If you move a lot of current through wires, there's going to be some voltage drop on the cable.
Typical USB cables have 24 AWG wires, which have a resistance of about 85 mOhm per meter. For simplicity, let's round that up to 100 mOhm (0.1 ohm).
Now, imagine you have a 1m long cable between your phone charger and the USB port on your device. Since current flows out on the +5V wire and back on the ground wire, there are 2 meters of wire completing the circuit.
So you have the basic formula Voltage = Current x Resistance.
At 100mA (0.1A) of current, the voltage drop over the 1m USB cable is Voltage = 0.1A x (2 x 0.1 ohm) = 0.02V, so if your phone charger outputs 5V, you get 4.98V at the end of the 1 meter cable.
At 500mA (0.5A) of current, the drop is Voltage = 0.5A x 2 x 0.1 ohm = 0.1V, so your charger outputs 5V but at the end of the cable you get only 4.9V.
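The arithmetic above can be sketched quickly. This is just the Ohm's-law figures from the text (0.1 ohm per meter per conductor is the rounded assumption, and the round trip doubles the length):

```python
# Voltage drop over a USB cable, per the figures above.
# Assumption: 24 AWG conductors rounded up to 0.1 ohm/m; current flows
# out on the +5V wire and back on ground, so round-trip resistance is
# twice the cable length times the per-meter resistance.

def drop(current_a, cable_len_m, ohms_per_m=0.1):
    round_trip_r = 2 * cable_len_m * ohms_per_m
    return current_a * round_trip_r

for i in (0.1, 0.5, 2.0):
    v = 5.0 - drop(i, 1.0)
    print(f"{i}A over a 1m cable: {drop(i, 1.0):.2f}V drop, {v:.2f}V at the device")
```

At the 2A fuse limit mentioned earlier, the same 1m cable would drop 0.4V, leaving only 4.6V at the device, already below the 4.75V spec minimum.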
Note that there is also some resistance where the connector on the cable and the connector on the device mate, but that resistance is usually below 10 mOhm (0.01 ohm), so it's generally ignored.
By default, a phone will take at most 0.5A from a USB 2.0 port, as the specification requires. That's why the phone charges slowly from a computer port: it's a genuine, standards-compliant USB port.
Phone chargers (adapters) resort to some tricks to tell the phone that they can supply more than 0.5A.
Basically, they know the data wires in the USB cable will never carry data while the phone is charging, so they put specific voltages on those wires; depending on the combination of the two voltages, the phone decides whether it may draw more than 0.5A. If there's no voltage on those wires, it sticks to 0.5A.
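As an illustration of that phone-side decision (the voltage thresholds and current values here are simplified examples, not an authoritative table; the lygte-info page linked below lists the real vendor schemes), the logic amounts to a lookup on the D+/D- voltages:

```python
# Illustrative sketch of charger detection from D+/D- voltages.
# The thresholds and current limits below are assumptions for the
# example; real phones implement BC1.2 plus vendor-specific checks.

def max_current(d_plus, d_minus):
    # No voltage on the data lines: assume a plain data port, 0.5A.
    if d_plus < 0.3 and d_minus < 0.3:
        return 0.5
    # D+ and D- at the same voltage (e.g. shorted together inside a
    # BC1.2-style "dedicated charging port"): more current is allowed.
    if abs(d_plus - d_minus) < 0.1:
        return 1.5
    # Two distinct divider voltages: a vendor-specific code.
    return 1.0

print(max_current(0.0, 0.0))  # plain data port -> 0.5
```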
Also, you should know that more modern phones support Quick Charge, which works by having a chip inside the phone charger. The phone can talk to that chip and tell the charger to change the voltage from 5V to 9V or 12V in order to charge the battery faster. If the charger doesn't support Quick Charge, or there's no such chip at the other end of the cable (as would be the case when you plug the USB cable into a computer), the phone sticks to only 0.5A.
edit: Have a look at this, scroll down to see various ways of indicating maximum charge current :
http://lygte-info.dk/info/USBinfo%20UK.html

However I bought a standard USB A socket and wired it to a breadboard and powered it directly from my Tenma bench supply. I started at 5V but noticed the USB analyser reported only 4.85V, whereas on the wall-wart charger it indicated a steady 5.00V. So I wound the voltage up to 5.20V on the PSU and got 5.00V on the analyser.
This is when things got weird. The mobile phone didn't like this input; it was highly suspicious of it. The current would rise to about 200mA, promptly drop out for a second, then come back up, fluctuating almost randomly, as if the phone was testing the 'charger' voltage and rejecting it. The phone just didn't like it, although it didn't report anything other than "Charging off AC". It thought it was an AC adapter because D-/D+ were disconnected. Other devices worked fine on it.
As I explained above, your wires were probably already too thin: dropping from 5V to 4.85V with just 200mA implies about 0.75 ohm of series resistance. The breadboard contacts probably added even more resistance, lowering the voltage further.
The phone probably saw 4.85V, which is within 5V +/- 0.25V, but when it tried to draw up to 500mA, the resistance of the wires and the breadboard probably pulled the voltage outside the 4.75V to 5.25V range, which made the phone stop charging.
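To put rough numbers on that guess (assuming the 5.00V to 4.85V sag at 200mA was purely resistive):

```python
# Back-of-envelope check of the breadboard setup described above.
# Assumption: the 0.15V sag at 200mA comes entirely from series
# resistance in the wires, breadboard contacts, and connector.

sag_at_200ma = 5.00 - 4.85
series_r = sag_at_200ma / 0.2          # implied series resistance, ohms

for vin in (5.0, 5.2):
    v_at_phone = vin - series_r * 0.5  # voltage the phone sees at 500mA
    ok = 4.75 <= v_at_phone <= 5.25
    print(f"{vin}V supply, 500mA load -> {v_at_phone:.3f}V "
          f"({'inside' if ok else 'outside'} the 4.75-5.25V window)")
```

With the supply at 5.0V, the phone would see about 4.63V at 500mA, well below spec; even at 5.2V it sits around 4.83V, close enough to the edge that any extra contact resistance or a brief current spike pushes it out.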
You should measure the voltage with a fast multimeter or an oscilloscope and watch what the voltage does at the exact moment the phone starts drawing current.
The 100nF and the 330nF are the minimum required for the regulator to work. The datasheet assumes you have the power supply near the regulator, not a meter or so away on the desk.
It would help to add an electrolytic capacitor on the input (say 10uF or more; almost any value is fine, though above about 470uF extra capacitance makes no real difference) and an electrolytic capacitor on the output (say up to 100uF), or at least a 1uF ceramic capacitor.
A linear regulator produces 5V by dissipating the difference as heat. So with 7.5V in, 5V out and 0.36A, that means (7.5V - 5V) x 0.36A = 0.9W dissipated as heat. If you look in the datasheet, you will see a thermal-resistance value like 30°C/W over ambient; this translates to "for every watt dissipated, the temperature of the chip rises 30°C above room temperature". Another value in the datasheet tells you the maximum temperature the chip can handle (usually 105°C or 125°C, or somewhere around that value). Basically, if your chip runs above 80-90°C, you're supposed to add a heatsink.
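The dissipation math above as a sketch (25°C ambient is an assumed room temperature, and 30°C/W is just the example thermal-resistance figure; check your regulator's datasheet for the real value):

```python
# Linear-regulator heat estimate, using the numbers from the text.

v_in, v_out, i_load = 7.5, 5.0, 0.36  # volts in, volts out, amps
theta = 30.0    # °C per watt above ambient (example datasheet value)
ambient = 25.0  # assumed room temperature, °C

power = (v_in - v_out) * i_load       # watts dissipated as heat
temp = ambient + power * theta        # rough chip temperature
print(f"dissipation: {power:.2f} W, chip around {temp:.0f} °C")
```

Here 0.9W of heat puts the chip around 52°C, comfortably below the 80-90°C point where a heatsink becomes necessary; doubling the current or raising the input voltage changes that quickly.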
The low current could be caused by lack of capacitance on the input and output, and fluctuations on the input voltage caused by the thin wires you used. Some capacitors on input and output would help a bit. Thicker wires would also help.
Also note that as a battery charges, it draws less current. At 20% the phone may charge at 700mA, but at 40% the battery may take in only 500mA or less.