You have at least two choices.
You can put a small resistor (a current shunt) in series with the load, measure the voltage drop across it, and optionally amplify that drop into a convenient range (0–5 V, for example). In this case you need a third voltage-sense wire after the shunt: constantly read the voltage there and adjust the regulator so the output stays where you want it. The more current the devices draw, the larger the drop across the shunt and the lower the voltage downstream, so your regulator actually has to output slightly more than the nominal voltage to compensate.
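A quick sketch of the math involved, using made-up numbers (a 0.05 Ω shunt and an op-amp gain of 20 are assumptions for illustration, not recommendations):

```python
# Hypothetical sense chain: 0.05 ohm shunt, op-amp gain of 20,
# so 1 A of load current produces 1.0 V at the ADC input.
SHUNT_OHMS = 0.05
AMP_GAIN = 20.0

def current_from_sense(v_adc):
    """Convert the amplified shunt voltage back to load current (A)."""
    v_shunt = v_adc / AMP_GAIN   # undo the amplifier gain
    return v_shunt / SHUNT_OHMS  # Ohm's law: I = V / R

def corrected_setpoint(v_target, load_current):
    """The regulator must output the target voltage plus the shunt drop."""
    return v_target + load_current * SHUNT_OHMS

print(current_from_sense(1.0))       # 1 A of load current
print(corrected_setpoint(5.0, 1.0))  # regulator must aim ~5.05 V
```

In a real circuit this feedback runs continuously, so the correction tracks the load as it changes.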
If super accuracy is not that important, you can use one of those Hall-effect chips, which measure the current without a shunt resistor (there is some resistance in the chip's internal conductor, but it's constant and tiny, on the order of a milliohm, so it's negligible) ... for example see the ACS712:
http://www.digikey.com/product-detail/en/ACS712ELCTR-05B-T/620-1189-1-ND/1284606

It outputs a voltage proportional to the current flowing through the conductor, without meaningfully affecting the voltage on the wire. You can read that output with a small microcontroller and convert it to amps; with a microcontroller you could also "calibrate" each chip and correct the readings in software to get higher accuracy.
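The conversion on the microcontroller side might look like the sketch below. The numbers are assumptions you should verify against the datasheet: the ±5 A variant is commonly specified around 185 mV per amp, centered at Vcc/2 (2.5 V on a 5 V supply) with no current flowing, and a 10-bit ADC (0–1023) is assumed:

```python
# Sketch: converting the ACS712's analog output to amps via a 10-bit ADC.
# SENSITIVITY and ZERO_OFFSET are nominal datasheet-style values for the
# assumed 05B variant -- calibrate each board for real accuracy.
VCC = 5.0
ADC_MAX = 1023
SENSITIVITY = 0.185    # volts per amp (assumed, +/-5 A variant)
ZERO_OFFSET = VCC / 2  # ideal no-current output voltage

def adc_to_amps(raw, zero_offset=ZERO_OFFSET):
    """Turn a raw ADC reading into a current estimate in amps."""
    volts = raw * VCC / ADC_MAX
    return (volts - zero_offset) / SENSITIVITY

# Calibration idea: measure the raw reading at zero load and pass it
# in as zero_offset, which cancels that chip's individual offset error.
print(round(adc_to_amps(512), 2))  # mid-scale reading, roughly 0 A
```

This per-chip zero-offset correction is the simplest form of the calibration mentioned above; a second known-current measurement would also let you correct the sensitivity.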