If you use a voltage source and you set the voltage at 50V, the current through the 10 ohm lamp will be 5A and through the 20 ohm lamp it will be 2.5A.
I don't understand this. The more power-hungry the lamp, the fewer amps it needs? Seems like the wrong way round?!
Or is it the case that the amps are what's left over after work has been extracted by the resistance?
No, this is bass ackwards!
Current is a result in most cases. A lamp will have a rated voltage and a rated power. Let's say 100W at 120V. From P = E^2 / R (E squared over R), we can calculate that the hot resistance is 144 Ohms. We can calculate the hot current from P = I^2 * R, or I = sqrt(P / R), which in this case is 0.8333 Amps. We can also use P = I * E to check the watts: 100 = 0.8333 * 120 (which it does). All of the manipulations of Ohm's Law should be consistent; P, I, E and R are all tied together.
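Here is a quick sketch of those manipulations in Python, just to show that the numbers all agree (the 100W / 120V figures are the example values from above):

```python
from math import sqrt

# Rated values for the example lamp (from the text above)
P_rated = 100.0   # watts
E_rated = 120.0   # volts

# Hot resistance from P = E^2 / R  ->  R = E^2 / P
R_hot = E_rated ** 2 / P_rated    # 144 ohms

# Hot current from P = I^2 * R  ->  I = sqrt(P / R)
I_hot = sqrt(P_rated / R_hot)     # 0.8333 amps

# Cross-check with P = I * E
P_check = I_hot * E_rated         # back to 100 watts

print(R_hot, I_hot, P_check)      # 144.0  0.8333...  100.0
```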
So, what is really happening? We apply a voltage to a lamp. The internal resistance (assume hot) connected to a certain voltage will produce a certain current flow. That current flow times the voltage will result in a certain amount of power being dissipated. Mostly, it will come off as heat in the case of incandescent lamps.
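To put that chain (voltage, then current, then power) against the 50V example from the question, here is a minimal sketch. The lower-resistance lamp draws more current and therefore dissipates more power, which is why the question had it backwards:

```python
V = 50.0                      # applied voltage from the question
for R in (10.0, 20.0):        # the two lamp resistances
    I = V / R                 # Ohm's law: current is the result
    P = I * V                 # power dissipated, mostly as heat
    print(f"{R:>4} ohm lamp: {I:.1f} A, {P:.0f} W")
# 10.0 ohm lamp: 5.0 A, 250 W   <- more current AND more power
# 20.0 ohm lamp: 2.5 A, 125 W
```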
What I have avoided saying is that the incandescent lamp changes resistance as the filament heats up. The value is much lower when cold. You can just grab a 100W lamp and measure the cold resistance if you are interested. I suspect it will be quite low and nowhere near 144 Ohms. But I don't want to go there...
If you really want to chase the topic down a rathole, calculate the instantaneous current when you apply 120V to that cold filament resistance. There are a lot of reasons why the current may not go quite that high but it does explain why lamps burn out when they are switched on.
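If you do want to peek into that rathole, here is a rough sketch. The 1/10 cold-to-hot resistance ratio is purely an assumption for illustration (it is in the right ballpark for a tungsten filament, but measure your own lamp rather than trusting this number):

```python
E = 120.0            # line voltage
R_hot = 144.0        # hot resistance from the 100W example above
R_cold = R_hot / 10  # ASSUMPTION: cold filament roughly 1/10 of hot resistance

I_inrush = E / R_cold   # instantaneous current at switch-on, ~8.3 A
I_steady = E / R_hot    # steady-state current, ~0.83 A

print(f"inrush ~{I_inrush:.1f} A vs steady {I_steady:.2f} A "
      f"({I_inrush / I_steady:.0f}x)")
```

In practice the current won't quite reach that figure, for the reasons hinted at above, but the order-of-magnitude jump is real and is why lamps tend to fail at switch-on.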