I've found it is easier to understand how battery power works once you realize that capacity, in itself, has nothing to do with voltage. When we say that a battery has a certain number of "mAh", we mean that it stores a certain amount of charge, which you can drain faster or slower depending on how much current (i.e. charge per time) you pull out of the battery. The current is the only thing that matters as far as the battery is concerned: a 3000mAh battery will be drained by about 3 hours of sustained 1A current regardless of whether the battery or pack is 1.5V, 24V, 120V or whatever.
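To make the arithmetic concrete, here is a minimal sketch of that runtime calculation, using the hypothetical 3000mAh/1A numbers from above:

```python
# Hypothetical numbers: a 3000 mAh pack drained at a constant 1 A.
capacity_mah = 3000   # battery capacity in milliamp-hours
current_ma = 1000     # sustained discharge current in milliamps (1 A)

# Runtime only depends on charge and current, not on the pack's voltage.
runtime_hours = capacity_mah / current_ma
print(runtime_hours)  # 3.0
```

Note that the pack voltage never appears in this calculation; it only enters the picture once a load with its own voltage requirement is attached, as described next.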
The battery's voltage becomes important when you take into account that the battery is powering a load with certain voltage and/or current requirements. If your load needs 5V, for instance, you'll need to place a regulator between the load and the battery, and the current drawn from the battery will then be greater or lower than the current your load actually needs, depending on whether the regulator has to boost or buck the battery's voltage to get to the 5V. More specifically (assuming an ideal, lossless regulator):
I_battery = I_load * (V_load / V_battery)
and since runtime depends on I_battery:
runtime = capacity / I_battery = (capacity / I_load) * (V_battery / V_load)
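The two formulas above can be sketched as a short function. It assumes an ideal (lossless) regulator, and all of the numbers in the usage lines are hypothetical:

```python
def runtime_hours(capacity_mah, i_load_ma, v_battery, v_load):
    """Runtime of a battery feeding a regulated load, ideal regulator assumed."""
    # I_battery = I_load * (V_load / V_battery)
    i_battery_ma = i_load_ma * (v_load / v_battery)
    # runtime = capacity / I_battery
    return capacity_mah / i_battery_ma

# A 5 V load drawing 500 mA, powered from a 3000 mAh pack:
print(runtime_hours(3000, 500, 3.7, 5.0))   # boosting from 3.7 V: ~4.44 h
print(runtime_hours(3000, 500, 12.0, 5.0))  # bucking from 12 V: 14.4 h
```

Notice that the higher-voltage pack lasts longer at the same load current, exactly because the regulator draws less current from it.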
Thus, the battery voltage participates in the runtime calculation only insofar as the battery has to be adapted to a load with a different voltage requirement. Note that you can have circuits that run directly off the battery's voltage, but then they will have to tolerate significant changes in voltage (up to 50%) over the course of the discharge, and they will likely not use the battery's full capacity efficiently.