Every power source has an internal resistance. Imagine an ideal supply set to 10 V: if the source impedance or resistance is 0, infinite current can flow through your source/supply, and no matter what load you put on it, the voltage will stay at 10 V. This, however, is never the case. Every power source (battery, power supply, ham radio, etc.) has a source impedance or resistance that can be modeled as some Ω or R±jX value in series with an ideal voltage source. Typically this is a low resistance; for the sake of discussion, let's call it 1 Ω. This 1 Ω (inside the battery/source) determines the current that can flow out of (and back into) the battery at whatever voltage the internal chemistry/circuitry provides.

Now, say you have a 10 Ω load. That 10 Ω load is in series with the 1 Ω source resistance, so a voltage divider is formed. Because the same current flows through both resistances, the voltage on the load is 10/11 of the ideal source voltage (even when measured right at the battery terminals). When the load and source have the same resistance, you will see half the voltage on the load (which can make a battery very hot or destroy a power supply). And just like solar panels, Peltier modules, etc., for a multitude of unique reasons, there is often a ratio of voltage to current that is most efficient for a given device, which sometimes leads to the voltage and current not changing proportionally with a change in load.
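If it helps to see the arithmetic, here is a minimal sketch of that voltage-divider math. The 10 V, 1 Ω, and 10 Ω values are just the example numbers from above, not anything special:

```python
# Voltage-divider effect of a source's internal resistance (example numbers from above).

V_IDEAL = 10.0   # ideal (open-circuit) source voltage, volts
R_SOURCE = 1.0   # internal resistance of the source, ohms
R_LOAD = 10.0    # load resistance, ohms


def loaded_voltage(v_ideal: float, r_source: float, r_load: float) -> float:
    """Voltage seen at the terminals once the load draws current."""
    return v_ideal * r_load / (r_source + r_load)


def load_current(v_ideal: float, r_source: float, r_load: float) -> float:
    """Current through the series loop of source resistance plus load."""
    return v_ideal / (r_source + r_load)


print(loaded_voltage(V_IDEAL, R_SOURCE, R_LOAD))    # ~9.09 V, i.e. 10/11 of 10 V
print(loaded_voltage(V_IDEAL, R_SOURCE, R_SOURCE))  # 5.0 V when load resistance equals source resistance
print(load_current(V_IDEAL, R_SOURCE, R_LOAD))      # ~0.909 A through the whole loop
```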
Edit: Do not put an ohmmeter across a battery or supply to determine the source resistance; you will destroy it. The way you would do that for circuits designed to have matched impedances (signal generators, audio equipment, ham radio, etc.) is to vary the load while monitoring the voltage and see what load resistance makes the voltage drop to half of the unloaded value; that load resistance equals the source resistance. This shouldn't be an issue for someone working with DC electronics, though, and if you run into it, simply get a battery/supply rated for a bit more current than you intend to draw at that voltage.
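As a rough illustration of that "halve the voltage" method, here is a sketch that simulates it. In real life you would step through physical load resistors while watching a voltmeter; here the source is simply modeled with the example 10 V / 1 Ω numbers, so all the values are hypothetical:

```python
# Simulated version of the half-voltage method for finding source resistance.

V_OPEN = 10.0    # unloaded (open-circuit) voltage reading, volts
R_SOURCE = 1.0   # the "unknown" internal resistance, used here only to simulate meter readings


def simulated_reading(r_load: float) -> float:
    """What the voltmeter would show with this load attached."""
    return V_OPEN * r_load / (R_SOURCE + r_load)


# Step the load from 0.1 ohm to 5 ohm and keep the value whose reading is closest to half of V_OPEN.
candidates = [round(0.1 * i, 1) for i in range(1, 51)]
best = min(candidates, key=lambda r: abs(simulated_reading(r) - V_OPEN / 2))
print(best)  # ~1.0 ohm: the load that halves the voltage equals the source resistance
```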