I tried to hook up a multimeter across the terminals of the battery to check the internal resistance, but it wouldn't read.
Why isn't the multimeter capable of measuring this? Is it as simple as the multimeter only outputting a few volts, so the battery's own potential prevents the meter from taking a reading? The DMM is a Fluke 177.
The Ohm mode can only measure "naked" resistors, i.e. components with no voltage source of their own.
A battery, however, is a voltage source in series with its internal resistance!
This 12..14.4V battery voltage will definitely disturb the Ohm measurement of any DMM.
With simpler instruments, such misuse might even destroy the meter.
Fortunately, a Fluke DMM is well protected against this kind of abuse.
Therefore, the only way is to measure the voltage drop at the battery poles while connecting a known current load. The internal resistance is then the differential resistance R = dU/dI.
You can do that easily in the car: with the engine NOT running, measure the pole voltage, then switch on the headlights and measure again.
The bulbs usually draw about 110W in total, which is roughly 9A. Simply divide the voltage drop by those 9A and you're done. A healthy battery has only a few milliohms of internal resistance, so the drop should be < 100mV @ 9A.
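The headlight procedure above boils down to one division. A small sketch, with illustrative voltage readings (not real measurements):

```python
# Estimate a lead-acid battery's internal resistance from two voltage
# readings: open-circuit, then under a known load (e.g. headlights).

def internal_resistance(v_open, v_loaded, i_load):
    """R_internal = dU/dI, approximated as (V_open - V_loaded) / I_load."""
    return (v_open - v_loaded) / i_load

# Example: 12.60 V open circuit, 12.53 V with ~9 A of headlight load
r = internal_resistance(12.60, 12.53, 9.0)
print(f"Internal resistance: {r * 1000:.1f} mOhm")  # ~7.8 mOhm: healthy
```

A 70 mV drop at 9 A gives about 8 milliohms, comfortably inside the "few milliohms, < 100mV @ 9A" range for a good battery.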
The other, more important aspect is the battery capacity, which cannot be measured so easily.
Btw: to charge the battery, you have to apply 14.4V constant voltage.
12V will not charge your battery at all!
The charging current has to be limited, and the internal resistance is far too low to do that job by itself!
Therefore, you have to limit it externally, either with a power supply that has a current limit, or with an external power resistor.
Example: limit to 5A from a 14.4V / 80W PSU. R = (14.4V - 12V) / 5A ≈ 0.48 Ohm, so use an external resistor of about 0.47 Ohm, rated for at least 12W (P = I²R = 5² × 0.48).
Frank