Will it decrease my runtime?
80 A × 0.1 V = 8 W dissipated in the cables.
That's roughly 1% of the total power draw (8 W out of ~960 W at 12 V), so even completely eliminating cable loss would gain you only about 1% in runtime.
Doesn't seem worth spending any more on thicker cables to me!
In addition to this 1% loss, low-voltage cutoff triggers 0.1 V earlier than it otherwise would. As a crude approximation (a linear model where 13.0 V = 100% and 11.0 V = 0% state of charge), that 0.1 V corresponds to about 5% of state of charge, so expect roughly that much further loss in runtime.
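The two loss figures above can be checked with a quick back-of-envelope script (the 12 V nominal voltage and the linear 13.0 V–11.0 V state-of-charge model are the assumptions stated above):

```python
# Back-of-envelope check of the cable-loss and early-cutoff estimates.
# Assumed values: 12 V nominal system, 80 A load, 0.1 V total cable drop.
current = 80.0         # A
cable_drop = 0.1       # V, total drop across the cable run
system_voltage = 12.0  # V nominal

cable_loss = current * cable_drop         # 8 W dissipated in the cables
total_power = current * system_voltage    # ~960 W total draw
loss_fraction = cable_loss / total_power  # ~0.83%, i.e. "roughly 1%"

# Crude linear state-of-charge model: 13.0 V = 100%, 11.0 V = 0%.
soc_per_volt = 1.0 / (13.0 - 11.0)             # 50% of SoC per volt
early_cutoff_loss = cable_drop * soc_per_volt  # 5% of capacity left unused

print(f"cable loss: {cable_loss:.0f} W ({loss_fraction:.1%} of draw)")
print(f"early-cutoff loss: {early_cutoff_loss:.0%} of capacity")
```

Note that the second effect (5%) dominates the first (under 1%): the drop mostly hurts by tricking the cutoff circuit, not by heating the wire.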
Also, according to the Wikipedia table, 3 AWG wire has an ampacity rating of 85 A at a 40 °C temperature rise.
So you are at the minimum cable size right now, with some loss in runtime (approximately 5%) and cables that will run quite warm, though not dangerously so. It's OK, but I would consider going one or two sizes up. "1/0" would be approaching overkill.