I think at 5A ±10mA, the complications may not be necessary.
(1) Say you use a 0.1 ohm resistor: at 5A, the burden voltage is 0.5V. 0.5V is a lot, so a smaller resistance is probably preferred. But even at that 0.5V, you are talking just 2.5 watts (I²R = 25 × 0.1). Even a small 1" fan can cool your shunt adequately well. To get a lower burden voltage, say by running a pair of 0.1 ohm resistors in parallel, you are down to a 0.25V drop and merely 1.25 watts. Get 4 in parallel and you are down to 0.625 watts of heat and a mere 0.125V drop.
(2) If you are going to the trouble of calibrating, getting 0.2% accuracy with even a lowly ATMEGA328 is doable. If you get a reasonable 12-bit ADC, that would make the job a lot easier. But good calibration takes patience - a lot of it.
(3) I don't think individual component accuracy is as necessary as consistency. So what if 2.3V across the shunt is measured as 1.7V, as long as it is the same every time? Your MCU can translate that with good calibration data. You need resolution (more bits from the ADC) and consistency.
I made an ATMEGA328-based volt logger not so long ago. I used a UT61E to calibrate it. If I do not change/unplug/touch the test-lead wires between calibration and verification, I can readily get around 0.2% "accuracy" from about 50mV to 200mV. When I plug/unplug test leads, the change in contact resistance/quality introduces 1-2mV of error. My 0-200mV range is op-amp multiplied up to 0-5V for the ADC. "Accuracy" here means agreeing with the device (UT61E) I used to calibrate the ATMEGA. If I were to solder the test lead directly to the shunt as opposed to using a plug-in wire, getting below 0.2% error from 50mV to 200mV is rather doable even with the lowly ATMEGA328.
I think with a good (consistent) 12-bit or 16-bit ADC and 5 to 10 0.1 ohm shunts in parallel, it should be a cakewalk.
Making the calibration table is the key. The more effort there, the more accurate the result. You can make all kinds of adjustments there in your MCU code, and even correct for the non-linearity of the op-amp and the %error in the "not exactly 0.1 ohm" 0.1 ohm resistors.
The calibration I did was to sweep the voltage across the range (ADC = 0 to max), log the ADC reading, and log the real voltage (in my case, the UT61E reading) as you sweep. Have the logging program store them into a text file, and build adjustment table(s) from them. In short, your ADC reading is not a variable used to calculate the result, but rather an index into a "real reading" table. So, say you store ADC=40 with a real reading of 4.333V and the system is consistent; you don't need to care what ADC=40 should calculate to based on your multiplier or resistor values. Instead, just look up the real voltage in the stored table. If your average ADC reading is 40.5 for a sampling period, then you interpolate the midpoint between the 40 and 41 entries. It works very nicely and takes care of any non-linearity, resistors not being exact, and so forth. As long as consistency is there, you get back the exact calibrated value. But it is a lot of work to program a calibration scheme and do the calibration.
With my volt logger, it takes me 2-3 days to calibrate each range "quick". I had 7 ranges, so I accepted less than the best. I also had to squeeze 20 calibration tables into 32K of flash, so working up a scheme there was a lot of work too. I can see spending a week or two sweeping the voltage and correcting the calibration table entries afterward to make fine-fine-and-refined adjustments if you want very, very good results.
Rick