Your regulator needs 1V of overhead. That leaves about 2V extra to burn off. If you were actually drawing 800mA, the series power resistor should be 2V / 0.8A = 2.5 ohms, dissipating 2V x 0.8A = 1.6W. So you would want a 2.5 ohm resistor rated for at least 2W.
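The sizing above is just Ohm's law plus P = V x I. A quick sketch of the arithmetic, using the values assumed in this thread (2V of excess headroom, 800mA max load):

```python
# Series-resistor sizing for the full-load case.
# Assumed values from the discussion: ~2 V of excess voltage to
# drop across the resistor, at an 800 mA maximum load.
v_drop = 2.0   # volts the resistor should drop at full load
i_max = 0.8    # amps, maximum load current

r = v_drop / i_max   # resistance needed (ohms)
p = v_drop * i_max   # power the resistor must dissipate (watts)

print(f"R = {r:.1f} ohm, P = {p:.1f} W")
```

Swap in your own headroom and max current to re-run the numbers for a different supply.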
At full 800mA load, the resistor dissipates 2/3 of the heat; the regulator dissipates only 0.8W (1V x 0.8A), instead of the full 2.4W (3V x 0.8A) it would eat with no resistor. You just cut the thermal load of the regulator by 66%.
But at only 400mA, the 2.5 ohm resistor will only drop 1V, so the regulator will be dropping 2V at 400mA. That is still 0.8W, which, without a heatsink, is still hot. If your expected max current draw is only 400mA, you can use a 5 ohm resistor with a suitable rating instead; that gets the regulator down to 0.4W. If you have intermittent burst draw over 400mA, you can beef up the output caps as a compromise.
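To see why the 5 ohm part helps at the lighter load, here is the same arithmetic for both resistor choices at 400mA, assuming 3V total excess above the regulated output (of which the regulator itself must keep ~1V):

```python
# Regulator dissipation at a 400 mA load, for the two resistor
# values discussed. Assumes 3 V total excess above the output;
# whatever the resistor doesn't drop, the regulator must burn.
v_excess = 3.0   # total volts above the regulated output
i_load = 0.4     # amps, the lighter load

p_reg = {}
for r in (2.5, 5.0):
    v_res = r * i_load                     # volts dropped by the resistor
    p_reg[r] = (v_excess - v_res) * i_load # watts left in the regulator
    print(f"{r} ohm: resistor drops {v_res:.1f} V, "
          f"regulator dissipates {p_reg[r]:.1f} W")
```

With 2.5 ohms the regulator still sees 0.8W; with 5 ohms it drops to 0.4W, which matches the figures above.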
Basically, choose the lowest max load that will work for you, to get the highest resistance possible. This will probably work out better than the 5V regulator, which only takes roughly 1/3 of the load off the LDO at any current draw (a bit more under some circumstances, but not much).