I don't see why using two IR emitters in series would be a problem: their Vf is 1.6V max, so they'd still be fine when your 18650 is <3.2V. Of course, as your battery voltage decreases, their output will too, but it isn't easy to work out by how much, since their forward voltage also drops with decreasing current. Also, as it's a remote, almost all protocols have relatively low duty cycles, both to save the batteries in the remote and to allow for higher peak power, increasing the remote's range.
For argument's sake, let's say your 18650 is fully charged to 4.2V, and the worst-case Vf for the emitters is 1.4V. (4.2 − 2×1.4) / 4.7 ≈ 300mA through the emitters. That isn't massive with regard to a lot of remotes. Many common remotes have a single emitter running off two alkaline cells with a 10–22R resistor.
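To see how the peak current falls off as the cell discharges, a quick back-of-the-envelope calculation (the 1.4V worst-case Vf and 4.7R resistor are from above; the discharge voltages are just illustrative points, and this ignores Vf dropping at lower currents):

```python
VF_WORST = 1.4   # worst-case forward voltage per emitter, V
R_SERIES = 4.7   # series resistor, ohms

def emitter_current(v_batt, vf=VF_WORST, r=R_SERIES, n_leds=2):
    """Peak LED current in amps, treating Vf as fixed."""
    return max(v_batt - n_leds * vf, 0.0) / r

for v in (4.2, 3.7, 3.2):
    print(f"{v:.1f} V battery -> {emitter_current(v) * 1000:.0f} mA peak")
# 4.2 V battery -> 298 mA peak
# 3.7 V battery -> 191 mA peak
# 3.2 V battery -> 85 mA peak
```

So the peak current drops to roughly a third of its fully-charged value by the time the cell is near empty; in practice it falls a bit less steeply, because Vf shrinks as the current does.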
One thing I would check, preferably actually test, is the MOSFET's turn-on time. I have used several of the modern wonders that are SOT23 MOSFETs with single- or double-digit milliohm on-resistance. They *tend* to have quite long turn-on/off times, as well as higher gate capacitance (I say higher because plenty have tiny gate capacitance and so work nicely up to hundreds of kHz). The carrier is 38kHz, but for the reasons above it might not be 50% duty. Say it's 25%: for a carrier period of 26.3µs, the pulse width is ~6.6µs. As the GPIO will be limited in its drive capability, that 6.6µs has to cover not just the turn-on delay, but the time spent charging the gate.
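The carrier-timing arithmetic, as a sketch (the 38kHz and 25% duty figures are from above; the 1µs switching budget is a made-up placeholder, so substitute your MOSFET's datasheet numbers at your actual gate-drive voltage and source impedance):

```python
F_CARRIER = 38e3   # IR carrier frequency, Hz
DUTY = 0.25        # assumed carrier duty cycle

period_us = 1e6 / F_CARRIER    # carrier period, ~26.3 us
pulse_us = period_us * DUTY    # on-pulse width, ~6.6 us

# Hypothetical total switching loss per edge (turn-on delay + rise time),
# to show how much of the pulse a slow gate eats:
t_switch_us = 1.0
print(f"period {period_us:.1f} us, on-pulse {pulse_us:.2f} us")
print(f"~{100 * t_switch_us / pulse_us:.0f}% of each pulse lost to switching")
```

If the combined delay and rise time approach a microsecond or two, a meaningful slice of each 6.6µs pulse is spent with the LEDs only partly on, which shows up as reduced effective duty cycle and range.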
You could probably simulate it in LTspice, as there are plenty of MOSFET models in there. Ultimately, though, it's not like you're switching amps here, or constantly transmitting for minutes at a time - I can't see anything popping, it just might produce a carrier with a low duty cycle. But things to think about! If in doubt, knock up a dirty prototype and test/measure. Discretes are so cheap you probably have to order 10 at a time anyway.