Wrong question again. The current doesn't increase or decrease based on the INCOMING power.
What are you trying to say? The actual input current draw has been experimentally measured by the OP and some others, and that's a result you can't just ignore. It's also a fair assumption that the device is not a shunt regulator, since that would be totally insane (a lot more insane than a linear pass type regulator). In other words, power to the cell = power in minus controller losses, and those losses will be a fairly consistent percentage for a switch-mode regulator. More voltage AND more current in ---> DEFINITELY more current into the cell, unless the design is totally crazy (a shunt regulator).
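To put rough numbers on it, here's a back-of-the-envelope sketch in C. The 90% efficiency, the 3.8 V cell voltage and the input figures are my own assumed values, not anything measured by the OP; the point is just the power balance of a buck-type charger:

```c
#include <stdio.h>

/* Rough buck-charger power balance: everything that comes in, minus
 * converter losses, ends up in the cell. All numbers below are
 * assumptions for illustration only. */
int main(void)
{
    const double eff    = 0.90;  /* assumed switch-mode efficiency       */
    const double v_cell = 3.80;  /* assumed cell voltage during CC phase */

    double p_in_nominal = 5.0 * 1.0;   /* 5 V, 1.0 A measured at the input  */
    double p_in_higher  = 5.2 * 1.3;   /* higher voltage AND higher current */

    double i_cell_nominal = eff * p_in_nominal / v_cell;
    double i_cell_higher  = eff * p_in_higher  / v_cell;

    printf("cell current, nominal input: %.2f A\n", i_cell_nominal);
    printf("cell current, higher input : %.2f A\n", i_cell_higher);

    /* With roughly constant efficiency, more power in can only mean
     * more current into the cell (unless it is a shunt regulator). */
    return 0;
}
```

With those assumed numbers you get roughly 1.18 A into the cell at 5 V / 1 A in, and roughly 1.60 A at 5.2 V / 1.3 A in. The exact figures depend on the assumed efficiency and cell voltage, but the direction cannot change.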
The current is regulated by the charging profile for the particular battery being charged. It is a much more complex process than you seem to be imagining. It is based on the beginning state of charge, the temperature, how fast the battery is accepting the charge, etc. etc. etc. And in many cases there is a microprocessor inside the battery pack which keeps track of the previous discharge cycle to further optimize the latest charge cycle.
I, as a battery management system designer, would tend to say that it's a lot simpler than you seem to be imagining. Although the industry has produced over-complex examples, which mostly tend to fail due to wrong assumptions.
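To illustrate just how simple it really is: the core of a Li-ion charger is constant current until the cell reaches its voltage limit, then constant voltage until the current tapers off. Here's a minimal sketch; the function names are hypothetical hardware hooks and the thresholds are typical round numbers, nothing specific to these devices:

```c
#include <stdbool.h>

/* Hypothetical hardware hooks for a generic charger front-end. */
extern double measure_cell_voltage(void);       /* volts                  */
extern double measure_cell_current(void);       /* amperes, into the cell */
extern void   request_charge_current(double a); /* CC phase setpoint      */
extern void   request_charge_voltage(double v); /* CV phase setpoint      */

/* One step of a minimal CC-CV charge loop, called periodically.
 * Returns true when charging is complete. */
bool charge_step(void)
{
    const double V_LIMIT = 4.20;  /* per-cell CV limit, typical value   */
    const double I_CC    = 1.00;  /* CC phase current, roughly 0.5 C    */
    const double I_TERM  = 0.05;  /* terminate once taper falls below   */

    if (measure_cell_voltage() < V_LIMIT) {
        request_charge_current(I_CC);   /* constant-current phase */
        return false;
    }

    request_charge_voltage(V_LIMIT);    /* constant-voltage phase; current tapers on its own */
    return measure_cell_current() < I_TERM;
}
```

Temperature derating and fault handling sit on top of that, but the charge profile itself is nothing more mysterious than the above.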
These iDevices do not have a battery pack, they have a single cell. The previous discharge cycle has absolutely nothing to do with any "optimization" of the charge cycle; you are just throwing in fancy weasel words. Please stop; that belongs in Kickstarter scams, not on this forum IMO. Thank you.
Li-ion is the simplest battery type in use so far; it has practically 100% (typically >99.9%) coulombic efficiency, practically no "memory effects", and no Peukert effect. All losses are ohmic, i.e., voltage drops. It is always true that charge in = charge out. Lithium plating does limit the allowable charging current, especially at high voltages and low temperatures, but if the freaking device is actually measured taking in more charging power, then that power is most definitely going into the cell. That leads to one of two conclusions: either they normally don't use 100% of their "allowed" charging current, or they are exceeding their "allowed" budget, causing more damage than the cell is designed for.
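That "charge in = charge out" property is exactly why plain coulomb counting works so well on Li-ion, with no chemistry-dependent correction factor. A trivial sketch, with the function name and units being my own choices for illustration:

```c
/* Coulomb counting: integrate cell current over time. Because Li-ion
 * coulombic efficiency is ~100%, the accumulated charge directly tracks
 * what has gone into (or out of) the cell; no efficiency factor needed,
 * unlike lead-acid or NiMH. measure_cell_current() is a hypothetical
 * hardware hook. */
extern double measure_cell_current(void);  /* +A charging, -A discharging */

static double charge_As = 0.0;             /* accumulated charge, in A*s  */

void coulomb_count_tick(double dt_s)
{
    charge_As += measure_cell_current() * dt_s;
}
```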
It is quite possible that a mere 30% increase in charging current could shorten the lifetime of the cell from 500 to 100 cycles if it happens under just the right conditions, but it's not very probable that they are running that close to the limit.