I have a piece of equipment that takes 25 W of power to run.
If I run it from 220 V AC it is expected to draw 0.11 A.
If I run it from 12 V DC it is expected to draw 2.08 A.
I based the current consumption on I = P/V.
Is that correct? The higher the voltage, the less current it needs to run?
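Here is a quick sketch of that arithmetic in Python (just I = P/V applied to both voltages; the numbers are the ones above):

```python
# Current drawn by a 25 W load at two supply voltages, assuming the
# device is designed for each voltage and simply obeys I = P / V.
power_w = 25.0

for volts in (220.0, 12.0):
    amps = power_w / volts
    print(f"{volts:5.0f} V -> {amps:.2f} A")

# Output:
#   220 V -> 0.11 A
#    12 V -> 2.08 A
```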
If you have, say, a domestic 220 V incandescent lamp like we used before LED ones became available, and try to operate it from 12 V DC, it won't work at all.
If, on the other hand, you have a 12 V automotive tail lamp and connect it across 220 V AC, it will explode!
Looking at your original figures, 220 V and 0.11 A, and using Ohm's Law in the form R = V/I, we have
R = 220/0.11, so R = 2000 ohms.
Now for the case where V = 12 V.
Applying this value of R to Ohm's Law in its more usually quoted form, I = V/R, we have I = 12/2000
= 0.006 A, or 6 mA: a far cry from 2.08 A!
By the way, the power will be 0.072 watts.
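Here is the same fixed-resistance arithmetic as a short sketch (it treats the 220 V lamp as a constant 2000-ohm resistor; a real filament's resistance varies with temperature, so this is only the idealized calculation):

```python
# A 220 V lamp treated as a fixed resistance, then run from 12 V.
rated_volts = 220.0
rated_amps = 0.11

resistance_ohms = rated_volts / rated_amps   # R = V / I  -> 2000 ohms

low_volts = 12.0
low_amps = low_volts / resistance_ohms       # I = V / R  -> 0.006 A
low_watts = low_volts * low_amps             # P = V * I  -> 0.072 W

print(f"R = {resistance_ohms:.0f} ohms, "
      f"I = {low_amps * 1000:.1f} mA, "
      f"P = {low_watts:.3f} W")
# R = 2000 ohms, I = 6.0 mA, P = 0.072 W
```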
I won't work out the tremendous current that your 12 V device will draw from the 220 V supply, but the lesson from this is that the same device cannot be used at such widely differing voltages.
Yes, a 25 W device designed for 220 V would draw 0.11 A, and a 25 W device designed for 12 V would draw 2.08 A.
The sting in the tail is that if we want to use the 12 V device on a 220 V supply, or the 220 V device on 12 V, we will need some sort of voltage conversion device, which will not be 100% efficient. So, for instance, your 12 V device plus its converter will draw more than 0.11 A from the 220 V supply.
In the other case, the conversion from 12 V to 220 V involves not only a voltage step-up but also a conversion from DC to AC, which is likely to be fairly inefficient if, as is likely, it uses a switch-mode power supply operating at 50 Hz rather than the higher frequencies used in most such supplies, so your total current from the 12 V DC supply will be somewhat more than 2.08 A.
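To put rough numbers on the converter losses (the efficiency figures here are only illustrative assumptions, not measured values for any particular converter):

```python
# Input current of a 25 W load fed through an imperfect converter:
# input power = load power / efficiency, then I = P / V.
load_w = 25.0

# 12 V device run from 220 V through a step-down converter (assumed ~85% efficient).
step_down_eff = 0.85
amps_from_220 = load_w / step_down_eff / 220.0
print(f"From 220 V: {amps_from_220:.3f} A (vs 0.11 A with no converter)")

# 220 V device run from 12 V DC through an inverter (assumed ~80% efficient).
inverter_eff = 0.80
amps_from_12 = load_w / inverter_eff / 12.0
print(f"From 12 V DC: {amps_from_12:.2f} A (vs 2.08 A with no converter)")
```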