The energy cost of powering down vs deep sleep
Peabody:
I'm working on a mailbox notifier using a Wemos (Lolin) D1 Mini powered by an 18650.  I have a two-transistor circuit that completely shuts down the battery supply to the circuit when the mailbox door is closed.  I figured - no matter how little current is used in sleeping, I could prevent that by shutting down the power altogether.

But then I saw a video by Andreas Spiess that basically concluded you should add a 1000µF aluminum electrolytic capacitor to Vcc to provide power for the current spikes that happen during wireless traffic.  That's fine, but it occurs to me that when I power down, whatever charge is stored in that capacitor is going to be lost - wasted - as it dissipates.  So if that capacitor is really needed, then I need to consider the power lost when I shut down versus the power that would be required to just put the Mini into deep sleep with the power still on.

I will be able to measure the sleep current, and it's my understanding that Minis don't sleep very soundly, so I'm looking at something like tens of microamps.  But I don't know how to calculate how many microamp seconds are in that capacitor.

There will be two power-up cycles per day - once when the carrier puts the mail in the box, and once when the box is emptied. I don't know for sure, but I would guess maybe 5 seconds each time to make the WiFi connection and send the notice. The other 23 hours, 59 minutes and 50 seconds of the day would be spent either powered down or sleeping. If the sleep current is, say, 20µA, then that's 86,390 seconds per day at 20µA, or 1,727,800 microamp-seconds, which if my math is right is about 0.48mAh drawn from the battery per day while sleeping.
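That daily sleep budget can be checked with a quick sketch (assuming, as above, a 20µA sleep current and two 5-second wake-ups per day):

```python
# Daily charge drawn from the battery while the D1 Mini sleeps.
# Assumptions from the post: 20 uA sleep current, two ~5 s WiFi sessions/day.
sleep_current_uA = 20.0
awake_s = 2 * 5                       # two wake-ups of ~5 seconds each
sleep_s = 24 * 3600 - awake_s         # 86,390 seconds spent sleeping
charge_uAs = sleep_current_uA * sleep_s   # microamp-seconds per day
charge_mAh = charge_uAs / 1000 / 3600     # convert uA*s -> mA*s -> mAh
print(round(charge_mAh, 2))               # ~0.48 mAh per day
```

This confirms the figure in the post: roughly half a milliamp-hour per day just from sleep current.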

What about the capacitor loss if I power down?  And by the way, I think the calculation needs to be based on charging to 4.2V because of the heat dissipated by the regulator while charging the capacitor to 3.3V.  The loss on discharge is really the energy it took to charge it, which would include that heat.

So hopefully there's a math major out there who can tell me the equivalent mAh wasted by discharging the 1000µF capacitor twice a day.
mariush:
I don't think the capacitor is needed... an 18650 can supply the current the wireless needs.
Considering the application, you might also add a small solar cell on the side of the mailbox to slowly top up the Li battery. Use a 5V LDO to regulate the solar cell voltage and feed that into a cheap charger IC like the MCP73831 (~$0.50), which you can set to charge the battery.
Peabody:
The 18650 might provide enough current for the spikes, but it appears the regulator will not.  As shown in the Spiess video, the regulator just can't respond quickly enough to prevent a significant voltage drop.



Anyway, I'd still like to know how to calculate the mAh needed to charge a capacitor.
ledtester:
Q = C*V, so for C = 1000 µF and V = 3.3V, Q = 3.3 millicoulombs = 3.3 milliamp-seconds = 3.3/3600 milliamp-hours ≈ 1/1000 milliamp-hours.
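Set against the sleep figure from the first post, that charge is tiny. A rough sketch of the comparison (assuming a linear regulator, so the battery delivers the same charge the capacitor stores, and the 4.2V-to-3.3V drop is burned as heat, costing extra energy but no extra charge in mAh terms):

```python
# Charge cost of filling the 1000 uF capacitor, vs. one day of sleep current.
# Assumption: linear regulator, so battery charge == capacitor charge (Q = C*V).
C = 1000e-6                               # farads
V = 3.3                                   # volts on the capacitor
Q_mAs = C * V * 1000                      # 3.3 mA*s per power-up
cap_mAh_per_day = 2 * Q_mAs / 3600        # two power-ups per day
sleep_mAh_per_day = 0.020 * 86390 / 3600  # 20 uA (0.020 mA) for ~24 h
print(round(cap_mAh_per_day, 4))          # ~0.0018 mAh/day
print(round(sleep_mAh_per_day, 2))        # ~0.48 mAh/day
```

So on these assumptions the capacitor discharge loses on the order of 2 µAh per day, a couple of hundred times less than sleeping at 20µA, which suggests the power-down circuit still wins even with the capacitor fitted.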
james_s:
It shouldn't really matter, something like a Moteino can run for multiple years from a pair of AAs if you're careful with the code. Modern microcontrollers consume microamps in sleep mode.