ABC News posted a story re: Jacked Mains Voltage is increasing power bills
BradC:
--- Quote from: optoisolated on November 08, 2018, 04:56:22 am ---Do people still use incandescent bulbs these days?
--- End quote ---
Yep. Every lamp we have that is on a dimmer is incandescent. I also use halogens for work-lights, reading lights and floods.
We do have a lot of LEDs, but there are still plenty of applications where I prefer the halogen for colour rendition, spread and/or penetration and the fact they actually generate heat.
Oddly enough, when our power spikes super high we tend to lose more electronically ballasted lamps than incandescents. I've actually had two GU10 LED lamps spew flames. Thankfully I was in the room at the time and could deal with it before it got out of hand. They weren't eBay cheapies either, at $30 each from a local electronics shop.
So yeah, I have a lot of incandescent.
sibeen:
Yeah, $1,200 more because the voltage is a tad high – ROFL, and I bet he's got something to sell. It's a bullshit article.
A lot of this comes back to Australia deciding to play nice with others and standardise on the IEC voltage of 230/400 volts. States in Australia generally produced at 240/415 volts, and the UK, NZ and others were in the same boat; meanwhile some countries in the IEC world produced at 230/400 and others at 220/380 volts. They all got together and decided to 'normalise' the standard voltage to 230/400.
So our standards in Oz changed. The voltage didn't; it has stayed a nice healthy 240 volts or thereabouts :) From memory the old standard was something like 240 V plus/minus 10%, so the range was 216 to 264 V. The new standard tightened that up to 230 V +10% / -6%, so the range is now 216 V at the low end to 253 V at the high end. The one thing the article does have correct is what a pain in the arse solar production is for the energy providers. The network really wasn't designed for it, and the solar causes voltage rise onto the grid.
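For anyone who wants to check the arithmetic, here's a quick back-of-envelope sketch in Python. The purely resistive load is an assumption (the best case for the article's claim); real appliances vary.

# Supply tolerance ranges and the effect of the extra volts on a resistive load.
# Assumes a purely resistive (ohmic) appliance, i.e. P = V^2 / R.

old_nominal = 240.0                                   # old Australian standard
old_range = (old_nominal * 0.90, old_nominal * 1.10)  # 240 V +/- 10 % -> 216 to 264 V

new_nominal = 230.0                                   # harmonised IEC standard
new_range = (new_nominal * 0.94, new_nominal * 1.10)  # 230 V +10 % / -6 % -> 216.2 to 253 V

print("Old range: %.1f V to %.1f V" % old_range)
print("New range: %.1f V to %.1f V" % new_range)

# For a fixed resistance, power scales with the square of the voltage,
# so a "230 V" appliance running at 240 V draws roughly 9 % more power:
extra = (240.0 / 230.0) ** 2 - 1.0
print("Extra power at 240 V vs 230 V: %.1f %%" % (extra * 100))

That's about 9 % more instantaneous power, and even that only applies to loads that run flat out; thermostatted loads mostly just cycle off sooner, which comes up further down the thread.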
james_s:
I have to wonder how much electricity costs and how much he's using in order to get an extra $1200 a year. $1200 is significantly more than I spend on electricity a year in total!
That wide variation is a bit surprising though. I've monitored one side of my panel from time to time and I don't think I've ever seen it lower than 118 V or higher than 122 V; usually it's pretty much bang on, within 0.5 V of 120 V, so roughly 236-244 V across the whole service.
Berni:
Dang, how much is his power bill then? If we assume the 10% higher voltage means a 10% higher cost, then he must be spending $12,000 per year on power. What the heck is he running? A bitcoin mining farm? :o
At least he is getting proper voltage. The grid here is pretty old because we live in a more rural area, yet power consumption has been going up rapidly, a lot of it due to heat pumps becoming popular. Because of the unstable power I set up a multimeter to monitor one of the phases and log it. What I found was that the voltage dipped as low as 180 V at times and was as high as 240 V at others. After sending the graphs to the electrical company we finally got them to run a new low-voltage cable for this part of the village. Now the voltage tends to be on the high side at 240 V, since we are one of the first houses on the new line, but it's stable with no significant dips.
The reason we pressured them into it is that Dad put up a 10 kW solar installation on the house, and the inverter was really not happy with the voltage dipping to 180 V, locking itself into a safe mode.
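For anyone wanting to do the same sort of logging, a minimal sketch is below. The read_voltage() helper is purely hypothetical; substitute whatever query your particular meter actually supports (serial, SCPI over USB, etc.). Here it just returns a simulated value so the script runs standalone.

# Minimal mains-voltage logger sketch. read_voltage() is a placeholder for
# however your multimeter exposes readings; the simulated value is only so
# the example runs on its own.
import csv
import random
import time

LOW_LIMIT = 207.0    # flag anything below 230 V - 10 %
HIGH_LIMIT = 253.0   # and anything above 230 V + 10 %

def read_voltage():
    # Hypothetical: replace with the actual query for your meter.
    return 235.0 + random.uniform(-5.0, 5.0)

with open("mains_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "volts", "flag"])
    while True:
        v = read_voltage()
        flag = "LOW" if v < LOW_LIMIT else "HIGH" if v > HIGH_LIMIT else ""
        writer.writerow([time.strftime("%Y-%m-%d %H:%M:%S"), "%.1f" % v, flag])
        f.flush()
        time.sleep(60)   # one sample a minute is plenty for spotting sags

A CSV like this is also exactly the kind of evidence that gets a distributor to take a complaint seriously, as Berni found.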
vk6zgo:
--- Quote from: BradC on November 08, 2018, 03:37:28 am ---
--- Quote from: optoisolated on November 08, 2018, 03:22:24 am ---My main query with this article is how much impact does the voltage actually have on the cost of the energy supplied? The article simplifies it too much IMO. Does this come down to one of those Power Factor discussions?
--- End quote ---
I think they are mainly referring to ohmic appliances (stove, oven, hot water heater, incandescent lamp, radiator and so on), whereby an increased potential directly results in increased power use. I have difficulty reconciling the magnitude of the increase in billable power they are referring to, however, as most of those devices have thermostats, so they'd just run at a lower duty cycle.
--- Quote from: optoisolated on November 08, 2018, 03:22:24 am ---My understanding was that even if the voltage increased, resistance on especially passive appliances remains constant, therefore current would drop.
--- End quote ---
V = IR, i.e. I = V/R. If R is constant and you increase V, I has to increase. P = VI, so if you increase V and I increases as a result, then your power must increase, and it's power you are billed for.
--- End quote ---
No, it's energy you're billed for (power over time, i.e. kWh, not kW/h), so if the duty cycle is reduced, less energy is used in that time period.
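To put some rough numbers on the thermostat point, here's a sketch comparing a thermostatted heater with a lamp that runs flat out. Both are treated as purely resistive, and it ignores the fact that a real filament's resistance rises with temperature, so the lamp figure is a worst case.

# Energy comparison for resistive loads at 230 V vs 240 V.
# Assumes purely resistive loads; an incandescent lamp's resistance actually
# rises with filament temperature, so the real increase is somewhat smaller.

R_HEATER = 26.5      # ohms, roughly a 2 kW heater element at 230 V
R_LAMP = 1058.0      # ohms, roughly a 50 W lamp at 230 V

def power(volts, ohms):
    return volts ** 2 / ohms

# Thermostatted heater: it must deliver a fixed amount of heat (say 2 kWh),
# so at a higher voltage it just cycles off sooner; energy is ~unchanged.
heat_needed_kwh = 2.0
for v in (230.0, 240.0):
    p_kw = power(v, R_HEATER) / 1000.0
    hours_on = heat_needed_kwh / p_kw
    print(f"Heater at {v:.0f} V: {p_kw:.2f} kW for {hours_on:.2f} h = {heat_needed_kwh:.2f} kWh")

# Non-thermostatted lamp running a fixed 5 hours: energy scales with V^2.
for v in (230.0, 240.0):
    e_kwh = power(v, R_LAMP) / 1000.0 * 5.0
    print(f"Lamp at {v:.0f} V for 5 h: {e_kwh:.3f} kWh")

So the lamp genuinely uses about 9 % more energy, while the heater uses essentially the same energy and just cycles off sooner; the billable difference is a lot smaller than the article implies.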