Author Topic: Why are modern plugs polarized?  (Read 8224 times)


Offline james_s

  • Super Contributor
  • ***
  • Posts: 21611
  • Country: us
Re: Why are modern plugs polarized?
« Reply #50 on: June 05, 2018, 08:39:45 pm »
Also if the current of a heater goes up, the heat output increases so the thermostat will turn it off sooner. As long as the power doesn't rise to a level that the heater can't tolerate, the consumer is not paying any more. Higher wattage for less time = the same kWh and BTUs.
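A quick back-of-the-envelope illustration of that in Python (the element resistance and heat demand are just made-up example numbers):

Code: [Select]
# Thermostatted resistive heater: the energy delivered is set by the heat
# the room needs, so a higher supply voltage only shortens the on-time.
ENERGY_NEEDED_KWH = 1.0   # heat demand per cycle (example value)
R_ELEMENT_OHMS = 24.0     # heater element resistance (example value)

for volts in (220.0, 230.0, 240.0):
    power_kw = volts ** 2 / R_ELEMENT_OHMS / 1000.0   # P = V^2 / R
    on_time_h = ENERGY_NEEDED_KWH / power_kw          # thermostat cuts out sooner
    print(f"{volts:.0f} V: {power_kw:.2f} kW for {on_time_h:.2f} h"
          f" = {power_kw * on_time_h:.2f} kWh")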

The mains voltage has never been considered a precision value; equipment is all made to tolerate a reasonable range.
 

Offline tooki

  • Super Contributor
  • ***
  • Posts: 11457
  • Country: ch
Re: Why are modern plugs polarized?
« Reply #51 on: June 05, 2018, 09:16:22 pm »
Quote
Not an error made in this thread, but while I’m ranting: in USA it’s nominal 120V. Not 110, not 115, not 117, not 125. No, it wasn’t just the tolerances that were changed from the old 110V, the actual voltage at the outlets was raised, too, many decades ago.


Quote
It has been raised in steps, although I don't recall precisely when. Originally we had 110/220, later 115/230, then up into the 1970s as I recall it was 117/234, and since then 120/240. Many people, especially older folks, still refer to mains colloquially as "one-ten".
IIRC from reading about it, it went from 110 to 115 to 120. All the other values were never nominal voltages for the grid, but rather what was seen on device nameplates, which was sometimes the nominal value, other times a maximum or an average. (For example, power cords are often rated at 125V (a maximum), and appliances at 117V (an average, so that it appears close to both 115 and 120V).)
 

Offline apis

  • Super Contributor
  • ***
  • Posts: 1667
  • Country: se
  • Hobbyist
Re: Why are modern plugs polarized?
« Reply #52 on: June 05, 2018, 10:12:39 pm »
Quote from: james_s on June 05, 2018, 08:39:45 pm
Also if the current of a heater goes up, the heat output increases so the thermostat will turn it off sooner. As long as the power doesn't rise to a level that the heater can't tolerate, the consumer is not paying any more. Higher wattage for less time = the same kWh and BTUs.
I agree in the case of a regulated heater, but in the case of incandescent bulbs and capacitive droppers it would be detrimental. Incandescent bulbs not only draw more current, they also fail faster. It's probably a safe bet that if you increase the voltage the current will increase; by how much is hard to guess.
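For a rough feel of "by how much": the rule-of-thumb exponents usually quoted for tungsten filament lamps (approximations only, and they vary between lamp types) in a quick Python sketch:

Code: [Select]
# Commonly quoted tungsten-lamp approximations, relative to rated voltage:
#   current ~ V^0.55, power ~ V^1.6, light output ~ V^3.4, life ~ V^-13
# These are rough engineering rules of thumb, not exact physics.
V_RATED = 230.0

for v in (230.0, 240.0, 250.0):
    ratio = v / V_RATED
    print(f"{v:.0f} V: current x{ratio ** 0.55:.2f}, power x{ratio ** 1.6:.2f},"
          f" light x{ratio ** 3.4:.2f}, life x{ratio ** -13:.2f}")

Running at 250V on a 230V-rated lamp, those exponents suggest roughly a third of the rated filament life.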
 

Offline Zero999

  • Super Contributor
  • ***
  • Posts: 19479
  • Country: gb
  • 0999
Re: Why are modern plugs polarized?
« Reply #53 on: June 06, 2018, 01:08:38 pm »
Quote from: james_s on June 05, 2018, 08:39:45 pm
Also if the current of a heater goes up, the heat output increases so the thermostat will turn it off sooner. As long as the power doesn't rise to a level that the heater can't tolerate, the consumer is not paying any more. Higher wattage for less time = the same kWh and BTUs.
Quote from: apis on June 05, 2018, 10:12:39 pm
I agree in the case of a regulated heater, but in the case of incandescent bulbs and capacitive droppers it would be detrimental. Incandescent bulbs not only draw more current, they also fail faster. It's probably a safe bet that if you increase the voltage the current will increase; by how much is hard to guess.
I doubt incandescent lamps are a large proportion of the load nowadays, and they will rapidly be replaced with alternatives, or with lamps rated for the new mains supply voltage if that's not practical.

Decent quality LED lamps use a proper switched mode power supply, so increasing the voltage will reduce the current.

There's a device available in the UK which is supposed to save energy by regulating the mains voltage down to 220V using a buck transformer. In reality it makes little or no difference, for the reasons mentioned above.
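A minimal sketch of the SMPS point, assuming the lamp's supply regulates to constant output power (efficiency and power factor ignored, lamp wattage is an example value):

Code: [Select]
# A regulated switched-mode supply draws roughly constant power,
# so the input current falls as the mains voltage rises: I = P / V.
LAMP_POWER_W = 10.0   # example LED lamp

for volts in (207.0, 230.0, 253.0):   # 230 V -10%, nominal, +10%
    milliamps = LAMP_POWER_W / volts * 1000.0
    print(f"{volts:.0f} V -> about {milliamps:.1f} mA")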
« Last Edit: June 06, 2018, 01:38:16 pm by Hero999 »
 

Offline BeaminTopic starter

  • Super Contributor
  • ***
  • Posts: 1567
  • Country: us
  • If you think my Boobs are big you should see my ba
Re: Why are modern plugs polarized?
« Reply #54 on: June 06, 2018, 03:50:42 pm »
Quote
They do learn a lot from each other, but there are differences which are impossible and others which are very expensive to remove.

In mountainous Italy, Norway etc. getting a good "earth" is not possible; you may literally have to drill through hundreds of metres of rock, and that is still no guarantee that you reach the same potential as the upstream transformer station. (Failure to do so means lightning strikes become *really* interesting.)

And needless to say, nobody wants to change all the installations in the country in order to match some fancy-pantsy stuff from BXL, and many countries already had many generations of local regulations live on the grid anyway.

The UK system with fused plugs has its merits, but originally it stems from there being only a single fuse for the entire household and a lot of houses catching fire because of it.

With regard to the "europlug" or "schuko" plug, yes, Germany uses the side contacts for PE and France uses the hole.

The French system has a (mostly theoretical) advantage in that they can "force" PE connection in baths, kitchens etc., because a plug without the hole cannot be inserted in the outlet.

Things get even weirder when it comes to residual current protection, where you can either rely on PE (but not in Norway, Italy etc.) or RC breakers (everywhere), or, if your regulators waffled a lot: both, as here in Denmark.

The 220/230/240 VAC thing:  It used to be that everybody had +/-10% whatever their nominal voltage was.

The harmonized spec became 230VAC +6/-10% because that way it would not exceed the 220+10% limit.

It also meant that only the parts of the 220 grids already running low would need immediate adjustment.

It is not -6% because UK's grid was running as low as possible because of, well, basically "greed".

Adjusting the voltage is a trivial matter of turning the winding-selector on the 10kV/400V transformer.

The original goal was to later "upgrade" to 230 +/-10%, but I think that has been abandoned forever.
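For reference, the bands mentioned above work out like this (a quick Python check):

Code: [Select]
# Voltage bands discussed above, in volts
bands = {
    "220V +/-10% (old nominal)":       (220 * 0.90, 220 * 1.10),
    "230V +6/-10% (harmonized spec)":  (230 * 0.90, 230 * 1.06),
    "230V +/-10% (the original goal)": (230 * 0.90, 230 * 1.10),
}
for label, (low, high) in bands.items():
    print(f"{label}: {low:.1f} V to {high:.1f} V")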

So how would you get a ground when you are on a granite mountain? I always thought it was drill a hole and fill it with molten copper/metal, kind of like soldering to the rock. Is it the water and dissolved salts in the soil that make it a ground? I was always impressed that you could make one just by pounding a copper pipe 2' into the ground and that it actually worked. You don't think of dirt as part of your circuit. I know the worst radio propagation is over sandy desert and the best is over the ocean. VLF transmitters are built over poorly conductive soil to create a long earth path for the signal. How would you make a ground rod in sand dunes where there is no moisture? I imagine this must have been a nightmare for the radio people over in Iraq.
 

Offline Zero999

  • Super Contributor
  • ***
  • Posts: 19479
  • Country: gb
  • 0999
Re: Why are modern plugs polarized?
« Reply #55 on: June 07, 2018, 08:05:44 am »
Digging down deep enough in the earth always leads to water, which has some dissolved salts, making it conductive.

The earth doesn't normally form part of the circuit. One side of the mains is always connected to ground to stop it from floating at high voltages and to provide a return path for lightning strikes on the cable.
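To put rough numbers on how much the soil itself matters for a driven rod, here's a Python sketch using the usual single-rod approximation (Dwight's formula); the rod size and soil resistivities are just typical illustrative values:

Code: [Select]
import math

def rod_resistance(rho_ohm_m, length_m, diameter_m):
    """Approximate earth resistance of one driven rod (Dwight's formula)."""
    return rho_ohm_m / (2 * math.pi * length_m) * (math.log(8 * length_m / diameter_m) - 1)

ROD_LENGTH_M = 2.4      # a typical 8 ft ground rod
ROD_DIAMETER_M = 0.016  # roughly 5/8 inch

# Very rough typical soil resistivities in ohm-metres
for soil, rho in (("wet loam", 30), ("moist clay", 100), ("dry sand", 1000), ("solid rock", 10000)):
    print(f"{soil:>10}: ~{rod_resistance(rho, ROD_LENGTH_M, ROD_DIAMETER_M):.0f} ohms")

The spread from tens of ohms in damp soil to thousands of ohms in dry sand or rock is why a short copper pipe works fine in a garden but not on a granite mountain.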
 

Offline apis

  • Super Contributor
  • ***
  • Posts: 1667
  • Country: se
  • Hobbyist
Re: Why are modern plugs polarized?
« Reply #56 on: June 07, 2018, 10:41:56 am »
Quote from: james_s on June 05, 2018, 08:39:45 pm
Also if the current of a heater goes up, the heat output increases so the thermostat will turn it off sooner. As long as the power doesn't rise to a level that the heater can't tolerate, the consumer is not paying any more. Higher wattage for less time = the same kWh and BTUs.
Quote from: apis on June 05, 2018, 10:12:39 pm
I agree in the case of a regulated heater, but in the case of incandescent bulbs and capacitive droppers it would be detrimental. Incandescent bulbs not only draw more current, they also fail faster. It's probably a safe bet that if you increase the voltage the current will increase; by how much is hard to guess.
Quote from: Zero999 on June 06, 2018, 01:08:38 pm
I doubt incandescent lamps are a large proportion of the load nowadays, and they will rapidly be replaced with alternatives, or with lamps rated for the new mains supply voltage if that's not practical.

Decent quality LED lamps use a proper switched mode power supply, so increasing the voltage will reduce the current.

There's a device available in the UK which is supposed to save energy by regulating the mains voltage down to 220V using a buck transformer. In reality it makes little or no difference, for the reasons mentioned above.
If you go from 240V to 220V you'd save about 16% on resistive loads (power scales with the square of the voltage), so the losses in the buck transformer would eat up what you gain by lowering the voltage, especially since only some of your appliances would be resistive.
It might still be a noticeable effect for the power companies if they change the voltage.
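The 16% figure is just the square law, e.g.:

Code: [Select]
# Power in a fixed resistance scales as V^2 / R
saving = 1 - (220 / 240) ** 2
print(f"240 V -> 220 V cuts power in a purely resistive load by {saving:.1%}")  # ~16%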
« Last Edit: June 07, 2018, 11:16:15 am by apis »
 

