Hey all,
I've got an HMI light (the kind that runs off a ballast) and I'm trying to do some basic math on it, but the numbers aren't coming out right. I plugged the unit into a Kill-a-Watt to measure the amperage and wattage. Here's my issue:
The light is rated for 1,200 watts. According to the Kill-a-Watt, I'm actually drawing 1,040 watts at approximately 12.6 amps. But when I multiply my measured amperage by the voltage (12.6 × 120), I get 1,512 watts. Why the discrepancy?

I then wanted to work out the unit's power factor, but the math for that doesn't work with the 1,040 watts measured. Dividing the rated wattage by the actual draw (1200/1040) gives me a power factor of 1.15, which is impossible. If I instead divide by the wattage the multiplication gives me, it works out (1200/1512 = 0.79).

I should also note that if I divide measured wattage by voltage (1040/120), I get around 8.67 amps. So which is it? Am I pulling 8.67 amps or 12.6?
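Just to make it concrete, here's the exact arithmetic I'm doing, written out as a quick Python sketch. The inputs are my actual meter readings and the light's rating; the variable names are just mine:

```
# My Kill-a-Watt readings and the light's rating
volts = 120            # line voltage
amps_measured = 12.6   # current per the meter
watts_measured = 1040  # wattage per the meter
watts_rated = 1200     # the light's rated wattage

# Volts times amps -- this is where the 1,512 figure comes from
va = volts * amps_measured                   # 1512.0

# My two attempts at a power factor
pf_attempt_1 = watts_rated / watts_measured  # 1.15... (impossible, can't exceed 1)
pf_attempt_2 = watts_rated / va              # 0.79...

# Working backwards from the measured wattage
amps_implied = watts_measured / volts        # 8.67... amps
```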
Do these questions make sense? I suppose the TL;DR version is: why does my meter read 1,040 watts at 120 volts and 12.6 amps, when the math says I should be reading about 1,500 watts at 120 volts and 12.6 amps?
Thank you!