Power Factor

mutantmagnet
I'm learning more and more about UPSes each month, but I need a little help, and possibly some reassurance, on what I'm understanding.

While reading this I came across some info on power factors in part 2.

The National Electrical Code limits the continuous current drawn through the equipment line cord to 80% of the rating of the receptacle. For the standard 15 A receptacle (NEMA 5-15R), the limit is 0.8 × 15 = 12 A.

So if I'm reading this right, a 2000 VA UPS wouldn't work in a home at its full capability, because there's a cap of 1440 VA per receptacle?
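
Here's the arithmetic behind that 1440 figure as I understand it, assuming a nominal 120 V line:

```python
# Continuous-load cap for a standard 15 A receptacle (NEMA 5-15R),
# assuming nominal 120 V and the NEC 80% continuous-duty rule
breaker_rating_a = 15
nec_derating = 0.8
line_voltage_v = 120

continuous_limit_a = nec_derating * breaker_rating_a      # 12.0 A
continuous_cap_va = continuous_limit_a * line_voltage_v   # 1440.0 VA
print(f"{continuous_limit_a:.0f} A continuous -> {continuous_cap_va:.0f} VA cap")
```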

Does this cap apply only to that receptacle, or does it affect the adjacent outlet as well?

I read at another location that electric companies measure usage differently, to the point that if you were to use 900 watts on a 1500 VA UPS, you would only pay for that 900 and not the leftover 600, because it's not drawn by the user but shouldered by the power company. (Excuse me if I'm saying this wrong.)
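
If I've got that right, the numbers would work out something like this (residential meters bill real energy in kWh; the rate below is made up for illustration):

```python
# The example above: a 900 W load on a 1500 VA-rated UPS
real_power_w = 900
apparent_power_va = 1500
power_factor = real_power_w / apparent_power_va   # 0.6

rate_usd_per_kwh = 0.12   # made-up illustrative rate
hours = 24
energy_kwh = real_power_w / 1000 * hours
print(f"PF = {power_factor:.2f}")
print(f"Billed for {energy_kwh:.1f} kWh -> ${energy_kwh * rate_usd_per_kwh:.2f}")
```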

Even if that is true, doesn't energy get wasted by increasing the load requirements at the utility company?

If it does create waste, why do the overwhelming majority of 120 V-input UPSes have such a terrible power factor?
 
Hi,

While I'm not an expert by any means, I think the reason is that building a UPS with such a high power factor may be more expensive. And yes, I also believe it causes more strain on the grid. UPSes work a bit differently, though. If I understand correctly, a UPS will charge its batteries as fast as it's designed to, drawing as much VA as it can (VA is the apparent power the power company sees, watts is what your UPS actually uses, so VA × power factor = watts). After the power goes out, the UPS will provide your computer with as much VA as it demands.
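
In code form, the relationship I mean (the 0.6 power factor is just an example number):

```python
def watts_from_va(va: float, power_factor: float) -> float:
    """Real power consumed, given apparent power and power factor."""
    return va * power_factor

def va_from_watts(watts: float, power_factor: float) -> float:
    """Apparent power the utility sees, given real power and power factor."""
    return watts / power_factor

print(watts_from_va(900, 0.6))   # 540.0 W actually consumed
print(va_from_watts(540, 0.6))   # 900.0 VA drawn from the wall
```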

Please someone correct me if I'm wrong.
 
A 15 A circuit breaker will trip at any current higher than 15 A. That corresponds to an apparent power of 15 A × 120 V = 1800 VA. That is the maximum apparent power a load can draw from that wire if it is the only active load on that wire.
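
As a quick sketch, the trip threshold versus the NEC continuous cap (nominal 120 V assumed; real breakers follow trip curves, so the hard 15 A line is a simplification):

```python
# Breaker trip threshold vs. NEC continuous-load cap on the same circuit
line_voltage_v = 120
breaker_a = 15

trip_threshold_va = breaker_a * line_voltage_v   # 1800 VA, instantaneous limit
continuous_cap_va = 0.8 * trip_threshold_va      # 1440.0 VA, NEC continuous cap
print(trip_threshold_va, continuous_cap_va)
```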



You are subtracting the real power from the apparent power and considering that to be the waste. It is not!
The waste the electric company has to deal with (and the consumer does not pay for) is the loss due to the resistance of the wires that bring power to your house. That is why the electric company likes to minimize the current through those wires: bringing the power factor of consumers as close as possible to 100% minimizes that current.
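
A sketch of why that current matters (the feeder resistance is an assumed, purely illustrative figure):

```python
# Line loss is I^2 * R, and for the same real power delivered,
# the current scales with 1 / power factor
line_voltage_v = 120
feeder_resistance_ohm = 0.2   # assumed resistance of the feeder wires
real_power_w = 900            # same real power delivered in both cases

for pf in (1.0, 0.6):
    current_a = real_power_w / (line_voltage_v * pf)   # I = P / (V * PF)
    loss_w = current_a ** 2 * feeder_resistance_ohm
    print(f"PF {pf:.1f}: {current_a:5.2f} A -> {loss_w:5.2f} W lost in the wires")
```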


If a UPS advertises a 450 W / 750 VA output power capacity, do NOT assume those to be its input figures.
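
For instance, a rough input-side estimate at full load could look like this (the efficiency and input power factor are assumed values, not taken from any datasheet):

```python
# Rough input estimate for a UPS advertising 450 W / 750 VA output
output_load_w = 450
efficiency = 0.90   # assumed overall efficiency at full load
input_pf = 0.65     # assumed input power factor

input_real_w = output_load_w / efficiency   # ~500 W drawn from the wall
input_va = input_real_w / input_pf          # ~769 VA seen by the utility
print(f"~{input_real_w:.0f} W real, ~{input_va:.0f} VA apparent at the input")
```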
 
A 15 A circuit breaker will trip at any current higher than 15 A. That corresponds to an apparent power of 15 A × 120 V = 1800 VA. That is the maximum apparent power a load can draw from that wire if it is the only active load on that wire.
Thank you; this additional insight helped clarify a thought I had on this.


You are subtracting the real power from the apparent power and considering that to be the waste. It is not!
The waste the electric company has to deal with (and the consumer does not pay for) is the loss due to the resistance of the wires that bring power to your house. That is why the electric company likes to minimize the current through those wires: bringing the power factor of consumers as close as possible to 100% minimizes that current.


If a UPS advertises a 450 W / 750 VA output power capacity, do NOT assume those to be its input figures.

OK, this seems to answer why we as homeowners shouldn't have to pay for the resistive losses involved in delivering energy to our homes.

Initially my concern was that, with our UPSes having such a low power factor, if every home used a UPS we would create greater energy demands than if we had never used them in the first place. Reading around some more, I'm getting the idea that the energy inefficiencies aren't related to their power factor but to other aspects of using a UPS. (If you are interested in some more insight on this, this is a good PowerPoint example to start with.)

Maybe I'm wrong, because nothing I've read sufficiently explains why power factor shouldn't be an environmental concern.
 