Power Factor: What Does It Mean?

That is a very detailed response that I can only presume is accurate.

The device I'm powering is a sonar unit on a boat. It is a 12v device and rated at 9W RMS. I'm presuming that means it draws about .75 amps, is all? What I'm really trying to understand is how long a battery of a certain size will run the sonar unit. I recently tested the LiFePO4 battery I'm intending to use and it's about 13.0 volts and 18 amp-hours. My limited understanding is that would be 234 watt-hours. Divide that by 9W (RMS?) and it should run the sonar unit for about 26 hours?
RMS and power factor apply to AC power; there is no such thing as RMS watts for DC. When converting units, it is a good idea to include all of the units in the calculation and cancel them as you multiply or divide: V/V, A/A, or W/W cancels that unit out. If you don't end up with the units you expect at the end of the process, you likely have an error in your calculation.

Regarding powering from 13V instead of 12V: unless the device you are powering uses a switching power supply, powering it with a higher voltage than required just wastes the excess power (it gets converted into heat in the voltage regulators inside the device). So while the battery may be supplying 13V, the load still draws the same amps it would at 12V. If it is a purely resistive load (like an incandescent lamp), the amp draw actually increases at higher voltage, so the higher-voltage battery won't last longer, but the bulb will be brighter. If the device uses a switching power supply, it will draw fewer amps at higher voltage, but typical switching-supply efficiencies are only about 75% to 80%, so the actual current draw is roughly 1/0.75 ≈ 1.3 times what the straight W = VA relationship would imply. I tend to ignore this unless operating at a much higher voltage (like powering a 12V device from 24V).
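To make the linear-vs-switching distinction above concrete, here is a small sketch (my own illustration, not from the post; the 75% efficiency figure is the low end of the range quoted above):

```python
LOAD_W = 9.0        # rated load power, W
EFFICIENCY = 0.75   # assumed switching-supply efficiency (75%)

def linear_amps(supply_v, rated_v=12.0, rated_w=LOAD_W):
    # A linear regulator burns off the excess voltage as heat, so the
    # load draws the same current it would at its rated voltage,
    # regardless of supply_v.
    return rated_w / rated_v

def switching_amps(supply_v, rated_w=LOAD_W, eff=EFFICIENCY):
    # A switching supply draws roughly constant *power*, so its input
    # current falls as supply voltage rises, divided by efficiency.
    return rated_w / (supply_v * eff)

print(f"Linear at 13 V:    {linear_amps(13.0):.2f} A")
print(f"Switching at 13 V: {switching_amps(13.0):.2f} A")
print(f"Switching at 24 V: {switching_amps(24.0):.2f} A")
```

Note that at only 13 V, the 75% efficiency penalty more than cancels the benefit of the higher voltage; the savings only become meaningful at something like 24 V, which is why it is usually safe to ignore for a 12 V-to-13 V difference.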

9 W × (1 VA / 1 W) = 9 VA

So first you need to know how many amps the device draws. Divide VA by V, leaving A.

9 VA / 12 V = 0.75 A

Next divide the AH rating of the battery by the draw of the load in amps, leaving H.

18 AH / 0.75 A = 24 H

To maintain a long battery life, I will normally plan on using only 80% of the battery's capacity, so I would derate the runtime to 80%. You can use 100% of battery capacity, but the battery will last more than twice as long (in number of charge/discharge cycles) if you derate it. Considering how expensive lithium batteries are, derating to 80% is normally prudent.

24 H × 0.8 = 19.2 H
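The whole chain of arithmetic above can be written out with the units tracked in comments at each step (a sketch using the values from this thread):

```python
volts = 12.0        # V  (rated device voltage)
watts = 9.0         # W = V*A (rated device power)
battery_ah = 18.0   # AH (battery capacity)
derate = 0.8        # use only 80% of capacity

amps = watts / volts            # (V*A) / V -> A
hours = battery_ah / amps       # (A*H) / A -> H
usable_hours = hours * derate   # H

print(f"{amps:.2f} A, {hours:.1f} H full, {usable_hours:.1f} H derated")
# -> 0.75 A, 24.0 H full, 19.2 H derated
```

If the units in the comments didn't cancel down to hours at the end, that would be the sign of a mistake, which is the whole point of carrying them through.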

Additional information: lithium batteries really don't like being kept fully charged or fully discharged. The ideal state of charge (SOC) for storage is about 30%. What you don't want to do is charge a battery up to 100% and then put it on the shelf, or worse, put a float charger on it. Just to clarify terms: a battery's useful life is defined as the number of charge/discharge cycles it takes to reduce the capacity of a single charge to 50% of the rated capacity. Depending on how you treat the battery, this might be 800 cycles or 2,000 cycles. Or, if you really screw up, you may only get one cycle.

Things that will extend a battery's usable life:
* Recharging when the battery SOC drops below 20%
* Disconnecting the charger from the battery when the battery SOC exceeds 95%
* Using the battery at temperatures below 100 degrees F (it is OK to discharge a lithium battery below freezing)
* Storing the battery at temperatures below 100 degrees F; the colder the better

Things that will shorten a battery's usable life:
* Keeping the battery fully charged for prolonged periods of time. Float charging a battery is really not good for it.
* Keeping the battery fully discharged for prolonged periods of time.
* Charging the battery at high temperature (above 100 degrees F)
* Storing the battery at high temperature (above 100 degrees F)

Things that can kill a battery (like right now):
* Discharging the battery below 0% SOC
* Charging the battery above 100% SOC
* Charging the battery below freezing temperature
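If you wanted to build these rules of thumb into a battery monitor, the charging-side checks might sketch out like this (function name and structure are my own framing of the numbers above, not an official spec):

```python
def charge_warnings(soc_pct, temp_f):
    """Return a list of warnings for a proposed charge, per the rules above."""
    warnings = []
    if temp_f <= 32:
        # charging below freezing can kill a lithium battery outright
        warnings.append("do not charge below freezing")
    elif temp_f > 100:
        # charging above 100 F shortens usable life
        warnings.append("charging above 100 F shortens battery life")
    if soc_pct >= 95:
        # disconnect the charger above ~95% SOC; don't float-charge
        warnings.append("disconnect charger above 95% SOC")
    return warnings

print(charge_warnings(soc_pct=50, temp_f=70))   # -> []
print(charge_warnings(soc_pct=96, temp_f=25))
```

Discharging below freezing is deliberately not flagged, since the post notes that discharge below freezing is acceptable; only charging is the problem there.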
The phrase "rms watts" is a misnomer, since watts, when referring to AC, is ALWAYS an rms quantity; anything other than this is simply someone trying to sell something. The problem is that unscrupulous companies have used the term "watts" in wildly incorrect ways when overstating the capabilities of devices they are trying to sell. This misuse of the term is rampant in the marketing of lower-quality audio and power equipment. The reason the term "watts rms" became common was that people who were not lying about power were trying to say, "No really, this is the real watts."

For a sinusoidal AC waveform, the rms V and A are the peak values divided by the square root of 2. That is all rms means.
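You can verify this relationship numerically by computing the rms (root of the mean of the squares, which is literally what the name says) of a sampled sine wave; this is my own sketch, with 170 V chosen as roughly the peak of US 120 V mains:

```python
import math

N = 100_000
peak = 170.0  # assumed peak voltage (about the peak of 120 V rms mains)

# one full cycle of a sine wave, sampled N times
samples = [peak * math.sin(2 * math.pi * k / N) for k in range(N)]

# root-mean-square: square each sample, average, take the square root
rms = math.sqrt(sum(s * s for s in samples) / N)

print(f"rms = {rms:.2f} V, peak / sqrt(2) = {peak / math.sqrt(2):.2f} V")
```

Both numbers agree, confirming that for a pure sine wave, dividing the peak by sqrt(2) and doing the full root-mean-square computation give the same answer.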

[Attached image: electrical power formula]

The subject becomes more complicated as soon as you add reactive components (inductors or capacitors). But that is probably a lot deeper than this discussion warrants. Read here if interested in more.
