How does battery voltage translate to watts stored? Does it?

s3w47m88
s3w47m88 Registered Users Posts: 7 ✭✭
edited February 2020 in Solar Beginners Corner #1
The Problem
I'm unclear about my readings. Specifically how much energy is stored in my batteries.

The Context
I'm new to solar and trying to power an office space off grid, but I successfully set up my Renogy kit and it is operating.

The Setup
- 2 x Renogy 100 Watt Panels - will add more.
- 2 x Trojan T-105 6V batteries in series. 225 Ah @ 20 hr (185 Ah @ 5 hr)
- Renogy Rover Elite 20A MPPT Solar Charge Controller
- Renogy 3000 Watt Inverter.
- Currently, I only have some LED lights plugged in.

The Confusion
- I want a real-time reading of how much energy, in watts, is stored in my batteries.
- I want to measure how much is being drawn in real time.
- How much is being produced each hour.
- How much is being produced each day.

I bought the Renogy solar controller to get those readings, but I'm unclear about a few fundamentals.

My understanding is that a fully charged battery reads 12.8 volts and an empty battery reads 11.2 volts; going below that damages the battery. I can get this reading from the controller, but what I don't understand is the equation and the factors that translate it to kWh. The controller does tell me kWh, but I don't really know what that is relative to.

For example, my reading is 0.8 kWh. Does that mean 800 watts is stored? Or 800 W x 24 hours?

Thank you!

Comments

  • Marc Kurth
    Marc Kurth Solar Expert Posts: 1,156 ✭✭✭✭
    edited February 2020 #2
    Battery voltage does not directly indicate watt-hours stored.
    "Standing" voltage measured at least 4 hours after charging or discharging can give you an idea of the battery charge level. BUT that is relative to its current condition. An old battery may show a "full voltage" but may have only 10% of its specified capacity. A new battery will show its "full voltage" and have 100% of its rated storage capacity.
    A sulfated battery will most often read a "lower than full" voltage because it has already absorbed all of the energy that it can. Charging it longer will not increase its storage capacity, because it is as full as it is going to get.
    The exact voltages for 100%, 50% and 0% are very specific to battery design and specific brand.
    When operating under load, battery voltage will sag: the bigger the load, the more the voltage sags. When you turn off the load, the battery bank will rebound. Reputable manufacturers provide curves or tables to show you this relationship of load vs. voltage vs. % capacity for a NEW battery.
    If you think about it, this can help serve as a "sanity check" of battery bank health - better than the standing voltage.
    Marc
    I always have more questions than answers. That's the nature of life.
  • s3w47m88
    s3w47m88 Registered Users Posts: 7 ✭✭
    edited February 2020 #3
    Hi Marc,

    Thanks for taking the time to reply. I think that helped me a little.

    I did understand that volts don't directly translate to watts, so I think I'm still struggling with the relationship.

    I understand V x A = W, that a kW is 1,000 W, and that 1 kWh is 1,000 watts for 1 hour.

    But my MPPT says 0.8 kWh when I cycle to the battery kWh reading, and I don't fully grasp what that is telling me.

    Here's a little more detail that might show where my brain is stalling:

    If I had a hypothetical light bulb that was 15w. And it's plugged into an outlet that draws 7.5 volts at 2 amps. That would make 15 watts per hour, correct?

    So then a 12.8v standing charge on a battery, which is rated at 100% when at 12.8v, would be losing 15 volts every hour?

    But obviously that math is wrong because I would destroy my battery in less than 1 hour.
  • Marc Kurth
    Marc Kurth Solar Expert Posts: 1,156 ✭✭✭✭
    edited February 2020 #4
    s3w47m88 said:

    If I had a hypothetical light bulb that was 15w. And it's plugged into an outlet that draws 7.5 volts at 2 amps. That would make 15 watts per hour, correct?

    So then a 12.8v standing charge on a battery, which is rated at 100% when at 12.8v, would be losing 15 volts every hour?

    Read this part of your post again. You said 15 watts. (Watts is already a rate; run the bulb for one hour and you have used 15 watt-hours.)
    In the next sentence, you jumped to 15 volts per hour instead of watts.

    You have a 12v battery bank with a storage capacity of 225 amp-hours.
    So, 12 volts x 225 amp-hours = 2,700 watt-hours of storage.

    How long will your light bulb run until your battery bank reaches zero?
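    A minimal Python sketch of that arithmetic, assuming a constant 12 volts and ignoring inverter losses and discharge limits:

```python
# Marc's numbers: a 12 V bank rated 225 Ah stores roughly V x Ah watt-hours.
battery_voltage = 12        # volts, nominal
battery_capacity_ah = 225   # amp-hours at the 20-hour rate
load_watts = 15             # the hypothetical light bulb

stored_wh = battery_voltage * battery_capacity_ah   # 2,700 Wh
hours_to_empty = stored_wh / load_watts             # theoretical runtime
print(stored_wh, "Wh;", hours_to_empty, "hours")    # 2700 Wh; 180.0 hours
```

    In practice you would stop at roughly 50% depth of discharge, so the usable runtime is closer to half that.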
    I always have more questions than answers. That's the nature of life.
  • BB.
    BB. Super Moderators, Administrators Posts: 33,599 admin
    Adding to what Marc says... For a first estimate, Lead Acid Batteries are fairly close to 100% "amp*hour" efficient... If you have a 100 AH battery and pull 50 Amp*hours out of it (assuming good battery, normal room temperature, etc.), you will have used ~50% of the Amp*Hour capacity.

    However, Power (Watts) and Energy (Watt*Hours) both depend on voltage, current, and time...
    • Amp*Hours = Amps * Hours (time)
    • Watts = Amps * Volts [power]
    • Watt*Hours = Amps * Volts * Hours (time) = Watts * Hours
    So, if your battery starts out at 12.8 volts full and goes down to 10.5 volts dead (not a good thing to do), then voltage * current, measured every few minutes and summed over time, gives you Watt*Hours.

    Just for an easy example, say your load is 1/20th (5%) of the battery's Amp*Hour capacity (a standard test condition--and one that works well for off grid calculations). Say the starting voltage is 12.5 volts under load, and the ending voltage is 11.5 volts at 50% of battery capacity (actual numbers will be different, but close enough for our estimates here):
    • 100 AH @ 12 volt battery
    • 100 AH * 5% rate of discharge (the 20-hour rate) = 5 amp draw.
    • Say the battery discharges evenly from 12.5 to 11.5 volts... the average would be 12 volts, and this goes on for 10 hours (half of the 20-hour discharge time).
    • 12 volts * 50 AH = 600 Watt*Hours of energy used
    Now if you were to draw at 1/10 capacity, the "apparent capacity" of the battery may be ~80 AH... So a 50% discharge would be something like:
    • 80 AH capacity * 10% discharge rate = 8 amps
    • 5 hour * 8 amp discharge  * 12.25 battery average discharge voltage = 490 Watt*Hours to 50% capacity (just a rough estimate for example).
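    Those two examples can be sketched in Python (illustrative numbers only; real batteries follow the manufacturer's discharge curves):

```python
# Usable watt-hours down to 50% state of charge, at a roughly constant current.
def wh_to_half_capacity(capacity_ah, avg_voltage, hours_to_half):
    amps = (capacity_ah / 2) / hours_to_half    # current at this discharge rate
    return amps * hours_to_half * avg_voltage   # energy delivered in that time

# 20-hour rate: 100 Ah, ~12.0 V average, 10 hours to reach 50%
print(wh_to_half_capacity(100, 12.0, 10))   # 600.0
# 10-hour rate: apparent capacity ~80 Ah, ~12.25 V average, 5 hours to 50%
print(wh_to_half_capacity(80, 12.25, 5))    # 490.0
```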
    There are battery monitor systems designed to give you a better estimate of state of charge / available AH/WH from the battery bank (not cheap, and not perfect):

    https://www.solar-electric.com/bogart-engineering-tm-2030-a-battery-monitor.html (less expensive, maybe less user friendly setup)
    https://www.solar-electric.com/victron-energy-bmv-712-smart-battery-monitor.html (very nice monitor, with Bluetooth/alarms/etc.)

    And there are some very interesting and cheap monitors these days (not sure I would call them battery monitors, but they do measure your loads). True battery monitor systems take time, temperature, battery capacity, battery efficiency, etc. into account... These AH/WH meters generally do not:

    https://www.amazon.com/s?k=dc+ah+wh+meters&ref=nb_sb_noss

    Seeing your second post... "k" is just a 1,000x factor... 1 kAmp = 1,000 Amps, 1 kAH = 1,000 AH, 1 kW = 1,000 Watts, etc.

    Our on grid homes use so much energy, the utility charges in kWH... An example for a medium size off grid power system:
    • 3,300 WH per day = 3.3 kWH per day
    • 3,300 WH per day * 30 Days = 99,000 WH per month
    • 3.3 kWH per day * 30 Days = 99 kWH per month 
    Amps and Watts are both "rates"... Amps is like gallons per second (or hour): a flow rate. Watts is like gallons per second (or hour) pumped at 100 psi: power.

    So, A 12 Watt lamp running for 2 hours: 
    • P=V*I (power or "rate")
    • I=P/V = 12 Watts / 12 volts = 1 Amp
    • Amp*Hour = 1 Amp * 2 hours = 2 Amp*Hours (assuming 12 volts)--A "rate", but does not include voltage, so is not "power".
    • Power = Volts * Amps = 12 volts * 1 Amp = 12 Watts (power, or "rate" of energy usage)
    • Energy = Volts * Amps * Time = Watts * Time = 12 volts * 1 Amp * 2 hours = 24 Watt*hour of consumption.
    • 1 Amp * 2 hours = 2 Amp*Hours from the 12 volt battery
    • 2 Amp*Hours / 100 AH capacity (at 20 hour rate) = 0.02 = 2% of battery AH capacity used (or 98% left)
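    The lamp example above, worked in Python (12 volt bus assumed, losses ignored):

```python
volts, watts, hours = 12, 12, 2
amps = watts / volts              # 1.0 A
amp_hours = amps * hours          # 2.0 Ah
watt_hours = watts * hours        # 24 Wh of energy consumed
fraction_used = amp_hours / 100   # share of a 100 Ah (20-hour rate) battery
print(amps, amp_hours, watt_hours, fraction_used)   # 1.0 2.0 24 0.02
```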
    For solar power systems, because we are usually using a mix of voltages (like a 12 volt battery bus and 120 VAC power from the AC inverter)--we tend to use Watts/Watt*Hours for our power/energy measurements to avoid confusion... Watts/WH are a "complete" unit; Amps/AH are a partial unit (no volts).

    For things like a 12 volt boat where all the loads (and generation) are at 12 volts... Amps and Amp*Hours work great because everything is assumed to be at 12 VDC... 100 AH battery, draw 2 amps * 4 hours = 8 AH gives:
    • 100 AH - 8 AH load = 92 AH left in the battery (suggest not drawing down below ~50 AH, or 1/2 of capacity, for longer battery life, and recharging as soon as practicable).
    For us, say you have a 12 Watt lamp... Is that 12 Watts at 12 volts or is that 12 Watts at 120 Volts (AC):
    • P=V*I
    • I=P/V= 12 Watts / 12 volts (DC battery bus) = 1 amp @ 12 VDC
    • I=P/V= 12 Watts / 120 volts (AC inverter output) = 0.1 Amps @ 120 VAC
    You can see, that depending on how you power that 12 Watt lamp (12 VDC or 120 VAC), the amperage draw is a factor of 10x different.

    But when we simply use 12 Watts, that is an easy number and we don't have to keep track of 12 VDC bus or 120 VAC house wiring... The "power/energy usage" is the same either way.

    Note that you can use 1,000 WH in 1 hour, 10 hours, or 10 minutes; the average load (in Watts) will be different:
    • 1,000 WH / 1 hour = 1,000 Watt average load (for 1 hour)
    • 1,000 WH / 10 hour = 100 Watt average load (for 10 hours)
    • 1,000 WH / 0.1 hours (6 minutes out of 60 minute hour) = 10,000 Watt average load (for 6 minutes)
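    The same point in a short Python sketch: the energy is fixed, and the average load falls out of the run time:

```python
energy_wh = 1000   # fixed amount of energy
avg_load_watts = {hours: energy_wh / hours for hours in (1, 10, 0.1)}
print(avg_load_watts)   # {1: 1000.0, 10: 100.0, 0.1: 10000.0}
```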
    Note that for loads at "non-standard voltages", such as 12 Watts at 7.5 volts, the draw depends on how you make that 7.5 volts... Using a resistor:
    • 12 watts / 7.5 volts = 1.6 Amp draw (at 12 volt)
    • 1.6 amps * 12 volts = 19.2 Watt draw (extra power is lost as heat in resistor)
    • 12 Watts / 19.2 Watts = 0.625 = 62.5% efficient conversion
    But if you use a switch mode ballast (very common these days), then the usage would be:
    • 12 Watts * 1/0.95 switch mode ballast efficiency = 12.6 Watts used (the ballast loses about 5% of the energy as heat).
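    A Python sketch comparing the two conversion methods (the 95% ballast efficiency is Bill's estimate):

```python
load_watts, load_volts, bus_volts = 12, 7.5, 12

# Resistor dropper: the full load current also flows through the resistor.
amps = load_watts / load_volts             # 1.6 A
resistor_draw = amps * bus_volts           # 19.2 W pulled from the battery
resistor_eff = load_watts / resistor_draw  # 0.625

# Switch-mode ballast at ~95% efficiency:
switcher_draw = load_watts / 0.95          # ~12.6 W pulled from the battery
print(round(resistor_draw, 1), round(resistor_eff, 3), round(switcher_draw, 1))
```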
    You are asking lots of good questions, but it can be confusing to try and answer "everything" in one post, and not make things more confusing.

    I suggest that you give us a "problem" that you want to solve, and we go through the design step by step (and probably several posts).

    Also, many times we give answers that are "close enough" for discussion... In solar power, an answer within 10% of the "right answer" is usually close enough for our needs.

    -Bill
    Near San Francisco California: 3.5kWatt Grid Tied Solar power system+small backup genset
  • s3w47m88
    s3w47m88 Registered Users Posts: 7 ✭✭
    edited February 2020 #6
    Ultimately, I'm just trying to comprehend the equations necessary to calculate my build: everything from what's generated, to what's stored, to what's drawn. I've answered many of those questions, but the holes in my knowledge are, obviously, fundamental. So I am neither certain my understanding of the equations is correct, nor that my totals from those equations are correct.

    For simplicity, let's assume I only wanted to power the following, and these ratings were at a constant:

    - 138w (120vac @ 1.15a) Sony Bravia XBR X750D TV
    - 230w (19.5vac @ 11.8a) 2019 Razer Blade RZ09-0301 Laptop

    Question #1: Does this mean I'm drawing 368 watts per hour, 12.95 amps per hour, and 139.5 volts AC per hour?

    If yes

    Question #2: If I have two 6 vdc batteries rated at 225Ah @ 20 hours, connected in series to create 120vac (approximately), then how does that provide 139.5 volts per hour?

    Question #3: Do I subtract 12.95 from 225? And continue this until Ah is reduced to 0. Then what do I do? Switch to 12v batteries and put them in parallel so my Ah can add up infinitely with each new battery?
  • s3w47m88
    s3w47m88 Registered Users Posts: 7 ✭✭
    edited February 2020 #7
    I understand you probably copied and pasted what you already wrote to highlight your response as an answer. But I don't see how this answers any of my questions from my previous post. Namely, Q1. Forgive me if my ignorance is blinding me here.
  • BB.
    BB. Super Moderators, Administrators Posts: 33,599 admin
    OK, back from walk...
    For simplicity, let's assume I only wanted to power the following, and these ratings were at a constant:

    - 138w (120vac @ 1.15a) Sony Bravia XBR X750D TV
    - 230w (19.5vac @ 11.8a) 2019 Razer Blade RZ09-0301 Laptop

    Question #1: Does this mean I'm drawing 368 watts per hour, 12.95 amps per hour, and 139.5 volts AC per hour?
    Watts is a rate... Like Miles per Hour.... Watts per hour would be Miles per hour^2 (technically, acceleration).

    It would be the same as if we named Miles per Hour the "Ford" unit... i.e., 48 mph would be 48 "Fords". The name is just a nod to honor Scottish inventor James Watt.

    https://en.wikipedia.org/wiki/Watt
    The watt (symbol: W) is a unit of power. In the International System of Units (SI) it is defined as a derived unit of 1 joule per second, and is used to quantify the rate of energy transfer. In SI base units, the watt is described as kg·m²·s⁻³, which can be demonstrated to be coherent by dimensional analysis.
    So Watt is a unit of work (Joule) per second...

    And even though the SI unit is Joules per Second... we (general electrical power systems) use "per hour" because "per second" is a very small amount of energy. For example, in Watt*Seconds (the SI units):
    • 3,300 WH per day * 24 hours per day * 60 minutes per hour * 60 seconds per minute = 285,120,000 Watt Seconds per day (running a very energy efficient off grid home for 1 day)
    Years ago, when I first posted here, I fell into the "Watts per hour" units (that I would have to edit later to Watts)... And I am (more or less) an Electrical Engineer (systems engineer).

    Also, there is no reason to use Volts per Hour--Does not mean anything useful in off grid power.

    So, both running, the math I would use:
    • 138 Watts
    • 230 Watts * 1/0.90 (to convert 19.5 volts to either 12 VDC or 120 VAC) = 256 Watts
    • Total = 394 Watts total
    • 394 Watts * 1/120 VAC (assuming you are using 120 VAC @ 60 Hz) = 3.28 Amps @ 120 VAC
    • 394 Watts * 1/0.85 AC inverter eff * 1/12 VDC = 38.6 Amps @ 12 VDC --- This is your battery bus current
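    Those steps as a Python sketch (the 90% conversion and 85% inverter efficiencies are Bill's estimates):

```python
tv_watts = 138
laptop_watts = 230 / 0.90              # ~256 W after the 19.5 V conversion loss
total_watts = tv_watts + laptop_watts  # ~394 W

ac_amps = total_watts / 120            # current on the 120 VAC side
dc_amps = total_watts / 0.85 / 12      # battery-bus current, ~85% inverter eff.
print(round(total_watts), round(ac_amps, 2), round(dc_amps, 1))  # 394 3.28 38.6
```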
    Remember that Watts is a rate (like miles per hour). And we have a "voltage" conversion (assuming you do not have a 19.5 VDC battery bus).

    Now... That above is a "rate" of energy usage. Now, the next question is what is total amount of energy used per day (like 60 mph times 8 hours of driving per day = 480 miles driven per day).

    So let's say you want to run the monitor and laptop 5 hours per day... Then the amount of energy consumed would be:
    • 394 Watts * 5 hours per day = 1,970 WH per day (on the 120 VAC side of the power system)
    • 394 Watts * 5 hours per day * 1/0.85 AC inverter eff = 2,318 WH per day @ 12 volt battery bus
    • 2,318 WH per day (DC) * 1/12 volt battery bus = 193 AH per day (assuming 12 volt battery bus running at an average of 12.0 volts)
    Note that the current draw varies with the battery bus/DC input voltage... e.g., charging the battery during the day (say 14.5 volts) vs. running at 10.5 volts (low battery voltage, with ~1 volt of drop through the DC bus wiring):
    • 394 Watts * 1/0.85 AC inverter eff * 1/14.5 volts (charging) = 32 Amps on DC bus (high bus voltage)
    • 394 Watts * 1/0.85 AC inverter eff * 1/10.5 volts (inverter batt cutoff voltage) = 44 Amps on DC Bus (low bus voltage)
    You can see there are significant differences in current draw from the battery bus... So getting "really exact" is pretty painful. Use 12.0 volts as a "low average" voltage and call it a day.
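    The bus-voltage sensitivity above, sketched in Python (same ~394 W AC load, ~85% inverter efficiency):

```python
watts_ac = 394
bus_amps = {v: watts_ac / 0.85 / v for v in (14.5, 12.0, 10.5)}
for volts, amps in bus_amps.items():
    print(f"{volts} V bus -> {amps:.0f} A draw")   # 32, 39, and 44 A
```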
    Question #2: If I have two 6 vdc batteries rated at 225Ah @ 20 hours, connected in series to create 120vac (approximately), then how does that provide 139.5 volts per hour?
    Again, Volts per Hour does not mean anything here... If you are drawing 193 AH @ 12 volts (nominal), and you have 2x 6 volt @ 225 AH batteries in series for a 12 volt @ 225 AH battery bank, then in 5 hours this system would nominally take 193 AH from the 225 AH bank--taking the battery bank to near dead (not a good thing).

    And then there is the issue that the battery bank is rated at the C/20 hour discharge rate, while here you are looking at (roughly) a C/5 discharge rate. Here is an example with the Trojan T-105 FLA battery: C/20 = 225 AH capacity; C/5 = 185 AH capacity.

    https://www.trojanbattery.com/product/t-105/

    At that load, a 2x golf cart battery bank would not even support your system for 5 hours (for an off grid system).

    This is where I get on my soapbox and ask: what are your planned hours per day of runtime? Is this weekend/summer use, or full time off grid?

    Looking for a monitor/TV that uses something like 1/2 to 1/4 the Watts (rate of energy usage) would be my suggestion.

    Also, the laptop--Typically see something like 30-60 Watts average usage...

    And use a Watt*Hour meter to measure your actual power/energy usage... Using the nameplate ratings tends to overestimate energy usage (not always). For example of some meters:

    https://www.amazon.com/s?k=kill-a-watt+meter&ref=nb_sb_noss (typical Kill-a-Watt and similar 120 VAC plug in appliance meters)
    https://www.amazon.com/s?k=DC+wH+ah+meter&ref=nb_sb_noss (typical DC AH/WH meters)

    You really need to find the most energy efficient devices you can (almost always cheaper to conserve energy than to generate energy), and to understand your loads as you use them (solar/battery power is expensive, oversizing a system is costly, undersizing is spending money for something that does not work).

    -Bill
    Near San Francisco California: 3.5kWatt Grid Tied Solar power system+small backup genset
  • s3w47m88
    s3w47m88 Registered Users Posts: 7 ✭✭
    edited February 2020 #9
    So in summary are you saying:

    1. Load Amps x Load Volts = Load Watts
    2. Load Watts x Hours Run Per Day = Load Watt Hours
    3. Load Watt Hours ÷ Battery Voltage = Load Amp Hours
    4. Battery Amp Hours Rating - Load Amp Hours = Battery Amp Hours Still Available

    *Ignoring all efficiency decreases for the sake of brevity.
  • BB.
    BB. Super Moderators, Administrators Posts: 33,599 admin
    edited February 2020 #10
    Yep... Here is a good article/chart on battery voltage vs various loads/charging vs state of charge:

    https://www.scubaengineer.com/documents/lead_acid_battery_charging_graphs.pdf
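    As a sanity check, here are those four equations in Python (losses ignored, as noted; the 1 A @ 120 VAC load is a hypothetical example):

```python
def load_watts(amps, volts):                    # 1. A x V = W
    return amps * volts

def load_watt_hours(watts, hours):              # 2. W x h = Wh
    return watts * hours

def load_amp_hours(watt_hours, battery_volts):  # 3. Wh / V = Ah
    return watt_hours / battery_volts

def remaining_ah(rated_ah, used_ah):            # 4. rated Ah - used Ah
    return rated_ah - used_ah

# Example: a 1 A @ 120 VAC load run 5 hours from a 12 V, 225 Ah bank
wh = load_watt_hours(load_watts(1.0, 120), 5)   # 600.0 Wh
ah = load_amp_hours(wh, 12)                     # 50.0 Ah
print(wh, ah, remaining_ah(225, ah))            # 600.0 50.0 175.0
```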

    -Bill
    Near San Francisco California: 3.5kWatt Grid Tied Solar power system+small backup genset
  • mike95490
    mike95490 Solar Expert Posts: 9,583 ✭✭✭✭✭
    I prefer to convert EVERYTHING into watts and watt hours
     Battery has 44,000 watt hours in it (volts x Ah). (This is usually at a 20-hour rate, not 24-hour--a lame disconnect of measurement conventions.)
    Loads consume 8900watt hours in 24 hour period
    Solar can harvest 18,000 watt hours in a sunny summer day, 6,000 wh on a winter day

    Then you can easily massage the numbers around and see what's going to work (without having to convert 12V 36 ah to 120V 43wh).
    Powerfab top of pole PV mount | Listeroid 6/1 w/st5 gen head | XW6048 inverter/chgr | Iota 48V/15A charger | Morningstar 60A MPPT | 48V, 800A NiFe Battery (in series)| 15, Evergreen 205w "12V" PV array on pole | Midnight ePanel | Grundfos 10 SO5-9 with 3 wire Franklin Electric motor (1/2hp 240V 1ph ) on a timer for 3 hr noontime run - Runs off PV ||
    || Midnight Classic 200 | 10, Evergreen 200w in a 160VOC array ||
    || VEC1093 12V Charger | Maha C401 aa/aaa Charger | SureSine | Sunsaver MPPT 15A

    solar: http://tinyurl.com/LMR-Solar
    gen: http://tinyurl.com/LMR-Lister ,