Can someone explain where the 52% derating comes from?

azrc Solar Expert Posts: 43
I see this number used a lot, but can't derive it using 70% efficient batteries, 97% efficient inverter and 90% efficient MPPT.

I know this is a conservative fudge factor, but wouldn't the derating shrink considerably if most heavy power usage happened while the sun was up, with night time mostly limited to lighting/electronics?

Comments

  • Cariboocoot Banned Posts: 17,615 ✭✭✭
    Re: Can someone explain where the 52% derating comes from?

    It's a matter of context: in an off-grid system you can expect 50-60% of your solar panels' nameplate rating to be available as actual AC power. In other words, 1000 Watts of array is likely to provide 500 Watts * 4 hours (average sun) = 2 kWh of power per day.
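
    As a quick back-of-the-envelope, here is that arithmetic in Python (a minimal sketch; the derate and sun-hour figures are the rule-of-thumb values from this thread, not measurements):

        # Rule-of-thumb daily harvest for an off-grid system.
        array_watts = 1000   # nameplate (STC) rating of the array
        derate = 0.52        # end-to-end derating: panels, controller, battery, inverter
        sun_hours = 4.0      # average hours of "full sun equivalent" per day

        daily_wh = array_watts * derate * sun_hours
        print(f"{daily_wh:.0f} Wh/day of usable AC power")  # ~2080 Wh/day, i.e. ~2 kWh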

    The first derating (and the one you omitted) is the panels themselves; rarely will they ever put out full rating. Usually it's about 70-80% of the nameplate: 100 Watts of panel will produce 70-80 Watts over the "equivalent 4 hours of good sun".

    After that loss you have the loss from the charge controller, the loss from battery inefficiency (it takes ~20% more power to "fill them up" than you'll have available for use afterwards), and the loss from the inverter. Not to mention the wiring losses and the variable losses from heat dissipation throughout the system.

    And yes you can make better use of the power by load shifting: when the batteries are near/at full the panels will only be producing what is needed to charge at that point. This will be well below their maximum capacity, so you have a loss of "harvest". If you switch on your big loads then, you can make better use of the array capacity: net charge to battery = power in minus power out. If you only need 'X' Watts to charge the battery, better to harvest any extra available solar power directly to loads.
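
    To put hypothetical numbers on that (a sketch only; the wattages here are made up just to illustrate the net-charge arithmetic):

        # Net charge to battery = PV power available - load power drawn.
        pv_available_watts = 700   # what the array could deliver right now
        absorb_accept_watts = 200  # all a nearly-full battery will accept

        # Without load shifting the controller throttles back to 200 W;
        # 500 W of potential harvest is left on the table.
        harvest_idle = absorb_accept_watts

        # With a 500 W daytime load switched on, the array serves both.
        load_watts = 500
        harvest_shifted = min(pv_available_watts, absorb_accept_watts + load_watts)
        print(harvest_idle, harvest_shifted)  # 200 vs 700 Watts put to use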

    But the rule-of-thumb is: 50% of the panel rating for expected daily Watt hours available. It's better to err on the side of caution this way; the #1 mistake made in solar is over-estimating production and under-estimating loads.
  • BB. Super Moderators, Administrators Posts: 33,431 admin
    Re: Can someone explain where the 52% derating comes from?

    From the PV Watts website:

    http://www.nrel.gov/rredc/pvwatts/changing_parameters.html#dc2ac

    The first derating factor of 77% accounts for STC ("marketing") vs. more-or-less real-life (PTC) ratings for solar panels, plus wire losses and GT inverter (or solar charge controller) losses.

    For batteries, typically towards the end of their lives, they run ~80% efficiency for flooded cell and ~90% for AGM. Also, it depends on how you cycle the batteries... If you cycle a bunch around 90-100% state of charge, there are more losses than if you cycle around 50-85% or so...

    For a standard AC inverter, I use 85% efficiency. Very light loads and very heavy loads can run at lower efficiencies (a large inverter with light loads can be very inefficient; it may have 30-60 Watts of idling losses without any loads attached).

    So, the "typical" system setup would be:
    • 0.77 * 0.80 * 0.85 = 0.52 end-to-end efficiency
    Now, if you use the solar panel power directly during the day--you skip the 80%-efficient battery step.

    If you use DC direct loads--you skip the 85%-efficient inverter step.
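
    To see how those factors multiply out, here is a small sketch (the efficiency values are the ones quoted above):

        # End-to-end efficiency: panel derate x battery x inverter.
        PANEL_DERATE = 0.77   # STC vs real life, wiring, controller losses
        BATTERY_EFF = 0.80    # flooded-cell round-trip efficiency
        INVERTER_EFF = 0.85   # typical AC inverter efficiency

        def system_efficiency(through_battery=True, through_inverter=True):
            eff = PANEL_DERATE
            if through_battery:
                eff *= BATTERY_EFF
            if through_inverter:
                eff *= INVERTER_EFF
            return eff

        print(f"{system_efficiency():.2f}")                       # 0.52 baseline
        print(f"{system_efficiency(through_battery=False):.2f}")  # 0.65 daytime AC loads
        print(f"{system_efficiency(through_inverter=False):.2f}") # 0.62 DC direct loads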

    Run the panels in a snowy climate in winter with reflected sunlight off snow or a frozen lake--you can do better than 0.77 of rated panel power.

    Run/charge a battery at C/10 to C/20--more efficient. Running at C/8 to C/2.5 or higher current is harder on the bank (charging current = Amp*Hour capacity / XX; see the example below).
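
    For example (a hypothetical 400 AH bank):

        # Charging current for a C/XX rate: current = capacity in AH / XX.
        capacity_ah = 400
        for xx in (20, 10, 8, 2.5):
            print(f"C/{xx}: {capacity_ah / xx:.0f} Amps")
        # C/20: 20 A (gentle) ... C/2.5: 160 A (hard on the bank)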

    Run lots of equalization/high charging voltage--wasted power.

    Expect to operate your loads at 0.52 of rated power every day and you probably are not going to be happy. Some days you will have more sun, some days you will have less... Plan your initial system to run at ~50% to 75% of "average" sun, plus plan on using a genset during bad weather/periods of heavy load (and/or plan on turning off optional loads).

    Off-grid power is expensive... Estimate $1-$2+ per kWh vs. grid power around $0.10-$0.20+ per kWh. It is not only the installation + parts costs: plan on replacing batteries every 4-8+ years (depending on quality, usage cycle, and maintenance), and on replacing major components (good quality charge controllers, inverters, etc.) every 10-15 years or so.
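
    A rough amortization sketch (all the dollar figures and lifetimes here are illustrative assumptions, not quotes):

        # Very rough off-grid cost per kWh over the life of the system.
        install_cost = 10000      # panels, mounts, controller, inverter, wiring
        battery_bank_cost = 3000  # per bank
        battery_life_years = 6    # 4-8+ years depending on care
        system_life_years = 15    # major components
        daily_kwh = 2.0           # usable harvest (1 kW array, 52%, 4 sun hours)

        total_cost = install_cost + battery_bank_cost * (system_life_years / battery_life_years)
        total_kwh = daily_kwh * 365 * system_life_years
        print(f"${total_cost / total_kwh:.2f} per kWh")  # ~$1.60, in the $1-$2 range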

    -Bill
    Near San Francisco California: 3.5kWatt Grid Tied Solar power system+small backup genset
  • icarus Solar Expert Posts: 5,436 ✭✭✭✭
    Re: Can someone explain where the 52% derating comes from?

    Please remember too that the 52% is overly optimistic. One reason is that as the controller ramps down charging while the batteries become fuller, you end up leaving potential power on the table. Unless you can manage your power very effectively, such that your batteries come full just as the sun goes off the PV, it becomes hard to get even the 52%.

    I like to use the 50% rule of thumb times 4 hours. Even that is probably too generous day in and day out.

    Just for the record, we use ~600-800 Wh/day from 400 Watts of panel. On a really good day we can generate up to ~1.5 kWh, but I like to use the average (400 W / 2 * 4 h = 800 Wh).

    Tony
  • niel Solar Expert Posts: 10,300 ✭✭✭✭
    Re: Can someone explain where the 52% derating comes from?

    azrc,
    as you said, it is an arbitrary fudge factor and it changes with the specific system. if one uses the best efficiencies in equipment and design and, as you said, uses the power while the sun shines, the fudge factor changes for the better. it could also go far worse than bb cites if one does a bad job of things along with poor equipment choices. it is meant to give a rough idea of what will be yielded, as one can't count on the stc ratings being delivered.