Optimal current

CALLD Solar Expert Posts: 230 ✭✭
Question for those who know the best:

I've just installed a handy little toggle switch that allows me to seamlessly change from inverter power to utility power whenever I choose. No longer do I need to worry about trying to plan loads around available solar power, as I can just switch over to utility if it becomes a little cloudy. This also allows me to optimize loads to the inverter's maximum efficiency point, which is between 500 and 1500 watts. It also allows me to remove all loads during bulk charging so the batteries get all available power.

Question is how much current do batteries need to charge optimally? Is it the more the better? Why then? Is it merely to get the sulfate off the plates quicker? I've noticed that my batteries will still reach absorb voltage even if the available charging current is only 10 amps and not the recommended minimum of 26 amps (260Ah AGM). It just takes them longer. Charging at a higher current seems to make absorb take longer as they will be at a lower SOC when they hit that voltage (higher current drives up the voltage quicker).

Another question is does a shallower cycling depth (say 25% DOD) permit lower charging currents? We know that lower charging currents are more energy efficient. The other reason for desiring a lower charging current is for running loads like refrigerators during the day. 2 refrigerators consume 90 watts each, misc loads about 35 watts. Total daytime load would thus be 215 watts. Inverter draw would be around 10 amps. Peak incoming current from PV would be around 28 amps. Would charging current be sufficient to keep batteries healthy?
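
To make the arithmetic above concrete, here's a small sketch of the net-charging-current estimate. The load wattages, the 28 A PV peak, and the 260 Ah bank size come from the post; the 24 V nominal system voltage and the inverter efficiency are my assumptions (they happen to reproduce the "around 10 amps" inverter draw quoted):

```python
# Net charging current left for the batteries while daytime loads run.
# SYSTEM_VOLTAGE and INVERTER_EFF are assumptions, not from the post.
SYSTEM_VOLTAGE = 24.0   # V nominal (assumed)
INVERTER_EFF = 0.90     # typical inverter efficiency (assumed)

fridge_watts = 2 * 90   # two refrigerators at 90 W each
misc_watts = 35         # miscellaneous loads
total_load_watts = fridge_watts + misc_watts  # 215 W total daytime load

# DC current the inverter pulls from the battery bus to feed those loads
inverter_draw_amps = total_load_watts / (SYSTEM_VOLTAGE * INVERTER_EFF)

pv_peak_amps = 28.0     # peak incoming current from PV (from the post)
net_charge_amps = pv_peak_amps - inverter_draw_amps

battery_ah = 260        # AGM bank, 20-hour rate
charge_rate_pct = 100 * net_charge_amps / battery_ah

print(f"Inverter draw: {inverter_draw_amps:.1f} A")
print(f"Net charging current: {net_charge_amps:.1f} A "
      f"({charge_rate_pct:.1f}% of C/20)")
```

Under those assumptions the net works out to roughly 18 A, i.e. about 7% of the 20-hour capacity — between the 5% floor and the 10% rule of thumb discussed below.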

Comments

  • Cariboocoot Banned Posts: 17,615 ✭✭✭
    Re: Optimal current

    Problem with the first paragraph: "handy little toggle switch". Are we talking about switching the power source for 120 VAC loads? If so, it had better be more substantial than a toggle switch, which usually implies something with a fairly low current capacity. 1500 Watts at 120 VAC is about 13 Amps, so it would have to be able to manage more than that. Like 20 Amps to be on the safe side.

    Optimum battery charging current is an issue. The 10% rule of thumb works for most FLAs and gives a reasonable return. You can charge them higher, especially AGMs. But there is also a potential problem with too much current.

    Contrary to what some people think, batteries do not self-limit charge current at a level that is safe for them. Oh, there's an upper maximum to be sure, but that can be high enough to cause thermal runaway. Not good. Even levels below catastrophic can damage the battery if they cause too much heating. Maybe not the first time, but repeated high current levels will shorten battery life.

    Then there is the matter of charge acceptance, which is what I think you were aiming for. Yes, there is such a thing. As the current goes up, the ratio between the current that is actually adding charge and the current that is merely heating the battery shifts toward heat. It gets worse, of course, the higher the current goes. How much? Depends on the particular battery.

    Curiously, I was experimenting with exactly this phenomenon last summer. What I found was that lower current at lower SOC improves charge acceptance. In fact, increasing the current as the SOC rose, right up to the switchover to Absorb Voltage, was most effective (but an absolute pain to do). Unfortunately this is completely backwards to what solar applications need, because of the finite amount of charge time available from sunshine.

    Too low a charge rate also has problems. As in nothing happens or nothing happens in a reasonable amount of time. This is why battery makers tend to recommend a 5% of 20 hour capacity minimum rate. Charging off mains and charging different kinds of batteries - all come up with different recommendations, when you can get any.
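
    For the 260 Ah AGM bank in the original post, that bracket works out as below (a sketch; the 5% and 10% figures are the generic rules of thumb from this thread, not any particular maker's datasheet):

    ```python
    battery_ah = 260  # 20-hour capacity of the AGM bank in the original post

    min_rate = 0.05 * battery_ah      # ~5% floor for effective charging
    typical_rate = 0.10 * battery_ah  # common 10% rule of thumb
    print(f"Minimum: {min_rate:.0f} A, typical: {typical_rate:.0f} A")
    ```

    That puts the floor at about 13 A and the usual recommendation at 26 A — the "recommended minimum of 26 amps" quoted in the question is the 10% figure, not the 5% one.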

    So with solar charging you have the limited window of opportunity, and a minimum charge rate needed to achieve any effective charging during that time. That number would be net of loads. So if you have a 100 Amp hour battery discharged to 75% and can give it a net 5 Amps, it will take all day (like six hours) to recharge, because the solar cannot provide full power from when charging starts to when it is finished. The charging is, as you know, not linear (unlike from mains, where you can hold constant current and/or Voltage for a stage as per the maker's recommendations). If the calculated peak charge rate is 5% gross and you draw off current for running loads, then the net drops below the minimum and the batteries don't get fully recharged before sunlight runs out.
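
    The 100 Ah example can be sketched numerically. This is only a toy model — the sine-shaped solar curve, the 8-hour usable window, and the time step are my assumptions for illustration — but it shows why a peak net 5 Amps eats most of the day:

    ```python
    import math

    # Toy model: 100 Ah battery at 75% SOC (25 Ah deficit), with the
    # available net charge current following a sine-shaped solar curve
    # that peaks at 5 A. Curve shape and window length are assumed.
    DEFICIT_AH = 25.0
    PEAK_NET_AMPS = 5.0
    WINDOW_HOURS = 8.0   # usable charging window (assumed)

    dt = 0.1             # hours per simulation step
    t, returned_ah = 0.0, 0.0
    while returned_ah < DEFICIT_AH and t < WINDOW_HOURS:
        amps = PEAK_NET_AMPS * math.sin(math.pi * t / WINDOW_HOURS)
        returned_ah += amps * dt
        t += dt

    if returned_ah >= DEFICIT_AH:
        print(f"Deficit recovered after about {t:.1f} h")
    else:
        print(f"Sun ran out: {returned_ah:.1f} Ah of {DEFICIT_AH} Ah returned")
    ```

    Under these assumptions the whole window delivers only about 25 Ah total, so the deficit is barely recovered by late afternoon; shave a couple of amps off for loads and the batteries never get there.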

    The 10% rate allows you to get reasonable charging with some load allowance. If the loads are larger, you need additional capacity to offset their demand. And then there is the use of opportunity loads which improve system efficiency by utilizing PV power that would otherwise go unrealized when battery demand lessens. But that too is something of a management trick, and is not applicable to constant (or consistent) demand loads like running the refrigerator.

    I bet that didn't clear up anything. :p