Who to believe

erg
erg Registered Users Posts: 2
I have a small off grid solar system.
3 outback 200 AH batteries.
2 - 250 watt 12 volt panels.
MidNite Solar Classic charge controller 
Victron BMV 712 Smart monitor.
The Classic and the BMV rarely agree on voltage.
I used to have a TriMetric battery monitor and it always agreed with the Classic.
Who do I believe?
Which is giving the most accurate value?
The difference is not much, from 0.3 to 0.4 volts.
It might not seem important, but it may make a difference in how long I can use power.
I understood that the voltage should never drop below 12.2 volts on a 12 volt system.

Comments

  • littleharbor2
    littleharbor2 Solar Expert Posts: 2,115 ✭✭✭✭✭
    You can adjust the voltage readout on the Classic. I ultimately did, when my TriMetric, Trace SW inverter, and Fluke meter all pretty much agreed and the MidNite was the farthest out. I don't remember the steps, but it's in the menu if you search around.

    2.1 Kw Suntech 175 mono, Classic 200, Trace SW 4024 ( 15 years old  but brand new out of sealed factory box Jan. 2015), Bogart Tri-metric,  460 Ah. 24 volt LiFePo4 battery bank. Plenty of Baja Sea of Cortez sunshine.

  • softdown
    softdown Solar Expert Posts: 3,925 ✭✭✭✭
    I wouldn't be afraid of dropping below 12.2 volts in the face of adverse weather. That's why we use deep cycle batteries, I think.
    First Bank:16 180 watt Grape Solar with  FM80 controller and 3648 Inverter....Fullriver 8D AGM solar batteries. Second Bank/MacGyver Special: 10 165(?) watt BP Solar with Renogy MPPT 40A controller/ and Xantrex C-35 PWM controller/ and Morningstar PWM controller...Cotek 24V PSW inverter....forklift and diesel locomotive batteries
  • Marc Kurth
    Marc Kurth Solar Expert Posts: 1,174 ✭✭✭✭
    softdown said:
    I wouldn't be afraid of dropping below 12.2 volts in the face of adverse weather. It is why we use Deep Cycle batteries I think. 

    Agree, but it depends upon the battery. True deep cycle batteries are not destroyed by occasional discharges down to 80% depth of discharge or more. Multiple brands of deep cycle batteries publish curves showing how many charge/discharge cycles you can expect at various depths of discharge, all the way down to 0% state of charge. The only hard break-point is pulling the batteries below 0% state of charge. That will cause damage.
    I always have more questions than answers. That's the nature of life.
  • littleharbor2
    littleharbor2 Solar Expert Posts: 2,115 ✭✭✭✭✭
    Marc Kurth said:
    The only hard break-point is pulling the batteries below 0% state of charge. That will cause damage.
    Are you referring to a certain voltage or absolutely zero amp hours left to give?

    2.1 Kw Suntech 175 mono, Classic 200, Trace SW 4024 ( 15 years old  but brand new out of sealed factory box Jan. 2015), Bogart Tri-metric,  460 Ah. 24 volt LiFePo4 battery bank. Plenty of Baja Sea of Cortez sunshine.

  • erg
    erg Registered Users Posts: 2
    My batteries, Outback Northstar Blue, say in the manual to not go below 50% DoD which on their graph is 12.2 volts. I will follow their advice.
    This is all interesting but does not answer my question.
  • BB.
    BB. Super Moderators, Administrators Posts: 33,649 admin
    One place you can cause voltage measurement issues is voltage drop in the wiring... Under heavy current, the voltage drop across the wiring can easily be in the range you are seeing... I.e., the difference between measuring at the battery bus terminals vs. at a device such as a charger on 4 feet of cable. In this case, the offset voltage/error would be higher during heavy charging (or loads) and drop to "zero" when there is little current flow/demand in the system.
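    A minimal sketch of Bill's point, using Ohm's law. The wire gauge, run length, and current below are illustrative assumptions, not figures from erg's system:

    ```python
    # Rough voltage-drop estimate for a cable run.
    # Assumptions (not from the thread): 10 AWG copper, 4 ft one way, 40 A.
    OHMS_PER_FT_10AWG = 0.000999  # standard wire-table value for copper at ~20 C

    def cable_voltage_drop(current_a, one_way_ft, ohms_per_ft=OHMS_PER_FT_10AWG):
        """Voltage lost across a cable run; round trip is 2x the one-way length."""
        return current_a * (2 * one_way_ft * ohms_per_ft)

    drop = cable_voltage_drop(current_a=40, one_way_ft=4)
    print(f"{drop:.2f} V")  # about 0.32 V -- right in the 0.3-0.4 V range reported
    ```

    So two meters sensing at opposite ends of even a short heavy-current cable can legitimately disagree by a few tenths of a volt, and the disagreement should shrink toward zero as current drops.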

    However, assuming the difference you are seeing is between two units taking their voltage measurements from the "same place" (A setup vs. B setup)... It sounds like something else...

    One possibility: I have seen in the past that some charge controllers show "measured voltage" while others show "corrected to 25C/77F" voltage on their meters... Meaning that the -5mV per degree C per cell charging offset voltage "factor" is folded into the display (i.e., for cold conditions the displayed voltage reads "lower," and for hot conditions it reads "higher"). The idea being that the user is not "confused" by seeing changes in the Absorb charging voltage as the seasons and battery temperatures change. (This was years ago when I read about the offset issue--I don't know if it is still done today.)
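    The -5 mV per degree C per cell compensation can be sketched as follows. The 14.4 V Absorb setpoint and 5 C battery temperature are illustrative assumptions, not values from the thread:

    ```python
    # Sketch of lead-acid temperature compensation at -5 mV per degC per cell.
    # Setpoint and temperature below are assumptions for illustration only.
    V_PER_C_PER_CELL = -0.005  # volts per degree C per cell
    CELLS_12V = 6              # a nominal 12 V lead-acid battery has 6 cells

    def compensated_voltage(setpoint_25c, battery_temp_c):
        """Actual charging voltage after temperature compensation."""
        return setpoint_25c + V_PER_C_PER_CELL * CELLS_12V * (battery_temp_c - 25)

    actual = compensated_voltage(14.4, 5)  # cold battery: charge voltage rises
    print(f"{actual:.2f} V")               # 15.00 V actual at the terminals
    ```

    A controller that displays the "corrected to 25C" value would still show 14.4 V here even though the terminals sit at 15.0 V, which is exactly the kind of meter-to-meter disagreement described above.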

    Using a handheld DVM to measure voltages at different points in the system (under load/charging, hot/cold, etc.) can give you independent confirmation of whether the differences in voltage measurements are real or not.

    Otherwise, as littleharbor2 says--more than a few folks have adjusted the (typically solar) charge controller readings to match the actual measured voltages.

    -Bill
    Near San Francisco California: 3.5kWatt Grid Tied Solar power system+small backup genset