diy solar

Settings for the MPPT for lithium LIFEPO4

Switched capacitor
AFAIK, typical passive balancers operate on a differential-voltage principle, i.e. if Cell A is higher than Cell B, bleed some current off through Resistor A. If all cells charge equally, no balancing occurs. I believe active balancers work in a similar way, but instead of bleeding current off into a resistor, they somehow 're-route' the current to cells that are 'differentially' lower.

For the most part, these differential voltages only occur at the charge extremes, i.e. nearly fully charged or nearly fully discharged, but I don't think there is a setting for this, e.g. 'start balancing at 14V'.
Really, balancing has nothing to do with the BMS. It's really a charge function, but as it needs cell connections it's sometimes integrated into the BMS. Personally, I prefer separate balancers.

Active top balancers are typically switched-capacitor: the capacitor is switched to the high-voltage cell, the cap is charged, and it is then switched to the lower-voltage cell (or cells), where it discharges into that cell.
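The switched-capacitor scheme above can be sketched in a few lines. This is a toy numerical model, not any real balancer's firmware; the cell voltages and transfer step are made-up values for illustration:

```python
# Toy model of a switched-capacitor active balancer (illustrative only;
# a real balancer shuttles charge, not these simplified voltage deltas).

def balance_step(cells, transfer=0.01):
    """One switch cycle: move a small voltage delta from the highest
    cell to the lowest cell via the flying capacitor."""
    hi = max(range(len(cells)), key=lambda i: cells[i])
    lo = min(range(len(cells)), key=lambda i: cells[i])
    # Cap charges at the high cell, then discharges into the low cell.
    delta = min(transfer, (cells[hi] - cells[lo]) / 2)
    cells[hi] -= delta
    cells[lo] += delta
    return cells

cells = [3.62, 3.55, 3.65, 3.58]  # hypothetical per-cell voltages
for _ in range(200):
    balance_step(cells)
print(cells)  # the voltages converge toward the pack average
```

Note that, unlike a passive bleed resistor, the total pack energy is (ideally) conserved: what leaves the high cell arrives at the low cell.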
 
Thanks for the responses, that’s helpful.

I think I had presumed the BMS to have a great deal of sophistication and that the MPPT was just a dumb deliverer of current. I had assumed that the BMS would:
  1. Make sure that there was never more than 3.65V across any cell
  2. Regulate the current to the battery (if only by controlling the voltage so that it was never over 3.65V)
  3. Micro-balance individual cells (could take a lifetime) so that each cell had exactly the same voltage as its fellow cells
  4. Cut the current draw if it exceeded the rated value
  5. Keep the batteries at 14.2V
As I understand the explanations:
  1. The MPPT:
    1. is in charge of the voltage across the battery, never more than 14.2V
    2. drops the voltage across the battery to 13.5V at some point (not sure when), as LiFePO4s shouldn’t be stored at the charging voltage if you want to maximise the cycle life
  2. The BMS is moderately dim, and
    1. Cuts the input current to zero if it exceeds a rated amount (that of the BMS – or do I set this in the BMS?)
  2. Cuts the output (discharge) current to zero if it exceeds a rated amount (that of the BMS – or do I set this in the BMS?)
    3. Micro-balances individual cells (could take a lifetime) so that each cell has exactly the same voltage as its fellow cells
Each BMS is rated:
  1. discharge current: 80A continuous, 150A max
  2. over-charge protection 10-50A (adjustable)
  3. balance current 0.5A
I don't really get what the over-charge protection is or why it's adjustable.

Have I got the MPPT/BMS roles correct?
 
Thanks for the responses, that’s helpful.

I think I had presumed the BMS to have a great deal of sophistication and that the MPPT was just a dumb deliverer of current. I had assumed that the BMS would:
  1. Make sure that there was never more than 3.65V across any cell.

This is true. The solution is simply to cut off all charging if any cell exceeds 3.65V.

  2. Regulate the current to the battery (if only by controlling the voltage so that it was never over 3.65V)

This is rarely true. I can think of one high-end BMS, the REC BMS (and probably some others), that INTERFACES with the system to instruct the charger to taper charge and hold the high cell to 3.65V.

  3. Micro-balance individual cells (could take a lifetime) so that each cell had exactly the same voltage as its fellow cells

True enough.

  4. Cut the current draw if it exceeded the rated value

True, if you mean "cut the current draw to 0A"...

  5. Keep the batteries at 14.2V

Many can also be configured to do it based on pack voltage, but it's the same as for a single cell: cut off charging to 0A if the limit is exceeded.
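Taken together, the answers above describe the BMS as a blunt on/off protector rather than a regulator. A minimal sketch of that logic, assuming hypothetical thresholds (3.65V/cell, a 4s pack limit, and the 80A discharge rating from this thread):

```python
# Sketch of the blunt protection logic described above (thresholds are
# hypothetical values from this thread, not from any specific BMS).

CELL_MAX_V = 3.65       # per-cell over-voltage cutoff
PACK_MAX_V = 14.6       # 4s pack limit (4 x 3.65V)
DISCHARGE_MAX_A = 80.0  # continuous discharge rating

def charge_allowed(cell_volts):
    # Charging is cut entirely (to 0A) if ANY cell hits the limit;
    # the BMS does not taper current the way a charger would.
    return max(cell_volts) < CELL_MAX_V and sum(cell_volts) < PACK_MAX_V

def discharge_allowed(current_a):
    # Likewise, discharge is all-or-nothing at the rated limit.
    return current_a <= DISCHARGE_MAX_A

print(charge_allowed([3.40, 3.41, 3.66, 3.39]))  # one high cell -> False
```

The point of the sketch is the binary decision: a BMS like this protects, while the charger (the MPPT) is what actually regulates.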

As I understand the explanations:
  1. The MPPT:
    1. is in charge of the voltage across the battery, never more than 14.2V
    2. drops the voltage across the battery to 13.5V at some point (not sure when), as LiFePO4s shouldn’t be stored at the charging voltage if you want to maximise the cycle life.

True enough. The charge-termination criterion is adjustable, but with LFP the absorption period tends to be very short.

  2. The BMS is moderately dim, and
    1. Cuts the input current to zero if it exceeds a rated amount (that of the BMS – or do I set this in the BMS?)
    2. Cuts the output (discharge) current to zero if it exceeds a rated amount (that of the BMS – or do I set this in the BMS?)
    3. Micro-balances individual cells (could take a lifetime) so that each cell has exactly the same voltage as its fellow cells

It's typically programmed in the unit as well as being an actual limit.

Each BMS is rated:
  1. discharge current: 80A continuous, 150A max
  2. over-charge protection 10-50A (adjustable)
  3. balance current 0.5A
I don't really get what the over-charge protection is or why it's adjustable.

Have I got the MPPT/BMS roles correct?

Yep.
 
A proper BMS should control all aspects of charging; charge sources should be dumb power supplies. Sadly, we are in a transition away from unmanaged lead-acid charging tech, and we are seeing this confusion as a result.
 
I wonder whether there is a view on the optimal Absorption current settings where I have three Victron MPPTs in parallel, charging two identical 608Ah parallel batteries. The MPPTs are all connected to a VenusGX.
  1. Should the individual MPPT Absorption settings presume that they are the only MPPT and each be set to C/100 = 6A? In this arrangement, somehow each MPPT would need to deduce (or be told by the GX) that there are other SCCs in parallel so that the Absorption cut-off is reached when the sum of MPPT currents falls below 6A.

  2. Since I have two parallel batteries, I presume I should set the Absorption current to 2C/100 = 12A. In this arrangement, there is an assumption that both batteries will hit the Absorption voltage/current at the same time and there can be no situations where one battery sees zero current and the other battery sees 12A. Does setting Absorption to 2C/100 risk under-charging?

  3. The power from my solar panels is highly variable, especially outside of Summer. This means that there will be periods where the batteries are not charged and the current available to the MPPTs from the panels will be swinging above and below the Absorption current. My reading of the manual suggests that the MPPT will pause the timer when there is insufficient input to deliver the Absorption current – is this correct?
Comments welcomed.

My settings are Absorption voltage 13.8V (3.45V/cell), and Float voltage 13.4V (3.35V/cell).
 
1 Yes

2 Pretend your battery is one big battery and set accordingly. Ignore potential imbalances, because you've followed best practice in building the bank.

3 There is no absorption current. There is an absorption voltage. Absorption period is the total time spent at the absorption voltage. If the voltage falls to the re-bulk level, it will start over.
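The bulk/absorption/float/re-bulk behaviour described here can be sketched as a small state machine. The absorption and float voltages are the poster's settings; the re-bulk threshold and tail current are assumed illustrative values, not Victron's actual firmware logic:

```python
# Illustrative charge-stage state machine (not Victron firmware; the
# re-bulk threshold is an assumed value).

ABSORB_V = 13.8  # absorption voltage from this thread
REBULK_V = 13.2  # hypothetical re-bulk threshold
TAIL_A   = 6.0   # tail current: end absorption below this

def next_stage(stage, battery_v, battery_a):
    if battery_v <= REBULK_V:
        return "bulk"        # voltage sagged: absorption starts over
    if stage == "bulk" and battery_v >= ABSORB_V:
        return "absorption"  # hold the battery at absorption voltage
    if stage == "absorption" and battery_a <= TAIL_A:
        return "float"       # tail current reached: drop to float
    return stage

print(next_stage("absorption", 13.8, 5.0))  # -> float
```

This also shows why "absorption current" is a misleading term: current is only the *exit condition* from the absorption stage, which is otherwise defined purely by voltage and time.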

All MPPTs need to be set exactly the same, manually.

They need to be placed in a VE.Smart network.

GX DVCC needs to be enabled with SVS, STS and SCS active, identifying the devices you're using as shared sources (typically a BMV-702, BMV-712 or SmartShunt, preferably with temperature sensors).

Once the above is enabled, each MPPT will independently use the shared current sense (SCS) as its reference, not its own supplied current.

Regardless of the number of sources, if you desire your tail current to be 6A, then you set all MPPT to 6A. When the SHUNT reads 6A, the MPPT will terminate absorption.

At 3.45V/cell, your charge time will be very similar to the 0.2C charge rate of lead-acid, needing 5-7 hours of charging including a couple of hours of absorption, and you'll likely fall ever so slightly short of 100%. Float is more typically 13.6V and will hold a battery at about 95%+.

Your shunt's charged-voltage criterion should be 0.2V below absorption, and your charged current should be slightly above your MPPT tail current.
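As a worked example of that rule of thumb, using the absorption voltage and tail current from this thread (the 0.5A margin is my own illustrative choice, not a Victron default):

```python
# Deriving the shunt "charged" settings from the MPPT settings, per
# the rule of thumb above (the 0.5A margin is an assumed value).

absorb_v = 13.8                             # MPPT absorption voltage
tail_a = 6.0                                # MPPT tail current

charged_voltage = round(absorb_v - 0.2, 1)  # 0.2V below absorption
charged_current = tail_a + 0.5              # slightly above tail current

print(charged_voltage, charged_current)     # 13.6 6.5
```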
 
That is really helpful, thank you. I hadn't twigged that the set-up might be managed using the voltage supplied by the SmartShunt. That is my next config job when on site.

Re "there is no Absorption current": I accept this as an absolute statement; I was lazily referring to the Tail Current. The position I am trying to get to is to prolong battery life by having a low-ish absorption voltage that is held for long enough that the batteries take up a full-ish (95%) charge. So my attention on the Tail Current is really just a theoretical yardstick: when it is hit, the batteries are about full and can roll into Float. I am also playing the same game in working out how long the Absorption time should be set for - in my case this is highly variable outside of Summer.

My secondary consideration is that, outside of high Summer, the power harvested fluctuates a lot and I would like to avoid scenarios where the batteries go into re-bulk more than once a day. I am perhaps being precious, but it is about protecting the life of the packs whilst also being able to boil a few 1kW kettles when I want to, without kicking off a re-bulk.

Thank you again.
 