diy solar

Does anyone make a simple MPPT with on/off?

johntaves

My LiFePo4 battery monitor will close a relay to turn on/off the charging. I am wondering if there is a simple MPPT that has no user interface at all.

It just needs to keep the voltage of the panels separate from the battery and attempt as high as possible voltage on the battery.

I suspect many will tell me that the charge current needs to taper to fully charge the cells, and therefore the MPPT needs settings to determine what type of battery I have. This is not really necessary. The BMS must be able to turn off the charger regardless of the battery voltage, and therefore the MPPT can shove all the amps it can at the battery when the BMS provides the "on" signal. When those amps would drive the battery voltage too high, the BMS will stop it.

My BMS normally stops at 90% SOC, but it needs to run up to the max periodically to clear any accumulated SOC error. So, I do occasionally need to taper the current, but the Sun does that every day for us.
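To be concrete, here is a rough sketch of the on/off decision I am describing (the thresholds are just the ones I happen to use; the point is that the charger itself needs no settings at all):

Code:
CELL_MAX_V = 3.50      # cut charging if any single cell reaches this
SOC_STOP = 90          # normal stop point (%)
SOC_RESTART = 85       # re-enable charging below this (%)

def charge_relay_should_be_closed(cell_voltages, soc, relay_closed):
    # Return True to keep the PV connected, False to disconnect it.
    if max(cell_voltages) >= CELL_MAX_V:
        return False                # always protect the highest cell
    if relay_closed:
        return soc < SOC_STOP       # stay on until the normal stop point
    return soc <= SOC_RESTART       # hysteresis so the relay doesn't chatter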

I would think that making an MPPT that has no user interface at all would be cheaper than one that has one. Is there such a thing?

jt
 
Well, yes, there are solar controllers that do that, but by definition they are not MPPT. The most basic solar controllers are just a relay or FET that connects the panels directly to the battery, then disconnects them when the battery voltage reaches a charged level.


In most installations the BMS is a safety device. The charging devices are set up such that the BMS does nothing, and when a charging device fails and doesn't stop charging on its own, the BMS steps in to prevent damage. For years you could run with no BMS at all, until the charger failed or the user did something dumb. There is also an advantage and a reason for an LFP battery to have a CV absorption phase: charge at bulk until ~3.45Vpc is reached, then hold that voltage until the tail current drops to a certain level. A good MPPT can do this.
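Roughly, that bulk/absorption sequence looks like this (a sketch only; the voltage and tail-current numbers are typical examples, not a recommendation for your pack):

Code:
ABSORB_V_PER_CELL = 3.45    # hold this per-cell voltage during absorption
TAIL_CURRENT_A = 5.0        # treat the pack as full once current tapers to this

def charger_step(pack_voltage, pack_current, n_cells, state):
    target_v = ABSORB_V_PER_CELL * n_cells
    if state == "bulk":
        # push all available current until the target voltage is reached
        if pack_voltage >= target_v:
            state = "absorb"
    elif state == "absorb":
        # hold target_v; the cell itself makes the current taper
        if pack_current <= TAIL_CURRENT_A:
            state = "done"    # fully charged: stop charging or drop to float
    return state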

You note the accumulated error on the BMS. Do you really want a device that might be 10% off to tell your charger when to stop charging? The MPPT is equipped to fully charge the battery (if properly set up) and the BMS would detect this and reset.
 
Well, yes, there are solar controllers that do that, but by definition they are not MPPT.
As stated in the title, I am looking for an MPPT. I am already using a simple relay to connect/disconnect the panels (Electrodacus DSSR20). I am interested in both comparing with the MPPT to see if it gains anything and also running 2 panels in series which would double the voltage and thus move the MPP out of whack.
There is also an advantage and a reason for an LFP battery to have a CV absorption phase, to charge at bulk until ~3.45Vpc is reached, then hold that voltage until tail current reaches a certain level. A good MPPT can do this.
As stated in my post, the Sun does this for free. OK, OK, a properly spinning Earth relative to the Sun does this for free.
You note the accumulated error on the BMS. Do you really want a device that might be 10% off to tell your charger when to stop charging? The MPPT is equipped to fully charge the battery (if properly setup) and the BMS would detect this and reset.
I did not intend to imply that the BMS only monitors SOC. As stated above, it monitors each cell and will shut off charging or loads if any one cell is too high or too low. The SOC is not terribly important, and thus the error is no big deal. The point of monitoring SOC is to reduce the amount of time that the cells are near 3.5V. Frankly, I wish my phone and laptop would stop charging at 90% so I don't have to replace those batteries so damn often.
 
It just needs to keep the voltage of the panels separate from the battery and attempt as high as possible voltage on the battery.
An MPPT controller without the ability to limit maximum battery voltage would not be commercially viable, and that's why no one makes one.
 
As stated in the title, I am looking for an MPPT. I am already using a simple relay to connect/disconnect the panels (Electrodacus DSSR20). I am interested in both comparing with the MPPT to see if it gains anything and also running 2 panels in series which would double the voltage and thus move the MPP out of whack.

As stated in my post, the Sun does this for free. OK, OK, a properly spinning Earth relative to the Sun does this for free.

I did not intend to imply that the BMS only monitors SOC. As stated above, it monitors each cell and will shut off charging or loads if any one cell is too high or too low. The SOC is not terribly important, and thus the error is no big deal. The point of monitoring SOC is to reduce the amount of time that the cells are near 3.5V. Frankly, I wish my phone and laptop would stop charging at 90% so I don't have to replace those batteries so damn often.
Yes, an MPPT is more efficient, and you get more out of the panels than with a simple relay, if that is what you are asking. Also, 2 panels in series with an MPPT gives you a slightly longer day, as the controller will "turn on" earlier in the day, and "off", later in the evening.

A spinning Earth will not provide a CV absorption charge. The only way to do that is with a controller that supports an absorption phase.

Most BMS's don't have a way other than SOC/coulomb counting to know if a battery is fully charged. Voltage alone won't tell you that. You can overcharge an LFP at lower than 3.65Vpc. A charge controller that monitors tail current will do a better job at stopping charge at the correct time. Setting the absorption voltage lower with a longer absorption time gives the BMS more time to balance the cells before the LFP is fully charged. It also (according to many) extends the life of the battery by fully charging it without reaching 3.6V. That is why the settings in an MPPT are adjustable, to tweak how the user wants that to happen.

If the MPPT were just charging as hard as it can until the BMS stops it, you would be charging in bulk mode until 3.65V was reached, with no absorption. That is hard on an LFP battery. It's much better to hold an absorption voltage of 3.5ish until the LFP is fully charged, without ever getting to 3.65V.

MPPT controllers are cheap. PWM even cheaper. I'm not sure what you would be gaining by making a cheaper one. Less reliability?
 
Running as you are proposing with an MPPT set to run flat out and relying on the BMS to stop charging is designing a system where a single point of failure leads to disaster. If the BMS flops and doesn't stop the MPPT, you'll end up with overcharged cells and possibly a fire.

This is why it's a more normal practice to program the MPPT with CC/CV parameters and then also have a BMS that can disconnect the pack. If one fails, the other is there to protect the cells.
 
My LiFePo4 battery monitor will close a relay to turn on/off the charging. I am wondering if there is a simple MPPT that has no user interface at all.

It just needs to keep the voltage of the panels separate from the battery and attempt as high as possible voltage on the battery.

I suspect many will tell me that the charge current needs to taper to fully charge the cells, and therefore the MPPT needs settings to determine what type of battery I have. This is not really necessary. The BMS must be able to turn off the charger regardless of the battery voltage, and therefore the MPPT can shove all the amps it can at the battery when the BMS provides the "on" signal. When those amps would drive the battery voltage too high, the BMS will stop it.
Nope. That's CC charging. You want CC/CV charging if you are going to get the most out of your battery.

You also do NOT want your BMS to stop charging! Your BMS is a last ditch "save the battery and prevent a fire" measure. You do NOT want to be using it every day to terminate charge. You will damage your battery doing that.

If you have communication between your BMS and your charge controller, this can happen automatically. Otherwise you have to set it.
 
Yes, an MPPT is more efficient, and you get more out of the panels than with a simple relay, if that is what you are asking. Also, 2 panels in series with an MPPT gives you a slightly longer day, as the controller will "turn on" earlier in the day, and "off", later in the evening.
An MPPT is not always more efficient. You'll notice they generally have heat sinks; that means they throw away energy. The SSRs I have produce almost no heat. One needs to prove that the MPPT will be worth the $, and it depends upon the use case, batteries, temperature, and the panels.

If you are not grid tied, then you probably have excess power most of the time, so what is the point of blowing money to squeeze out a bit more sometimes? For example, this past week at Steamboat Rock, there's no way I could use all the power, so an MPPT would be a total waste of $. During a heat wave my cells won't keep up with the A/C, and I will find an outlet. There is a huge range of temperatures and weather where I have no need for all the power, and there is a very thin range of weather where the MPPT will make a difference.

It is much easier to justify the expense and complexity of an MPPT if you are grid connected and thus never wasting sunlight and you gain something by not matching the cells to the battery. I am not grid connected. I am throwing away power at a furious rate. I don't need to throw away money on an MPPT.
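To put rough numbers on "worth the $": below the knee of a panel's I-V curve the current is nearly flat, so a direct connection at battery voltage harvests roughly Vbatt x Isc (a little less, in reality), versus Vmp x Imp for an ideal MPPT, minus the converter's own losses. A back-of-envelope sketch, with made-up panel figures:

Code:
# Illustrative panel numbers only, not my actual panels
V_mp, I_mp = 36.0, 8.3      # hypothetical maximum-power point
I_sc = 8.8                  # hypothetical short-circuit current
V_batt = 27.0               # what the panel sees tied straight to the pack

p_mpp = V_mp * I_mp         # what an ideal MPPT could extract (~299 W)
p_direct = V_batt * I_sc    # rough harvest with a plain SSR (~238 W)
print(f"direct/MPP ratio = {p_direct / p_mpp:.0%}")   # ~80% in this example

Whether that difference matters depends on whether you can even use the extra power, which is my point above.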
A spinning Earth will not provide a CV absorption charge. The only way to do that is with a controller that supports an absorption phase.
The Sun's power nicely tapers off every day. Whenever my BMS is triggered to bring the pack to 100%, it monitors every cell's voltage and the pack current, ensures it does not overcharge any cell, and properly declares the pack 100% full when one cell reaches 3.5V and the amps are less than X.

It makes no sense that somehow the free tapering the Sun does every day is bad but yet manages to supply the power to enable the charge controller to do some pristine tapering. If you have the excess power to flawlessly taper, then you don't need an MPPT, you are throwing away sunshine.
Most BMS's don't have a way other than SOC/coulomb counting to know if a battery is fully charged. Voltage alone won't tell you that. You can overcharge an LFP at lower than 3.65Vpc.
I agree you can overcharge one or more of the cells by charging a pack to 3.65xN (where N is the number of cells), but I don't agree that you can overcharge a cell by ensuring it stays lower than 3.65V. Additionally, I don't see the point of ever going to 3.65V. The voltage rises exponentially after about 3.4, so I set mine to stop at 3.5.

Yes, I totally agree that most BMS's and charge controllers suck because they cannot monitor SOC, and thus relentlessly bring the battery to 100% every day. My BMS does monitor SOC and does stop at n% and thus rarely ever brings any cell past 3.4V.
A charge controller that monitors tail current will do a better job at stopping charge at the correct time.
What is the "correct time"? What is "better job"? Again, my BMS always prevents any cell from exceeding xV. It is correct to stop charging if any one cell hits x (I set my x to 3.5V). Are you arguing that it's better to maintain steady decreasing amps, instead of a more on/off with decreasing amps? Ok, then what are you doing to ensure the Sun keeps shining to maintain that? And given that the BMS/Charger doesn't monitor SOC and thus doesn't let you stop at x%, then this is happening every day. And again, if the controller can do this, it must have excess power and thus the MPPT functionality is rather pointless.
Setting the absorption voltage lower with a longer absorption time gives the BMS more time to balance the cells before the LFP is fully charged.
My BMS will turn on the balance resistor that drains a given cell's power at any time, not just while charging and near the top. The BMS monitors what cell arrived at the max and gives that cell a haircut so that a different cell arrives at the peak next time.
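In pseudocode, the "haircut" idea is roughly this (the names and the bleed amount are mine, just to illustrate):

Code:
HAIRCUT_MAH = 200   # how much to bleed off the offending cell (example value)

def after_charge_cutoff(cells):
    # cells: objects with .voltage and .set_bleed(True/False), i.e. per-cell
    # monitors with a switchable balance resistor
    top = max(cells, key=lambda c: c.voltage)   # the cell that hit the ceiling
    top.set_bleed(True)   # BMS turns it back off after ~HAIRCUT_MAH has drained
    return top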

Do you really know how balanced the cells are? Do you know whether or not one cell is always getting to the top before the others, and thus that cell is taking abuse?

I assume these plug-n-play Li batteries like battleborn will stop the charge when one cell has max V, but still allow the loads. If that is true, do you know how often this is happening? You'd have to monitor the charge controller to see that it is providing amps that are all going to the loads, yet the amps are not flowing into the battery.
It also (according to many) extends the life of the battery by fully charging it without reaching 3.6V. That is why the settings in an MPPT are adjustable, to tweak how the user wants that to happen.
It extends the life of a lead acid battery to keep it at 100%; I am not aware that it extends LiFePO4 battery life to do this. My understanding is that the more often it is above say 3.4 (or below whatever), the more stress you are putting on it.
If the MPPT were just charging as hard as it can until the BMS stops it, you would be charging in bulk mode until 3.65V was reached, with no absorption. That is hard on an LFP battery. It's much better to hold an absorption voltage of 3.5ish until the LFP is fully charged, without ever getting to 3.65V.
Same as above.
MPPT controllers are cheap. PWM even cheaper. I'm not sure what you would be gaining by making a cheaper one. Less reliability?
An SSR is even cheaper and even more reliable. An MPPT with a simple SSR-type circuit to follow the on/off command of the BMS is more reliable than one with a microprocessor that can only measure pack voltage.

If I want to try out an MPPT, then I have to find one that does not rely on a microprocessor and communication to shut off, because I do not have an expensive and thus unreliable BMS that runs all the amps through it.

We do not gain reliability with more complex circuits and big mosfets.
 
Running as you are proposing with an MPPT set to run flat out and relying on the BMS to stop charging is designing a system where a single point of failure leads to disaster. If the BMS flops and doesn't stop the MPPT, you'll end up with overcharged cells and possibly a fire.

This is why it's a more normal practice to program the MPPT with CC/CV parameters and then also have a BMS that can disconnect the pack. If one fails, the other is there to protect the cells.
The flaw with this thinking is that all of this is more complex. My loads (DC/DC and inverter) and chargers each have a simple on/off switch. If the circuit is open the device is off. If it is closed, it is on. None are relying on microprocessors and measurements to decide on/off.

I can easily deploy 2x of my BMS so that each on/off signal is run through both such that both BMS need to agree to the on state for the device to be on.

I am not saying that I have made the most reliable system. I am saying that it is simpler. It is harder to argue something that is more complex is more reliable.
 
You've just shifted all the logic to your BMS.
I don't think the MPPT you are looking for exists.
 
Nope. That's CC charging. You want CC/CV charging if you are going to get the most out of your battery.

You also do NOT want your BMS to stop charging! Your BMS is a last ditch "save the battery and prevent a fire" measure. You do NOT want to be using it every day to terminate charge. You will damage your battery doing that.

If you have communication between your BMS and your charge controller, this can happen automatically. Otherwise you have to set it.
This is the situation everyone has learned because all the equipment is descended from lead acid. With lead acid you monitor the pack voltage and you need to run the battery up to 100%, and keep it there as much as possible, which is determined by the voltage.

With lithium multi-cell packs, it is completely opposite. You need to monitor every cell. Total pack voltage is rather useless. 100% is determined by one cell arriving at max voltage with the current below some minimum. The more time near 100%, the less life you get from the pack. SOC, and thus amp counting, is required to charge to 90%. All of this makes the current equipment a pathetic hack.
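By "amp counting" I just mean integrating the shunt current, roughly like this (a sketch; the variable names are made up):

Code:
import time

class CoulombCounter:
    # Minimal coulomb counter: integrate shunt current to track SOC.
    # A real one also resets the count at a known full (or empty) point.
    def __init__(self, capacity_ah, soc_percent=100.0):
        self.capacity_ah = capacity_ah
        self.soc = soc_percent
        self.last = time.monotonic()

    def update(self, amps):                 # +amps charging, -amps discharging
        now = time.monotonic()
        hours = (now - self.last) / 3600.0
        self.last = now
        self.soc += 100.0 * amps * hours / self.capacity_ah
        self.soc = max(0.0, min(100.0, self.soc))

    def mark_full(self):                    # called when the BMS declares 100%
        self.soc = 100.0                    # clears the accumulated drift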

Can you set your system to stop at 90%? Do you know if one cell is constantly the victim of reaching max V every time? The BMS has the information that you and the rest of the system needs. The loads and chargers do not have that information. Of course you choose an MPPT that will do CC/CV, that's the only choice. But it makes no sense to have a controller that knows nothing about individual cell voltages controlling the charge.

Putting the charge/load smarts outside of the battery pack is just wrong for Li.

If this equipment were designed from scratch today, the BMS would monitor the cells, the amps, and the SOC, and provide a simple open/close circuit to turn loads and chargers on/off. It would provide the user interface for the system. The chargers and loads would have a two-wire on/off control from the BMS. You wouldn't have redundant user interfaces on the inverter and MPPT and BMS. There wouldn't be competing communication protocols.

The BMS would handle any amperage and capacity battery. For DIY packs, the cells could come with a standard circuit that provides the cell information to the BMS via BLE. The BMS would monitor the cells and be able to turn on the cell's drain resistor whenever it wants. The different BMSs would be differentiated by how fancy they were. You could have many output relays so that you can turn on/off the furnace to keep the batteries warm, or turn on the electric water heater to take excess power so the gas isn't used to heat the water. A BMS could check the weather forecast and use that to decide whether to run the battery to higher SOC and stop loads that are for when you have excess power. One BMS might be a client on a wifi/ethernet providing a web UI. Another might deliver the UI via bluetooth.

MPPT with bluetooth, why? Inverter with remote control, why? The BMS should provide that UI because it has the info. It knows the cells.

Oh! Maybe there are grid tied MPPTs that don't have the complexity of the battery crap in them. Maybe that's available.
 
So if I am to understand, you are proposing that the BMS become an MPPT SCC as part of its function. I suppose it could be done. You would then just plug your panels directly to the battery bus. I guess you could also wire it up to act as an inverter charger with ATS capability as well as a display screen for setting parameters.

Of course all this might make the BMS so large it needs active cooling and a large cabinet to house it rather than being able to put it alongside the battery.

In regard to constant current and constant voltage: the SCC does its best with what the PV provides. As you point out, it is not perfect.
 
So if I am to understand, you are proposing that the BMS become an MPPT SCC as part of its function. I suppose it could be done. You would then just plug your panels directly to the battery bus.
Yes, and no. Right now my panels are effectively connected straight to the battery. There's an SSR (effectively a relay) to connect/disconnect them. That's really the only SCC that is needed. The MPPT part is a totally separate functionality from the BMS. It needs to set the voltage the panel sees, independent of the battery voltage, to get the max power from it. Right now my panels always see roughly 27V which, depending on the temperature of the cells, could be off of MPP by quite a bit. I want an MPPT, but I do not need an SCC.
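The MPPT part I am after is just the classic perturb-and-observe loop, something like this (a sketch, not any particular product):

Code:
def mppt_step(v_panel, i_panel, v_target, step, prev_power, prev_direction):
    # Perturb-and-observe: keep nudging the panel operating voltage in the
    # same direction while power rises, reverse direction when power falls.
    power = v_panel * i_panel
    direction = prev_direction if power >= prev_power else -prev_direction
    return v_target + direction * step, power, direction

The converter then adjusts its duty cycle so the panel sits at the new target voltage, independent of the battery voltage. That is all the "intelligence" I am asking for.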
I guess you could also wire it up to act as an inverter charger with ATS capability as well as a display screen for setting parameters.

Of course all this might make the BMS so large it needs active cooling and a large cabinet to house it rather than being able to put it alongside the battery.
No, the idea is to be able to mix and match the components necessary for the use case. The BMS is independent of the battery size and amps and independent of the loads and chargers. The one in the photo can handle any number of cells in a pack, so any pack voltage and any pack amperage. (The circuit on the right is bolted to each cell and delivers cell voltage and temperature via bluetooth to the BMS on the left).

The battery pack is sized for what I want. The shunt that provides amp info is sized according to the system amps. The inverter is sized according to the needs. My inverter was a cheap WZRELB inverter where I spliced into its power switch so the BMS controls it, but I have a larger LF one on the way. I have a separate 40A power supply that does the charging, again with its on/off controlled by the BMS. I used to have 4 SSRs to control 4 sets of panels. Now I have 7. I have changed the loads and chargers and doubled the battery size, yet the BMS remains unchanged. (The browser UI is shown below.)
[Attached images: 20230708_122007.jpg, Untitled.png]
 
An MPPT is not always more efficient. You'll notice they generally have heat sinks; that means they throw away energy. The SSRs I have produce almost no heat. One needs to prove that the MPPT will be worth the $, and it depends upon the use case, batteries, temperature, and the panels.

If you are not grid tied, then you probably have excess power most of the time, so what is the point of blowing money to squeeze out a bit more sometimes? For example, this past week at Steamboat Rock, there's no way I could use all the power, so an MPPT would be a total waste of $. During a heat wave my cells won't keep up with the A/C, and I will find an outlet. There is a huge range of temperatures and weather where I have no need for all the power, and there is a very thin range of weather where the MPPT will make a difference.

It is much easier to justify the expense and complexity of an MPPT if you are grid connected and thus never wasting sunlight and you gain something by not matching the cells to the battery. I am not grid connected. I am throwing away power at a furious rate. I don't need to throw away money on an MPPT.

The Sun's power nicely tapers off every day. Whenever my BMS is triggered to bring the pack to 100%, it monitors every cell's voltage and the pack current, ensures it does not overcharge any cell, and properly declares the pack 100% full when one cell reaches 3.5V and the amps are less than X.

It makes no sense that somehow the free tapering the Sun does every day is bad but yet manages to supply the power to enable the charge controller to do some pristine tapering. If you have the excess power to flawlessly taper, then you don't need an MPPT, you are throwing away sunshine.

I agree you can overcharge one or more of the cells by charging a pack to 3.65xN (where N is the number of cells), but I don't agree that you can overcharge a cell by ensuring it stays lower than 3.65V. Additionally, I don't see the point of ever going to 3.65V. The voltage rises exponentially after about 3.4, so I set mine to stop at 3.5.

Yes, I totally agree that most BMS's and charge controllers suck because they cannot monitor SOC, and thus relentlessly bring the battery to 100% every day. My BMS does monitor SOC and does stop at n% and thus rarely ever brings any cell past 3.4V.

What is the "correct time"? What is "better job"? Again, my BMS always prevents any cell from exceeding xV. It is correct to stop charging if any one cell hits x (I set my x to 3.5V). Are you arguing that it's better to maintain steady decreasing amps, instead of a more on/off with decreasing amps? Ok, then what are you doing to ensure the Sun keeps shining to maintain that? And given that the BMS/Charger doesn't monitor SOC and thus doesn't let you stop at x%, then this is happening every day. And again, if the controller can do this, it must have excess power and thus the MPPT functionality is rather pointless.

My BMS will turn on the balance resistor that drains a given cell's power at any time, not just while charging and near the top. The BMS monitors what cell arrived at the max and gives that cell a haircut so that a different cell arrives at the peak next time.

Do you really know how balanced the cells are? Do you know whether or not one cell is always getting to the top before the others, and thus that cell is taking abuse?

I assume these plug-n-play Li batteries like battleborn will stop the charge when one cell has max V, but still allow the loads. If that is true, do you know how often this is happening? You'd have to monitor the charge controller to see that it is providing amps that are all going to the loads, yet the amps are not flowing into the battery.

It extends the life of a lead acid battery to keep it at 100%; I am not aware that it extends LiFePO4 battery life to do this. My understanding is that the more often it is above say 3.4 (or below whatever), the more stress you are putting on it.

Same as above.

An SSR is even cheaper and even more reliable. An MPPT with a simple SSR-type circuit to follow the on/off command of the BMS is more reliable than one with a microprocessor that can only measure pack voltage.

If I want to try out an MPPT, then I have to find one that does not rely on a microprocessor and communication to shut off, because I do not have an expensive and thus unreliable BMS that runs all the amps through it.

We do not gain reliability with more complex circuits and big mosfets.
You are making some fundamental errors about how charging takes place. A battery is fully charged when it can not safely hold any more energy, not at a specific voltage. The resting voltage of fully charged LFP is ~3.4Vpc. Any voltage over that will result in current flowing into the cell, resulting in an overcharge if charging is not stopped when the cell is full. This is well established and demonstrated. If you hold voltage (CV charging) as the cell charges, current will decrease. The charger doesn't do this, the cell does. All the charger does is hold voltage. When the current drops to a certain level, the cell is deemed to be fully charged. Some companies like Victron use a timed absorption because a cloud passing by might lower current enough to stop charging early. If you charge at 3.45Vpc, the current will be lower than if you charge at 3.65Vpc. Thus, it will take much longer to fully charge a cell at 3.45Vpc than it would at 3.65Vpc. That is why 3.65Vpc has no absorption time, but anything lower will need an absorption charge.
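To make the passing-cloud point concrete, the termination logic is roughly this (illustrative numbers and names, not Victron's or anyone else's actual algorithm):

Code:
TAIL_CURRENT_A = 5.0        # example tail-current threshold
MIN_ABSORB_HOURS = 2.0      # example minimum time at absorption voltage

def absorption_done(tail_current_a, hours_at_absorb_v):
    # Require a minimum time at the absorption voltage so a brief dip in
    # current (a cloud, or a load switching off) doesn't end the charge
    # early, then terminate on tail current.
    if hours_at_absorb_v < MIN_ABSORB_HOURS:
        return False
    return tail_current_a <= TAIL_CURRENT_A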

Higher quality chargers and MPPT controllers (like victron) can use an external shunt to measure current going into the battery vs supplying loads.

As you mentioned, LFP doesn't "want" to be at higher voltages. So you extend the life with a charger that has a lower voltage and a CV/absorption cycle. Charging to 3.65Vpc is hard on the battery. But charging to 3.5Vpc and using an absorption phase will still fully charge the battery. There is nothing wrong with stopping at 3.5Vpc as you do, but you are not fully charging the battery. This isn't the same as the sun going down. The sun tapering off every day lowers the voltage, so unless you have reached absorption long before the sun goes down, the voltage will drop below the cell voltage and stop charging early. This doesn't harm the cell at all, but it doesn't result in a full charge.

MPPT controllers absolutely are more efficient than simpler types. The heat generated is minimal compared to the additional wattage produced by the solar panels by keeping them at the maximum power point.

The BMS's job is to monitor cell voltages, and disconnect either charging or the load if voltages go out of range. Most BMSs also have a balance function, but that is really a separate task, to keep the cells all at the same level of charge (NOT the same as voltage!). The BMS really doesn't know how to charge, what voltages to use, how long to absorb, or any of that. That is one advantage of a more intelligent charger.

If your BMS is starting balancing at below 3.4Vpc, then it is just as likely pulling the cells out of balance as it is balancing them. Balancing ONLY works at higher voltages when the cells are in the steep part of the voltage curve. I check on the BMS about monthly. In 3 years I have never had a high voltage cutoff event. I had 1 event when I left the boat for 2 weeks and it rained the entire time and drained the LFP to the low voltage cutoff. The cells have never gone out of balance, and are within 0.002Vpc at full charge.

I am on a sailboat, completely off grid and 100% dependent on solar. That is why efficiency of the controller matters to me, and why I want to get as much energy into my battery as I can. Over the years I started with a simple relay type controller, then PWM, then MPPT. First on FLA then on LFP. The difference in charge current and charge times between a simple relay and the MPPT was pretty dramatic.

Many (most?) MPPTs do not require any communication to work properly. Mine does not communicate with the BMS. I do have multiple MPPT controllers for 2 different strings of panels, connected to a shunt to monitor charge current at the battery separate from loads.
 
The sun tapering off every day lowers the voltage, so unless you have reached absorption long before the sun goes down, the voltage will drop below the cell voltage and stop charging early. This doesn't harm the cell at all, but it doesn't result in a full charge.
This is not correct, according to trivial googling. For example: "current dramatically changes as irradiance varies, but voltage remains relatively constant."

Again, there is no need for the SCC to taper the amps. The Sun does this for free every day.

Maybe a more complete description would help. I set my BMS to define 100% as any one cell hitting 3.5V with the amps less than 12. It is set to stop at 90% SOC, which in my pack is before any cell is above 3.4V. Every week or so, it attempts to run up to 100% to clear the SOC error. To do that it turns on the PV, and always stops charging if any one cell is above 3.5V. If the amps were more than 12 at that moment, it leaves the PV off until all the cells are below 3.4V, then turns the PV back on. If this cycling is happening in the afternoon, the amps are generally lower for each cycle. If at the next shut-off moment the amps are less than 12, it declares the battery to be at 100%. Now that it has corrected the SOC drift, it resumes the normal mode where charging is shut off at 90% SOC and turned on at 85%.
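Spelled out as pseudocode (these are my thresholds, not a recommendation):

Code:
CELL_CUTOFF_V = 3.50   # stop charging when any cell reaches this
CELL_RESUME_V = 3.40   # turn PV back on once every cell is below this
FULL_TAIL_A   = 12.0   # current at the cutoff moment below which pack is 100%

def full_charge_cycle_step(cells_v, pack_amps, pv_on):
    # Returns (pv_on, reached_full).  Called repeatedly during the weekly
    # run-to-100% that clears the accumulated SOC error.
    if pv_on:
        if max(cells_v) >= CELL_CUTOFF_V:
            # the current at this instant decides whether we are full
            return False, pack_amps <= FULL_TAIL_A
        return True, False
    if max(cells_v) < CELL_RESUME_V:
        return True, False      # cells have relaxed, charge again
    return False, False

Once reached_full comes back True, the BMS resets SOC to 100% and goes back to the normal 85/90% hysteresis.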
MPPT controllers absolutely are more efficient than simpler types. The heat generated is minimal compared to the additional wattage produced by the solar panels by keeping them at the maximum power point.
In order to prove that an MPPT is "absolutely more efficient" you'd have to know the use case (see above in a reply to your similar statement), and you'd have to know the panel's MPP and temperature and the battery voltage.

I would like to measure this on my setup by putting an MPPT in to compare. But I cannot do that if I do not find an MPPT that will take a simple open/close signal to shut off. I could put it upstream of my SSR, but generally MPPTs do not like having PV power with no battery connected.
The BMS really doesn't know how to charge, what voltages to use, how long to absorb, or any of that. That is one advantage of a more intelligent charger.
I don't see the point of defining "BMS" and "charger" this way. As stated above (see "This is the situation everyone..."), this thinking comes from the current product offerings, which are generally all derived from lead acid. The combination of free amp tapering every day from the Sun, and the fact that the charger knows nothing about the individual cells, makes "intelligent" a very dubious claim. The only "intelligence" that I need is the circuit that hunts for the MPP.
If your BMS is starting balancing at below 3.4Vpc, then it is just as likely pulling the cells out of balance as it is balancing them. Balancing ONLY works at higher voltages when the cells are in the steep part of the voltage curve.
You are describing how most BMSs do the balancing. I described how mine does something different above. Search for "haircut" and reread it.
 
I use an MPPT charger that wholly relies on the BMS for its operation.

It is an SMA Sunnyboy.

You are correct in saying that with LiFePO4 the BMS should control the charger, and there are many advantages to this which have been discussed ad nauseam.

The cheap ebike BMSs that most people use, while great for the application of an ebike, are a compromise for a continuous-use system and work better with the charger operating inside the BMS parameters.

The best way to charge a LiFePO4 cell is to hold 0.05C current until the cell reaches 3.5V, then remove the charge source. Once again, this is impractical for a continuous-use system, so after this step the charger instead holds a constant voltage of 3.4V, allowing it to supply the load.
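For a sense of scale (purely illustrative cell size, not a recommendation):

Code:
capacity_ah = 280                        # example cell size
finish_current_a = 0.05 * capacity_ah    # 0.05C = 14 A for a 280 Ah cell
print(finish_current_a)   # hold ~14 A until 3.5 V, then remove the charge
                          # source (or drop to the 3.4 V constant-voltage hold)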

Back to your question: the AC-coupled inverter's MPPT does what you are asking - it will supply maximum power until commanded by the BMS to limit (or stop).
 