Midnite Solar Classic BMS communications

Bluedog225 | Texas | Joined Nov 18, 2019 | 2,917 messages
I almost put this in the beginner’s corner. But it’s a little not beginnery.

I’m looking at the spec sheet for the Midnite Solar Classic. Trying it on for size.

My understanding of the need for communication between the battery management system and the MPPT controller is pretty basic. But I’ve reluctantly come to the conclusion that there are some charging benefits/nuances that are better handled by communications rather than voltage.

Looking at a Tier 1 product like the Classic, how can I tell if it is compatible with, for example, the Overkill BMS? I don’t see anything on the Midnite site or the Overkill site that says they both speak the same lingo (whatever that is).

BMS/MPPT communication eludes me. Is this something I have to run through a computer?

Should I be looking at other brands that handle this better?

Any thoughts appreciated. [attached image]
 
I think it would be helpful for you to download the Midnite manual. I think Steve on this forum would also know the answer if he sees this. I believe what you are asking about, if it is possible, is accomplished by using one of the two output relays and the logic settings that Midnite has.

FYI, I am also using Midnite Classics, and I am not planning to have them interact with the BMS. I am going to set the parameters on my charge controllers (and inverters) slightly inside the BMS limits, so that these power sources cut themselves off before the BMS has to.

This way you have two layers of redundancy (or partial redundancy: the Midnite Classics have a high-temp cutoff and a battery temp sensor, but no low-temp cutoff). Your BMS will need to handle low-temperature protection for you if you use a Midnite SCC.
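Just to illustrate the layering (made-up numbers for a 24 V / 8S LiFePO4 bank, not my actual settings):

```python
# Illustration only: the charge controller and inverter act first,
# and the BMS protection limits are the backstop behind them.
CELLS = 8
BMS_LIMITS = {"cell_over_volt": 3.65, "cell_under_volt": 2.50}   # hard disconnect points
CONTROLLER = {"absorb_volts": 27.6, "float_volts": 27.0}         # 3.45 / 3.375 V per cell
INVERTER = {"low_volt_cutoff": 24.0}                             # 3.00 V per cell

# Operational setpoints sit comfortably inside the BMS protection window,
# so the power sources shut themselves off before the BMS ever has to act.
assert CONTROLLER["absorb_volts"] / CELLS < BMS_LIMITS["cell_over_volt"]
assert INVERTER["low_volt_cutoff"] / CELLS > BMS_LIMITS["cell_under_volt"]
```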
 
I looked at the manual, of course. But I’ll look again, since you seem to be implying that it is covered in more detail than I realized. Hope Steve will check in.
 
I'm not aware of the Overkill BMS communicating directly with any SCC. While there are some situations where it would be useful (low temperatures, for one), in most cases it isn't really needed. Most mid-range and higher SCCs (and the Midnite Classic is one of the best) allow you to set a bulk/absorption voltage, a float voltage, a max absorption time, and maybe a tail current. That's really all you need to work with almost any battery chemistry, including LiFePO4.

Any good BMS will prevent the MPPT/SCC from over-charging any one cell, which is mostly all you should have to worry about. If you want to live on the edge, it would be good if the SCC could throttle back the current as temps drop close to freezing (0°C / 32°F), but it's also OK to just have the Overkill cut off charging altogether when it gets close.
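As a quick worked example of turning those per-cell targets into pack-level settings (illustrative numbers only, assuming a 16S / 48 V-class LiFePO4 pack and roughly a C/20 tail current):

```python
# Illustrative settings math for a hypothetical 16S LiFePO4 pack.
cells_in_series = 16
absorb_per_cell = 3.50        # within the 3.45-3.55 V per-cell range above
float_per_cell = 3.375
pack_capacity_ah = 280        # assumed pack size

absorb_volts = cells_in_series * absorb_per_cell    # 56.0 V
float_volts = cells_in_series * float_per_cell      # 54.0 V
tail_current_a = pack_capacity_ah / 20              # end absorption near C/20 = 14 A

print(f"Absorb {absorb_volts:.1f} V, float {float_volts:.1f} V, "
      f"end absorption below {tail_current_a:.0f} A")
```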
 
I wish I could find a post I liked earlier. It described a more nuanced interaction, with real-time communication that helped finish off the charge cycle correctly, in a way that was better for the battery.

I guess I really don’t understand all the interest in communication between the charge controller and the BMS.

If everything works, it works. Why complicate it?
 
Hopefully this won't offend anyone...

I think that many folks here on the DIY Solar Forum - including myself sometimes - seem to try to over-engineer things. Make it more complicated. It may be mostly to try and get from 98% perfect to 99% perfect. I mentioned this before, but at times it seems like we are all a bunch of amateur witches trying to make a potion more complicated than the other witches, just to make it more complicated and mysterious.

On the other hand, 99% perfect is better than 98% perfect. ;)
 
The thing is, there is a lot of ink being spilled on communication capability, but little I have seen on exactly what is being communicated. Or maybe I am still trying to find the when, where, what, and why that isn't the secret sauce of vertically integrated companies. Maybe we DIYers don’t care much, because voltage control and some basic controller parameters get us where we need to be.

Put another way, if it’s good enough for Midnite, it’s probably good enough for me. Though I would like to understand it better.
 
There is quite a bit of recognized value in having a "closed loop" operation between an inverter / charger and a battery. Maybe that is where the ink is being spilled. Not so much with the SCC as far as I know.
 
The key advantage is that the BMS's resistive balancing function can operate while the charge current is reduced, giving the cells time to balance.

Other advantages are the ability to prevent high/low-temperature charging, and ramping charge current up/down near the knee voltages.

Then there are the obvious ones: integrated backup (generator/supplementary system) and low-SoC warnings.

That's just off the top of my head. My systems all have integrated BMS/inverter/chargers - I wouldn’t have it any other way.
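Roughly what I mean by ramping near the knee, with made-up thresholds rather than any particular BMS's algorithm:

```python
def requested_charge_current(max_cell_volts: float,
                             bulk_amps: float = 60.0,
                             taper_start: float = 3.40,
                             taper_end: float = 3.55,
                             balance_amps: float = 5.0) -> float:
    """Hypothetical taper: full current below the upper knee, ramping down
    to a small current near full so the balancer has time to work."""
    if max_cell_volts <= taper_start:
        return bulk_amps
    if max_cell_volts >= taper_end:
        return balance_amps
    fraction = (taper_end - max_cell_volts) / (taper_end - taper_start)
    return balance_amps + fraction * (bulk_amps - balance_amps)

# e.g. requested_charge_current(3.50) -> about 23 A instead of the full 60 A
```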
 
The Midnite Classic does not interact with a BMS in any way. I suppose you could write a Node-RED application, connect to the Modbus interface over TCP/IP, read the data, and send profile changes (extremely hazardous, NOT for amateurs) to modify the functions. The BTS (battery temp sensor) can still read "ambient temps," which could also be read by a Node-RED app, which in turn could trigger relays or what have you.
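If someone did want to poke at the data read-only, a rough sketch with the pymodbus library might look like the following. The IP address, register number, and scaling here are placeholders, not verified values; check them against the Classic's published Modbus register map.

```python
# Read-only sketch; placeholders must be confirmed against Midnite's
# published Classic Modbus register map before trusting any numbers.
from pymodbus.client import ModbusTcpClient

CLASSIC_IP = "192.168.1.10"     # hypothetical address of the Classic on the LAN
REG_BATTERY_VOLTS = 4114        # placeholder register, not verified

client = ModbusTcpClient(CLASSIC_IP, port=502)
if client.connect():
    rr = client.read_holding_registers(REG_BATTERY_VOLTS, count=1)
    if not rr.isError():
        battery_volts = rr.registers[0] / 10.0   # placeholder scaling
        print(f"Battery: {battery_volts:.1f} V")
    client.close()
```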

The usual config for Classics with LFP is to set the charging params and, of course, use FLOAT, which "will" saturate the cells and allow for cell balancing by the BMS if so equipped. See additional info here: https://diysolarforum.com/resources/midnite-solar-lithium-battery-configuration-info.116/

hope that answers some questions
Steve
 
You can get the Midnite Classic to be controlled by the REC-BMS if you use it DC-coupled to an SMA Sunny Island inverter and include the Midnite / Sunny Island communication module.
 
I dunno anything about the combox... There's more info and discussion on all of this at http://midniteftp.com/forum/index.php and such questions are likely better asked there. They also watch closely and pick up on things that are needed/wanted. I love the folks at Midnite, they really are great, but they are not the speediest by any means when it comes to putting out product or even updates.
 
I don’t understand how the need to interact with an inverter/charger would be different from the need to interact with a solar charge controller.
 
Well, remember I said that many folks here tend to over-engineer things. However, with that as a basis: Generally, if the BMS cuts off the connection to (or more specifically the discharge to) the inverter, bad things can happen. It would be far better if the Inverter were told by the BMS to do the equivalent of a low voltage cut-off, BEFORE the BMS has to cut off all voltage. Some SCCs have explicit warnings that their circuitry will be destroyed if the battery is disconnected while a PV input voltage is present.

Both the SCC and the inverter/charger (emphasis on CHARGER) can also get some info from the BMS to cut back on charging amperage when the cells are getting up into the upper knee of the charge curve. I personally think this is one of the "amateur witch" things, as the cells will normally take care of this themselves if you have the bulk / absorption voltage set right (3.45-3.55 V per cell). The current accepted by the cells will drop precipitously as the cells reach full charge.

@toms seems to be saying that the charge current can / should be adjusted in the lower knee too, but I don't get that. He will have to explain that aspect of his "potion". ;)
 
I forgot to mention that @toms' point about slowing the charge (reducing the current) near the end, to give the BMS an opportunity to balance the cells near the full-charge point, is a good one. Much better reason than anything else I've heard for why an SCC and BMS would communicate. The only issue I would have with it: how are existing loads that are taking advantage of the PV harvest affected? In a system with no communication between the charge controller and the battery, the SCC simply provides as much as it can, limited by what current everything will take, including both the battery and whatever loads are there at the time. If the BMS says it wants less current (even though the attached battery will happily take more), does the SCC simply cut the current back to that point? What happens to the loads?
 
I have 3 x Midnite Classic 150s. Their main purpose is to convert PV into battery charging. I use a Batrium BMS, and its main purpose is to closely monitor the battery packs and overall operating characteristics and to disconnect the battery from the system if something is out of whack.

Fundamentally, these are two separate concepts/operations that are independent of each other. While I could see the value in an integrated system for overall monitoring and even some one-touch operational control settings, this would be a secondary thing.

Secondary control....
I use one of the Midnites' AUX1 relays to turn the inverters on/off by voltage, which controls the cut-off voltage. You could say I control the overall voltage range (SoC and DoD) through the Midnites, since they handle both the max charging and min discharge voltage levels. But any user-settable charge controller will let you set the max charge, and any voltage-aware relay (or even the inverters themselves) can turn the inverters on/off based on a minimum voltage.
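The logic behind that relay setup is just a voltage window with hysteresis, something like this (made-up 48 V-class thresholds, not my actual settings):

```python
# Hysteresis keeps the relay from chattering around a single threshold.
INVERTER_OFF_BELOW = 48.0   # cut the inverters off well above the BMS low-voltage limit
INVERTER_ON_ABOVE = 52.0    # don't re-enable until the bank has recovered

def update_inverter_state(battery_volts: float, inverter_on: bool) -> bool:
    """Return the new inverter enable state given the current pack voltage."""
    if inverter_on and battery_volts < INVERTER_OFF_BELOW:
        return False
    if not inverter_on and battery_volts > INVERTER_ON_ABOVE:
        return True
    return inverter_on
```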

I use the BMS (Batrium) to monitor and record battery info and occasionally do touch-up balances. This gives me trends and operational options over time. Batrium also allows all kinds of nuanced settings such as balancing, min/max voltages, min/max temps, min/max "overall" amps, etc. - but this is all secondary.
 
Offgrid, I’m curious what happens on a cold, sunny day when the BMS shuts down the batteries and the controller is left in an extended high-voltage condition.

Are Midnites uniquely able to handle this? It seems to violate the rule of “hook up the batteries first.”

Or can the Midnite tell itself to shut down the input from the panels before the battery BMS calls it quits?
 