How much power do you really need and for how long?
If you really did need 4,000 watts, that 200 amp hour 12 volt battery would not even run it for 30 minutes. But if you have a device like a fridge or A/C with a compressor, and it needs a surge to start, that 4,000 watt inverter may be needed to do the job, even if your constant draw is only 300 watts. I have a cheapo 700 watt 12 volt modified sine inverter, and it will run my fridge, but barely. A 1,000 watt unit would probably do the job just fine. And a good sine wave unit would be better for the motor; it will make a lot less heat and be more efficient.
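A quick sanity check on that runtime claim, as a minimal sketch: 200 Ah at 12 V is 2,400 Wh nameplate, and the 85% inverter efficiency here is my assumed round number, not a spec. Real usable capacity will be lower still, since you can't run a battery to complete discharge.

```python
# Rough runtime for a 4,000 watt load on a 200 Ah, 12 volt battery.
# The 85% inverter efficiency is an assumed round number.
BATTERY_AH = 200
BATTERY_V = 12
LOAD_W = 4000
INVERTER_EFF = 0.85  # assumption; cheap inverters can be worse

energy_wh = BATTERY_AH * BATTERY_V                 # 2,400 Wh nameplate
runtime_min = energy_wh * INVERTER_EFF / LOAD_W * 60
print(f"Runtime at {LOAD_W} W: about {runtime_min:.0f} minutes")
```

That lands right around the half-hour mark before you even account for usable depth of discharge.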
Before you spend a lot of money on a good inverter, do a little math on the loads you want to run. Maybe invest in a Kill A Watt meter and see what they really draw. Anything with a motor, expect it to need about 5 times the power to get started. Low frequency inverters can usually handle double surge power (200%) for a few seconds. High frequency inverters are more like 150% surge, but only for a fraction of a second. My 700 watt inverter is just able to start my 230 watt fridge. Resistive loads like a hot plate or space heater are easy for an inverter; there is no surge. Large motors, especially starting a pump against pressure, are where you have problems. Some older microwave ovens also had some pretty harsh start-up loads, but newer ones seem a lot better.
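That sizing check can be sketched in a few lines. The 5x motor start rule and the 150%/200% surge factors are the rules of thumb from above, not guaranteed specs, and `can_start` is just a hypothetical helper name:

```python
# Rule-of-thumb check: can an inverter start a motor load?
# A motor needs roughly 5x its running watts to start (rule of thumb).
# Low frequency inverters surge to ~200% of rating, high frequency ~150%.
def can_start(motor_running_w, inverter_rated_w, surge_factor):
    start_w = motor_running_w * 5              # ~5x startup rule
    return inverter_rated_w * surge_factor >= start_w

# 230 W fridge on a 700 W high frequency inverter (150% surge):
print(can_start(230, 700, 1.5))   # 1,150 W needed vs 1,050 W available
# Same fridge on a 700 W low frequency inverter (200% surge):
print(can_start(230, 700, 2.0))   # 1,150 W needed vs 1,400 W available
```

By the rule of thumb the high frequency case comes out marginal, which matches a 700 watt inverter "barely" starting a 230 watt fridge.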
If you truly need 1,000 watts, and some of that is motors, I can see needing the 4,000 watt inverter to be reliable at getting everything started.

A BMS is just like anything else. They have continuous ratings and surge ratings, and not all of the specs will be published. My 200 amp JK BMS (also sold as Heltec) is rated to 350 amps surge current. In the setup app, I can set how much current it will take before it turns off long term, and how many seconds I will let it pull more current before shut off.

For 12 volt systems, the math is pretty easy. Figure 10 watts for every amp. That gives a decent derate for the efficiency and losses. 100 amps for 1,000 watts. 200 amp hours in a 12 volt system should run 1,000 watts for about 2 hours to complete discharge. A fridge should cycle and only pull full power about 1/3 of the time, less if you don't open the door much, more if it is a pig like my old one here. The Kill A Watt can also give you the real consumption over a period of time. I ran mine for 4 days to get a decent average, and realized how bad my fridge is. But a new $2,000 fridge still won't pay for itself for 5 years.
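The 12 volt "10 watts per amp" shortcut from above works out like this. The 1/3 duty cycle for a fridge is the estimate from the text, not a measured spec, and the function names are my own:

```python
# The 12 V rule of thumb: figure 10 watts for every amp, which bakes
# in a derate for inverter efficiency and wiring losses.
def amps_12v(load_w):
    return load_w / 10            # ~10 W per amp at 12 V, derated

def runtime_hours(battery_ah, load_w, duty_cycle=1.0):
    return battery_ah / (amps_12v(load_w) * duty_cycle)

print(amps_12v(1000))                 # 100 A for a 1,000 W load
print(runtime_hours(200, 1000))       # ~2 h at a constant 1,000 W
print(runtime_hours(200, 230, 1/3))   # 230 W fridge cycling 1/3 of the time
```

That last line is why a modest fridge can ride out a full day on a 200 Ah battery even though a constant 1,000 watt load kills it in 2 hours.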
Your constant run current should be under 50% of the rated current on a BMS if you want it to last, and your startup surge current should never exceed 200% of the constant rating under any conditions. Many do not list a surge rating, and those that do are rarely 200%. My JK being rated to 350 amps on a 200 amp unit is 175% surge capacity.

Most electronic components take time to heat up and fail, so they will usually take a short overload without permanent damage. It takes a second or so for the device to heat up. Even the fastest fuses have a time delay before they blow. The weak link in a BMS is the mosfets in the protection switch circuit. I just picked a random power FET from Infineon and opened the data sheet. The IRFR120Z is rated for 8.7 amps continuous, but 35 amps pulsed, so it can take about 4 times the current for a short period of time. This is pretty typical of quality mosfets; even cheap ones will do double. So if they actually use a bank of mosfets rated for 100 amps constant, it should survive a start surge of 200 amps.

If you do go with the 4,000 watt inverter, you should have at least 200 amps of BMS to be on the safe side, as the inverter could easily try to pull 400 amps when loaded up. 2,000 watts will pull 200 amps. My personal rule is that your constant current load should not exceed 100 amps. Yes, you can go higher, but the size of the wire, the fuses, the connectors, and the losses all get a lot worse as the current climbs past 100 amps.

At 1,000 watts, a 12 volt cable will have 4 times the loss of the same cable running the same 1,000 watts at 24 volts. At 12 volts, not only do you have 100 amps of current, but a 1 volt drop is 1/12th of the system voltage. At 24 volts, the current falls to 50 amps, so the same cable would only drop 0.5 volts, or just 1/48th of the system voltage. 100 amps at 1 volt is losing 100 watts of power; 0.5 volts at 50 amps is just 25 watts lost.
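The cable-loss comparison above can be sketched as a minimal calculation. The 0.01 ohm cable resistance is an assumed value I picked to reproduce the 1 volt drop in the example; real resistance depends on wire gauge and length:

```python
# Same 1,000 W load, same cable, at 12 V vs 24 V system voltage.
# R_CABLE is an assumed round number that gives a 1 V drop at 100 A.
def cable_loss(amps, r_cable=0.01):
    """Return (voltage drop, watts lost in the cable) at a given current."""
    drop_v = amps * r_cable
    return drop_v, amps * drop_v       # P = I * V = I^2 * R

for system_v, amps in ((12, 100), (24, 50)):
    drop_v, loss_w = cable_loss(amps)
    print(f"{system_v} V system: {amps} A, drop {drop_v:.1f} V "
          f"(1/{system_v / drop_v:.0f} of system voltage), {loss_w:.0f} W lost")
```

Halving the current quarters the cable loss, because loss goes with the square of the current.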
Hope this helps