How do you calculate this? Just trying to learn something today...
Sure, here you go! Suppose your "100 Watt" panel is rated to deliver 5.29A at a V(mp) of 18.9 Volts. (Both are typical values for 100W panels, though a few shift the balance of Voltage versus Current a bit. Bigger panels generally run at higher V(mp), and the advantage for MPPT becomes greater.) With a PWM controller, you can't increase the current - the controller VERY rapidly disconnects and reconnects the panel, so that the average Voltage into your batteries (over a longer period of time, such as 1/50 of a second) is only your desired charging Voltage. Suppose that to be 14.4 Volts.
Under excellent sunlight, the current will go up upon each reconnection, but only by a tiny bit and only for a moment. To a first approximation, we can treat the current under STC as nearly constant. So under PWM we still have the 5.29A, but we're now delivering power at only 14.4V: 14.4V * 5.29A = 76 watts, rather than the original 100 watts.
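The PWM arithmetic above is just current times clamped Voltage; here's the same calculation as a tiny sketch (values are the ones from the example):

```python
# PWM power estimate: panel current is ~constant, but the Voltage
# seen by the battery is clamped to the charging Voltage.
I_MP = 5.29     # panel current at max power point, Amps (example value)
V_BATT = 14.4   # battery charging Voltage, Volts (example value)

pwm_watts = V_BATT * I_MP
print(f"PWM power into battery: {pwm_watts:.1f} W")  # ~76.2 W
```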
- - -
A Perfect MPPT would convert all of the "100 Watt" power into extra current: 100W divided by 14.4V would be 6.94 Amps. But in practice, MPPT controllers are typically only about 93-94% efficient. 76 watts into the batteries, instead of 94 watts into the batteries, is right around 80% of the MPPT power capability - roughly a 20% loss for PWM.
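To make the comparison concrete, here's a sketch putting the two figures side by side. The 94% efficiency is the top of the "93-94%" range mentioned above; the rest are the example values:

```python
# PWM vs. real-world MPPT on the same "100 Watt" panel.
PANEL_WATTS = 100.0
I_MP = 5.29        # panel max-power current, Amps
V_BATT = 14.4      # battery charging Voltage, Volts
MPPT_EFF = 0.94    # assumed controller efficiency (93-94% typical)

pwm_watts = V_BATT * I_MP               # Voltage clamped, current unchanged
mppt_watts = PANEL_WATTS * MPPT_EFF     # full panel power, minus converter loss
mppt_amps = mppt_watts / V_BATT         # extra current into the battery

print(f"PWM:  {pwm_watts:.1f} W")
print(f"MPPT: {mppt_watts:.1f} W ({mppt_amps:.2f} A)")
print(f"PWM delivers {pwm_watts / mppt_watts:.0%} of the MPPT figure")
```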
- - -
The MPPT advantage can be improved further by wiring panels in Series (at 2x, 3x, or even 4x the original Voltage). When converting down from a much larger Voltage differential, the MPPT loses a bit of efficiency internally, but the configuration as a whole gains even more power back - by keeping the PV current low, it greatly reduces wiring losses. PWM utterly can't do that, ever. Virtually all the "power generated at higher Voltage than the Batteries can take" must be left in the sky via PWM cycling, so with PWM you need to keep the panel Voltage as low and as near the battery charging Voltage as you can manage.
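The wiring-loss point follows from I^2*R: double the Voltage at the same power and the current halves, so resistive loss drops 4x. Here's a sketch with two of the example panels; the wire resistance is an assumed illustrative value, not from the post:

```python
# Wiring loss for two 100W panels: parallel (PWM-style, low Voltage,
# high current) vs. series (MPPT, 2x Voltage, 1x current).
R_WIRE = 0.10   # assumed round-trip wire resistance, Ohms (illustrative)
I_MP = 5.29     # per-panel max-power current, Amps

loss_parallel = (2 * I_MP) ** 2 * R_WIRE   # panels in parallel: 2x current
loss_series = I_MP ** 2 * R_WIRE           # panels in series: 1x current

print(f"Parallel wiring loss: {loss_parallel:.1f} W")
print(f"Series wiring loss:   {loss_series:.1f} W")  # 4x lower
```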