I've loaded 50 kWh at 95% efficiency or 47.56 kWh.
How did you measure this?
95% efficiency is too high - it's simply not achievable. Tesla documents that charging efficiency on the Model 3 is about 88-89% (in its EPA certification filings, searchable via the EPA's Document Index System). Anyway, I suspect this is the root of your discrepancy.
The right way to determine how much energy you added to your pack is to take the delta of the SMT (Scan My Tesla) numbers for your pack energy content before and after the charge. Since you have SMT, this makes it easy!
For your vehicle, for a 10%-to-100% charge, the car will display (on the in-car screen) that you have added ~55kWh/263mi * (0.9 * 52.9/55 * 263mi) = 47.6kWh. (I'm assuming 55kWh for the degradation threshold; I think it may actually be slightly lower - someone should check the data, as I believe it has been posted here before. Obviously a lot of factors cancel here - it simplifies to 0.9 * 52.9kWh - but I write it this way to show the separate factors.)
This is because the car does not display the actual energy added to the vehicle battery pack - it displays the number of rated miles added (in your case 90% of ~253mi (407km), which is 228mi (367km)), multiplied by the charging constant, which is (roughly) 55kWh/263mi = 209Wh/mi (130Wh/km).
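The arithmetic above can be sketched as follows (a worked example using the post's assumed numbers: 55kWh degradation threshold, 263 rated miles, 52.9kWh current full pack, 10%-to-100% charge):

```python
# Assumed numbers from the post, not authoritative values.
charging_constant = 55 / 263             # kWh per rated mile, ~0.209 (209 Wh/mi)
current_full_miles = 52.9 / 55 * 263     # ~253 rated miles at current capacity
miles_added = 0.9 * current_full_miles   # 10% -> 100% adds 90% of full range
displayed_kwh = charging_constant * miles_added

print(round(displayed_kwh, 1))  # 47.6 - the factors cancel to 0.9 * 52.9 kWh
```

Note how the rated-mile terms cancel algebraically, which is why the displayed figure reduces to 0.9 * 52.9kWh regardless of the charging constant.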
But on SMT you'll see the delta is 0.955 of that, i.e. 45.5kWh: you started at 2.4kWh + 5kWh (the displayed "10%" SOC, with 0% corresponding to 2.4kWh remaining) and ended at 52.9kWh. That's the actual energy added. It also gives roughly 90% charging efficiency if the EVSE (the charging station's screen) said it delivered 50kWh.
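A quick sketch of that SMT delta, again under the post's assumptions (2.4kWh buffer at 0% SOC, 5kWh above the buffer at the displayed 10%):

```python
# Assumed numbers from the post, not measured values.
start_energy = 2.4 + 5.0        # kWh in pack at displayed "10%" SOC
end_energy = 52.9               # kWh in pack at 100%
energy_added = end_energy - start_energy   # 45.5 kWh actually added

displayed_kwh = 0.9 * 52.9      # 47.6 kWh shown by the car
ratio = energy_added / displayed_kwh       # ~0.955

evse_reported = 50.0            # kWh the charging station said it delivered
efficiency = energy_added / evse_reported  # ~0.91, i.e. ~90% efficiency
print(round(ratio, 3), round(efficiency, 2))
```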
SMT includes the buffer in NFP (Nominal Full Pack).
It's a very common point of confusion, because people naturally assume the car is displaying the energy added to the pack when it shows that number after a charge event. Unfortunately it is not - it's displaying the result of a calculation based on BMS estimates, which is then scaled up by about 4.7% (since 1/0.955 = 1.047) for inexplicable reasons. It boils down to the discrepancy between the charging constant and the actual energy content of each displayed rated mile, which differ by ~4.5% due to the 4.5% buffer.
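For completeness, here's where those two percentages come from, using the post's numbers:

```python
# Assumed numbers from the post.
buffer = 2.4       # kWh below displayed 0% SOC
full_pack = 52.9   # kWh nominal full pack

print(round(buffer / full_pack, 3))  # 0.045 - the ~4.5% buffer fraction
print(round(1 / 0.955, 3))           # 1.047 - the ~4.7% scale-up factor
```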
This issue, and the similarly inexplicable position of the rated line (which is 5Wh/mi (3Wh/km) higher than the actual charging constant), cause no end of confusion here. So does the degradation threshold (though that's a transient issue), as does the behavior of the Energy Consumption screen, which suffers from related problems.