u/Frank23682


So generally it's accepted that brass is better than steel-cased ammo when it comes to barrel wear. The reasoning is that brass cases have high thermal conductivity and greater mass (steel has a much greater strength-to-weight ratio, so the case can be made thinner and lighter), which creates a heat-sink effect, pulling heat away from the barrel during the firing sequence. Testing seems to show this is generally true.

However, when it comes to polymer-cased ammunition, it apparently also reduces barrel wear, and for what sounds like the opposite reason: polymer is an even worse heat conductor than steel (SIGNIFICANTLY worse, it's essentially a heat insulator). Because the case insulates, almost no heat is transferred to the chamber, more of the chemical energy is turned into kinetic energy, and the heat leaves through the muzzle or the breech end of the barrel along with the gas after the bullet exits or the case ejects. The result is supposedly a cooler barrel and better thermal efficiency (greater velocity for the same powder charge). Apparently some tests have shown THIS to be true as well.

To me this seems contradictory and I'm struggling to reconcile it. Some ideas:

  1. Brass is actually the worst, and the more insulation, the better. Steel sees more wear because of its association with Eastern Bloc/Chinese designs, and the wear really comes from other factors such as steel or bimetal bullets, more aggressive or corrosive propellants, etc.
  2. Steel case just happens to sit at the worst point in between: it doesn't really get the efficient insulation effect of polymer, but it also doesn't get the heat-sink effect of brass.
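A rough way to put numbers on idea #2: for a brief heat pulse like firing, the quantity that governs how fast a surface soaks up heat is thermal effusivity, sqrt(k·rho·cp). Here's a minimal sketch using approximate handbook values for cartridge brass, mild steel, and a nylon-like polymer; the exact alloys and polymer blends used in real cases will differ, so treat this as an illustration, not measured data.

```python
import math

# Approximate handbook values (assumptions, not measurements):
# thermal conductivity k [W/(m*K)], density rho [kg/m^3],
# specific heat cp [J/(kg*K)].
materials = {
    "brass":   {"k": 110.0, "rho": 8500.0, "cp": 380.0},
    "steel":   {"k": 50.0,  "rho": 7850.0, "cp": 490.0},
    "polymer": {"k": 0.25,  "rho": 1140.0, "cp": 1700.0},
}

# Thermal effusivity sqrt(k*rho*cp) governs how quickly a surface
# absorbs heat during a short contact event like the firing pulse:
# high effusivity = good transient heat sink, low = insulator.
for name, p in materials.items():
    e = math.sqrt(p["k"] * p["rho"] * p["cp"])
    print(f"{name:8s} effusivity ~{e:8.0f} W*s^0.5/(m^2*K)")
```

With these numbers brass comes out well above steel, and polymer comes out orders of magnitude below both, which at least is consistent with steel being stuck in the middle: too conductive to insulate, not conductive enough to sink heat like brass.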

Personally, as a reloader, I think brass is still the best and results in the least barrel wear. My reasoning is that polymer may keep the chamber cool, but chamber heating does NOT really contribute to barrel wear. The biggest factor in wearing out a barrel is throat erosion, in the area just ahead of the chamber/case mouth where the rifling begins. We can observe this with borescopes: in barrels that have been shot a lot, the area that sees the most wear is the throat, where the bore begins to fire-crack. This is generally only seen in the first few inches of the bore from the chamber and essentially not at all further down. My thinking is that the heat-sink effect of brass reduces the peak heat the throat experiences by spreading heat more evenly between the chamber, the case, and the throat, whereas with polymer more heat would be transferred directly to the throat.

Does anyone with a physics or firearm engineering background have any better insight into this?

u/Frank23682 — 14 days ago

First of all, let me preface by saying that I understand that most ladder testing involving small sample sizes is pretty much bogus. Shooting a small group at each powder charge and using it to find nodes is hogwash, because unless you have a SIGNIFICANT outlier (like one group is 4 MOA while the others are 0.5), your confidence intervals pretty much all overlap and you are essentially reading noise.
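It's easy to demonstrate how much noise there is in small groups with a quick simulation. Here's a sketch (parameters are mine, chosen for illustration): every "charge" in this fake ladder shoots from the exact same distribution, yet the 5-shot mean radii still spread out enough to look like nodes.

```python
import math
import random

random.seed(1)

def mean_radius(shots):
    # average distance of each shot from the group's own centroid
    cx = sum(x for x, _ in shots) / len(shots)
    cy = sum(y for _, y in shots) / len(shots)
    return sum(math.hypot(x - cx, y - cy) for x, y in shots) / len(shots)

def random_group(n, sigma):
    # n shots from one rifle/load: bivariate normal, sd = sigma (in moa)
    return [(random.gauss(0, sigma), random.gauss(0, sigma)) for _ in range(n)]

# Simulate a 10-step "ladder" where every charge truly shoots the same:
# the spread in 5-shot mean radii below is pure noise.
radii = [mean_radius(random_group(5, 0.5)) for _ in range(10)]
print("5-shot mean radii (moa):", [round(r, 2) for r in radii])
print("best-to-worst ratio:", round(max(radii) / min(radii), 2))
```

The "best" and "worst" charges here differ only by luck, which is exactly the trap of picking a node from one small group per charge.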

I haven't been handloading for long, but I've never done any ladder testing, and my testing really only involves finding my desired velocity and pressure. Accuracy-wise, I only try to find differences between a few powder and bullet combinations, as suggested by u/Trollygag and others on here.

With that said, this is completely different from suggesting that an optimal powder charge that maximizes accuracy doesn't exist. Intuitively I am convinced that one almost certainly does. I have no evidence to back this up, but my intuition is that if we accept that using different powders, which create different pressure curves, changes accuracy potential, then it follows that changing the powder charge, which also changes the pressure curve, should change accuracy potential too. I, and many others, have observed certain powders shooting very poorly with a given bullet compared to another powder, in a way that IS statistically significant, and that is the basis of my thinking here.

My interpretation of the claim people make about an optimal powder charge not existing is that it doesn't MEANINGFULLY exist: the effect of changing powder charge is small enough that simply shooting the number of rounds required to separate the confidence intervals would change the properties of the barrel itself through wear, which would invalidate your findings anyway.

Ultimately I obviously don't know what the truth is, which is why I'm asking here. I do want to get into optimizing the accuracy of my rifle, and perhaps ladder testing could be part of that journey.

Traditional ladder testing is out of the picture because it's clearly statistically useless. But I found that Molon, a guy who does AR-15 accuracy testing and posts his data online, uses a technique that narrows the confidence interval through several changes. The gist of it: load at a fixed powder-charge interval, then shoot 8 or more 5-shot groups. Then overlay every three consecutive groups into 15-shot composite groups and compare their mean radius. For example, composite 1 would be groups 1, 2, and 3 overlaid; composite 2 would be groups 2, 3, and 4; composite 3 would be groups 3, 4, and 5; and so on and so forth. Then, whichever composite has the lowest mean radius, you pick the powder charge of that composite's middle group.
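My reading of the overlay step can be sketched in a few lines of Python. This is my own interpretation, not Molon's actual code, and in particular re-centering each group on its own centroid before overlaying (so a point-of-impact shift between charges doesn't masquerade as dispersion) is an assumption on my part:

```python
import math

def mean_radius(shots):
    # average distance of each shot from the composite centroid
    cx = sum(x for x, _ in shots) / len(shots)
    cy = sum(y for _, y in shots) / len(shots)
    return sum(math.hypot(x - cx, y - cy) for x, y in shots) / len(shots)

def pick_charge(charges, groups):
    """Overlay each run of 3 consecutive 5-shot groups into a 15-shot
    composite, score each composite by mean radius, and return the
    charge of the best composite's middle group. `charges` and `groups`
    are parallel lists in ladder order."""
    best_charge, best_mr = None, float("inf")
    for i in range(len(groups) - 2):
        composite = []
        for g in groups[i:i + 3]:
            # re-center each group on its own centroid (assumption)
            cx = sum(x for x, _ in g) / len(g)
            cy = sum(y for _, y in g) / len(g)
            composite += [(x - cx, y - cy) for x, y in g]
        mr = mean_radius(composite)
        if mr < best_mr:
            best_mr, best_charge = mr, charges[i + 1]  # middle group's charge
    return best_charge, best_mr

# toy usage: five charges, with the last two groups deliberately wider
def star(r):  # symmetric 5-shot group with mean radius 0.8*r
    return [(r, 0), (-r, 0), (0, r), (0, -r), (0, 0)]

charges = [24.0, 24.3, 24.6, 24.9, 25.2]
groups = [star(0.1), star(0.1), star(0.1), star(1.0), star(1.0)]
print(pick_charge(charges, groups))  # picks 24.3, the tightest composite's middle charge
```

Whether the 15-shot composites actually separate the confidence intervals is of course the open question; the overlap trick buys sample size, but adjacent composites share 10 of their 15 shots, so they aren't independent measurements.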

Do you guys think this technique narrows the interval enough to actually produce meaningful data? Or is it still essentially noise and basically a waste of time? Should I spend the time, money, and barrel life to try it?

u/Frank23682 — 17 days ago