Originally Posted by RobF
I'm wondering if, rather than thinking 40 fps should make a difference, we're expecting it to make a difference at the muzzle, whereas we're actually seeing the difference at the target, not the muzzle. Although the spread between the best and worst shots amounts to 40 fps over 50m, that averages out to 8 fps every 10m, and most likely more of it happens in the last 20m than in the first 30. That said, I generally see group differences between batches indoors at 25m. So perhaps we need to bring the chrono back down the range to other distances and see where the spread in speeds starts to increase to a noticeable extent.
Here's another question for you... if we agree that the pellet is arriving at the target 40 fps slower than the fastest one, how much difference in time are we talking, and how much do you have to bump the BC up so it hits in the same place? I'm not good enough with the maths.
I'm not great with maths either, but I can write a little app to run through every possible variation and come up with an answer (or a number of answers). The two values we don't know are the MVs of the two shots (the BC can be calculated because we already know the downrange velocity). To save myself needing 4 squillion squankaflops of processing power, I'll assume the MV is no lower than 700 fps and no higher than 800 fps.
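For what it's worth, here's a rough sketch of what that little app might look like. All the numbers and names here are made up for illustration: it assumes a simple exponential velocity-decay model v(x) = MV * exp(-c*x) (a stand-in, not a real drag table like Chairgun uses), takes 600 fps and 560 fps as the fast/slow arrival speeds at 50m, and sweeps candidate MVs over the 700-800 fps window mentioned above.

```python
import math

RANGE_FT = 50 * 3.28084  # 50 m converted to feet, since we work in fps


def downrange_velocity(mv_fps, c):
    """Velocity at the target under the exponential-decay model."""
    return mv_fps * math.exp(-c * RANGE_FT)


def solve_c(mv_fps, v_target_fps):
    """Back out the decay constant from an assumed MV and the measured target speed."""
    return math.log(mv_fps / v_target_fps) / RANGE_FT


def time_of_flight(mv_fps, c):
    """Closed-form flight time for v(x) = mv * exp(-c*x): t = (e^(cR) - 1) / (c * mv)."""
    return (math.exp(c * RANGE_FT) - 1.0) / (c * mv_fps)


if __name__ == "__main__":
    # Hypothetical arrival speeds at the target: 40 fps spread, as discussed.
    V_FAST, V_SLOW = 600.0, 560.0

    for mv in range(700, 801, 25):  # candidate muzzle velocities, 700-800 fps
        c_fast = solve_c(mv, V_FAST)
        c_slow = solve_c(mv, V_SLOW)
        dt_ms = (time_of_flight(mv, c_slow) - time_of_flight(mv, c_fast)) * 1000.0
        print(f"MV {mv} fps: slow pellet arrives {dt_ms:.2f} ms later")
```

Obviously the real spread in arrival time (and hence drop) depends on the actual drag curve, but even this crude model gives a feel for the order of magnitude, and the loop structure is the brute-force sweep described above.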
I'll give it a go at lunchtime - actual work is getting in the way today.