This weekend at the World Pitching Congress, I heard people talk about creating more efficient arm action - basically, developing the ability to throw at the same or greater velocities with fewer newton-meters (Nm) of torque on the elbow. The assumption is that less torque on the arm at a given velocity makes injury less likely.
At first glance, this makes total sense. If the elbow experiences less torque per MPH, it should be better able to handle greater velocities. Extrapolated to the extremes, it checks out. If someone could throw a baseball at 0.01 Nm/MPH, it would be extremely unlikely they'd get hurt in any realistic velocity range. If another pitcher threw at 10 Nm/MPH, it would be unlikely they could throw hard enough to reach the plate without their UCL snapping in half.
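Just to put numbers on that extrapolation, here's a quick sketch. The two ratios and the 95 MPH are made-up values for the thought experiment, nothing measured:

```python
# Toy illustration: elbow torque implied by a given torque-per-velocity ratio.
# The ratios and the 95 MPH velocity are invented numbers for the thought experiment.
def implied_elbow_torque(nm_per_mph: float, velocity_mph: float) -> float:
    return nm_per_mph * velocity_mph

for ratio in (0.01, 10.0):
    print(f"{ratio} Nm/MPH at 95 MPH -> {implied_elbow_torque(ratio, 95):.1f} Nm")
# 0.01 Nm/MPH -> ~1 Nm   (essentially no elbow load at any realistic velocity)
# 10 Nm/MPH   -> 950 Nm  (far beyond what any UCL could survive)
```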
But the ranges we're usually working with fall in the gray area in the middle. And when we frame the problem this way, we're not incorporating the actual efficiencies of the system - we're not factoring in the attenuating effects of the muscles and fascia around the ligament.
Dr. James Buffi started work on an inverse dynamics project years ago (one he may or may not have completed with the Dodgers). The idea was to see how much the different muscles in and around the forearm contribute to supporting the elbow as it reaches peak torque - to learn how the tissues actually function as a unit to prevent ligament tears. From there, the plan would be to figure out how to train those tissues to better dissipate the forces around the elbow joint.
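To picture what that kind of analysis is after, here's a minimal sketch of the idea as I understand it: treat the ligament as carrying whatever valgus torque the forearm muscles don't absorb. The simple subtraction model and every number in it are my own placeholder assumptions, not Buffi's actual methods or data:

```python
# Hypothetical decomposition: the UCL carries whatever share of the net valgus
# torque the surrounding muscles don't absorb. All values are invented.

def ucl_torque(net_valgus_torque_nm: float, muscle_contributions_nm: dict[str, float]) -> float:
    """Estimate ligament load as net joint torque minus muscular support."""
    muscular_support = sum(muscle_contributions_nm.values())
    return max(net_valgus_torque_nm - muscular_support, 0.0)

contributions = {
    "flexor_carpi_ulnaris": 20.0,
    "flexor_digitorum_superficialis": 15.0,
    "pronator_teres": 10.0,
}
print(ucl_torque(80.0, contributions))  # 35.0 Nm left for the ligament itself
```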
So is it possible that a pattern that creates a lower torque/velocity ratio actually increases stress on the UCL itself? Isn't it possible that, with a change to the movement pattern, the contributions of the surrounding tissues drop faster than the net stress does - leaving the ligament to absorb a bigger share?
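Here's the scenario I'm imagining, again with invented numbers: a "more efficient" pattern that lowers net torque per MPH but also reduces muscular support, leaving the ligament itself worse off:

```python
# Two hypothetical 95 MPH deliveries. Pattern B has the better torque/velocity
# ratio, but if its mechanics also reduce how much the forearm muscles contribute,
# the ligament can still end up carrying more. Every number is made up.
velocity_mph = 95.0

patterns = {
    "A (original)":       {"nm_per_mph": 0.90, "muscle_support_nm": 55.0},
    "B (more efficient)": {"nm_per_mph": 0.75, "muscle_support_nm": 30.0},
}

for name, p in patterns.items():
    net_torque = p["nm_per_mph"] * velocity_mph
    ucl_load = max(net_torque - p["muscle_support_nm"], 0.0)
    print(f"{name}: net {net_torque:.1f} Nm, UCL ~{ucl_load:.1f} Nm")
# A: ~85 Nm net, ~30 Nm on the UCL
# B: ~71 Nm net, ~41 Nm on the UCL  <- better ratio, worse ligament load
```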
I don't have an answer, but it's something I've been thinking about.
I'd love to hear your thoughts.