Finding, Evaluating and Applying Baseball Research (Part 4): Guest Blog By Dr. Ed Fehringer


Wrapping up our 4-part series on finding, evaluating and applying baseball research, we’ll now share some final thoughts from Dr. Ed Fehringer.  Dr. Fehringer is a highly regarded orthopedic surgeon from Omaha, Nebraska.  He has been involved in a great deal of clinical research over his career.  When he graciously agreed to write this blog, we brainstormed and came up with 11 questions we thought players and coaches might want answered.

1. Why do we need good research in baseball?
2. Why is it important for a coach/instructor to be able to review, understand and critique research?
3. How do I find relevant research (online search hacks)?
4. What are the parts/design of a typical study?
5. How do I evaluate the quality of a study?
6. What are the dangers of only reading the abstract?
7. What are the typical statistical tools used and what do they mean?
8. How do I evaluate the authors’ methods, discussion and conclusions?
9. What are the most common pitfalls in reading research?
10. How do I begin to apply the results of research to help my players/team?
11. What are the possible consequences of not learning to search out and evaluate quality research?

In Part 1, Dr. Fehringer answered questions 1, 2 and 3. He covered questions 4-6 in Part 2, and our FBR R&D Coordinator, Jordan Rassmann, handled question 7 in Part 3.

Today we’ll enjoy Dr. Fehringer’s final comments as he answers questions 8-11.

  8. How do I evaluate the authors’ methods, discussion and conclusions?

As important as anything is maintaining your common sense.  Overall, I am looking for clarity, and this starts in the Introduction. Does the author present a clear and compelling case for the question posed? Is the history presented excessive, or is it sufficiently succinct while building a case for a question they’d like to answer? Is there a hypothesis and/or purpose statement in the Introduction? As coaches, you know whether the question posed is relevant to you, baseball, your team, or potentially some other audience. Do the authors appear to be trying to do too much? Remember, baby steps. Research that attempts to “do it all” is doomed to fail. Keeping it simple by trying to answer one question is generally best. If another question or two is answered or partially answered, that’s a bonus. Keep in mind that sometimes there simply are not enough subjects to truly be able to say anything other than that they performed some research. This is due to something called ‘statistical power,’ which was covered in the statistics discussion in Part 3.
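Statistical power can be made concrete with a quick simulation. The sketch below is hypothetical and not from any study discussed here: it assumes a made-up true training effect of 2 mph in throwing velocity against 5 mph of athlete-to-athlete noise, and estimates how often a small study versus a large study would actually detect that real effect.

```python
import math
import random
import statistics

random.seed(42)

def power_estimate(n, true_diff, sd=5.0, trials=2000):
    """Monte Carlo estimate of power: the fraction of simulated studies
    (n subjects per group) whose z-test detects a real difference in means."""
    crit = 1.96  # two-sided critical value for alpha = 0.05
    hits = 0
    for _ in range(trials):
        a = [random.gauss(0.0, sd) for _ in range(n)]        # control group
        b = [random.gauss(true_diff, sd) for _ in range(n)]  # training group
        se = math.sqrt(statistics.variance(a) / n + statistics.variance(b) / n)
        z = (statistics.mean(b) - statistics.mean(a)) / se
        if abs(z) > crit:
            hits += 1
    return hits / trials

# A real 2 mph gain (against 5 mph of noise) is usually missed with
# 10 subjects per group, but reliably found with 100 per group.
print(power_estimate(n=10, true_diff=2.0))
print(power_estimate(n=100, true_diff=2.0))
```

The takeaway: "no significant difference" in a tiny study often means only that the study was underpowered, not that the effect is absent.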

For the Methods section, the authors describe what was actually done in performing the research. As coaches, you can be especially critical of this area because you live baseball on a daily basis. Sometimes research heads down a path that seems impractical or doesn’t all fit together. If the drills described seem impractical or irrelevant, or if the angles measured seem impossible to capture without incredibly expensive equipment, take note. Do the authors appear to be keeping things simple? Simple is almost always better because there are an infinite number of variables in human beings alone. Keeping variables reduced to as few as possible is important.


Results can get a little confusing if one gets tripped up by the statistics. (This is where Jordan can also help.) The most meaningful results should be presented first and the less meaningful after that. Getting caught up in statistical values can be dangerous if the study does not make sense on the surface. We often speak of studies being “statistically significant but clinically insignificant.” So, while a p-value may be less than 0.05, that does not necessarily mean the finding is clinically significant. It all has to fit together, to make sense.
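The gap between "statistically significant" and "clinically significant" is easy to demonstrate. The numbers below are entirely made up for illustration: with thousands of subjects per group, a half-mph average velocity difference produces a p-value far below 0.05, even though half a mph may be too small to matter on the field.

```python
import math
import random
import statistics

random.seed(7)

def two_sample_p(a, b):
    """Two-sided p-value from a normal-approximation z-test on the
    difference of two sample means."""
    se = math.sqrt(statistics.variance(a) / len(a) + statistics.variance(b) / len(b))
    z = (statistics.mean(b) - statistics.mean(a)) / se
    return math.erfc(abs(z) / math.sqrt(2.0))  # two-sided tail probability

# Hypothetical data: group B averages 0.5 mph harder (90.5 vs 90.0), sd 5 mph.
group_a = [random.gauss(90.0, 5.0) for _ in range(5000)]
group_b = [random.gauss(90.5, 5.0) for _ in range(5000)]

p = two_sample_p(group_a, group_b)
effect = statistics.mean(group_b) - statistics.mean(group_a)
print(f"p-value: {p:.6f}")           # "significant" -- comfortably below 0.05
print(f"effect:  {effect:.2f} mph")  # but only about half a mile per hour
```

A huge sample makes even a trivial effect "significant," which is exactly why the effect size, not just the p-value, has to pass the common-sense test.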

In the Discussion, the authors talk about their results/findings, compare them to prior authors’ study findings, and explain why their own finding(s) are significant. They will discuss their study’s limitations, which is very important; the more honest and accurate they are, the more I trust them. Every study has limitations. Most will also attempt to downplay their limitations and/or discuss what they did to combat them.

In the Conclusion, the authors draw conclusions about their research. It’s important to be critical here. Many times, authors will make conclusions that are unsupported by their data. Most reviewers and editors of journals will not allow this to happen. Moreover, some will make conclusions about another’s research and broadcast them despite those conclusions being inaccurate and/or unsupported. Again, there are often so darned many variables. What can we take away from the research? What is that nugget, that kernel of truth?

  9. What are the most common pitfalls in reading research?

Probably one of the greatest pitfalls in reading research is making inferential leaps about a subject that simply are not supported by the data. Keep in mind there are biases in performing research and biases in interpreting it. Another pitfall is failing to follow a disciplined analysis of each of the respective parts of the study. Yet another is judging that one is not qualified to analyze the research. I would beg to differ. Jump right in. Read, read, and read some more. Education is critical. Bounce ideas off knowledgeable people you trust. Research is a gift to everyone, not just researchers.

  10. How do I begin to apply the results of research to help my players/team?

In my opinion, getting feedback from some of your trusted advisors, coaches, and friends regarding research is important. I would not recommend wholesale changes based upon one, two, or three studies. Whole bodies of work are important. Careful tinkering and close observation are important. What works for one may not work for someone else or in another area of the country. And remember, experience and wisdom still remain terrific teachers. While they are not perfect either, the blend of all of the above is important.

  11. What are the possible consequences of not learning to search out and evaluate quality research?

Certainly, there may be no harm in avoiding the research arena. However, in today’s day and age, continuing education is critical, especially as information seems to be coming at us every second. I don’t believe baseball research is going away anytime soon. With injury rates as they are, everyone has or wants a solution. The fact is the solution will be multi-faceted. There will never be a silver bullet. Again… variables. But if one does not continue learning how to evaluate new research, it can become more and more difficult to discriminate between good and poor information. Jumping in, reading what’s available, getting feedback from trusted sources, and becoming part of the process is what will make our players, and us, better and safer.

Thank you for inviting me to participate in this project.  And thank you, and our friends at The Texas Baseball Ranch®, for what you have taught and are teaching ALL OF US!  While it never feels like it’s happening quickly enough, it is happening. It’ll take 5-10 more years for the medical community to get on board. Private industry will force it. It’s frustrating and yet incredibly exciting. I am thrilled to be associated with the drivers of change.

Thank you, Ed.  You have made us all a little better this week. I look forward to working with you in the future.


Randy Sullivan, MPT, CSCS
