
Is Your Fancy Tech Even Worth It?

Sam Contorno
Nov 24, 2022

We are currently in the Technological Revolution of Strength & Conditioning. Technology has become more affordable and more accessible than ever, and while this progress is important for our industry, it can also cause more harm than good if the data is not collected or utilized properly. The biggest issue that I see when I talk to Strength & Conditioning professionals about Sports Science is a general inability to operate the technology, run a standardized test, or clean, process, and store the data. There is no single reason for these issues: lack of proper onboarding by some tech companies, lack of data science education in Strength & Conditioning curriculums, and lack of time or even care to learn how to correctly process data are all compounding factors.


Sports Science has become synonymous with technology. We have replaced critical thinking with mindless collecting and clicking. We have become more concerned about what we have than what we are actually doing with it. All of the new bells and whistles have led to more confusion about their actual implementation, both because of the sheer time and expertise required to operate a system effectively, and because researchers often lack knowledge of application outside of a biomechanics lab.

I am the first to admit that I get caught up in the dreaded world of fancy Force Plate metrics, and am easily swayed by the lure of a 1080 Sprint Machine. I am also the first to admit that just because we have the technology, it doesn’t mean there isn’t a better and more effective way to test your athletes without it. A High-Performance Department is built on the Accuracy, Efficiency, and Actionable Insights of the data collected, rather than the amount of money you can spend on shiny new toys. I will never implement a new test or piece of technology until I’ve gone through an extensive vetting process to make sure that it will be successful in our system. It doesn’t matter if that system has six 1080s and four Force Plates, or a stopwatch and a clipboard; the principles remain the same.

Accurate: Is the Data Accurate?

Problems Before Solutions

This seems like a fairly obvious question, but the biggest mistake teams make when buying a piece of technology is that they have not yet figured out the problem they are trying to solve. Your problem should dictate your technology, not the other way around. A Strength & Conditioning department decides that they want to know how hard practice was for each athlete, so they buy Catapult. The young, hungry fourth assistant is charged with managing 45 units, and reports are quickly and casually thrown together to get data out as soon as possible. Football coaches are asking for speeds. Velocity spikes are being ignored. Units are breaking. Cords are breaking. The next thing you know, you’re left with a year of bad data, bad hardware, and a $300,000 bill that could have been avoided if you had simply asked your players for their RPE post-practice.
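That post-practice RPE alternative costs almost nothing to implement. As a minimal sketch, the well-known session-RPE method multiplies an athlete's post-session RPE (on the 0–10 CR-10 scale) by session duration in minutes to produce a single load number; the athlete names and values below are hypothetical, not data from any real team.

```python
def session_load(rpe: int, duration_min: float) -> float:
    """Session load via the session-RPE method: RPE (CR-10 scale) x minutes."""
    if not 0 <= rpe <= 10:
        raise ValueError("RPE must be on the 0-10 CR-10 scale")
    return rpe * duration_min

# Hypothetical post-practice entries: (athlete, RPE, practice minutes)
entries = [("A. Smith", 7, 90), ("B. Jones", 4, 90), ("C. Lee", 9, 90)]

# One load number per athlete, computed from a clipboard's worth of input
loads = {name: session_load(rpe, mins) for name, rpe, mins in entries}
```

A stopwatch and a question get you a defensible internal-load number per athlete per session, with no units to charge, no cords to break, and no $300,000 bill.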

Now don’t get me wrong, Catapult is an integral part of our department but the problem to solve is: What is the external load demand (yardage) on our football players and how can we use this information to periodize volumes and intensities for the off-season, and monitor external loads in-season? Two very different questions that lead to very different interpretations. You don't buy technology simply because everyone else has it.



A major part of implementing new technology is education. The Sports Scientist is tasked with vetting new technology to figure out if it’s measuring what it actually says it’s measuring, and if it’s comparable to the gold standard. Again, these seem obvious, but the missing piece of education is how the High-Performance Department and the players are educated about the purpose of the data being collected. Every single report I send out has a cheat sheet of metrics and definitions, and we do not implement a piece of technology without presenting to the players and the staff on what the technology is used for and how the data is going to inform decision-making and interventions. Building buy-in is easy: get the players to understand the how and why of what you are testing, and they will start communicating about the data, and the rest of the organization has no choice but to buy in.

Efficient: Is the Data Worth Our Time?


Sports Science is 95% logistics. An assessment means nothing if you can’t test your population and communicate the results in a timely manner. This requires an understanding of the flow of your team lifts, how the technology needs to be set up or distributed, what the athlete has done prior to performing the test, how many athletes need to be tested, how many pieces of equipment you will need to test effectively, and, frankly, whether the test is worth taking attention off of your other athletes' training. We only have so much time as Strength & Conditioning Coaches to train our athletes, so the testing protocols either have to fit seamlessly into training sessions, or the staff has to be willing to use training hours for assessments.


Once you have determined whether the test can be completed consistently and in the allotted time, the next question is how you are going to process the data. Who is downloading the information? Where are they storing it? What metrics need to be analyzed? What visualization software is being used to create reports? And who is going to fix the technology when it breaks? Nothing will irritate athletes and strength coaches more than technology that isn’t working, and the buy-in and trust initially built with the equipment begins to fade. We know that soft skills are important in the coaching world, but efficient data collection is where hard skills matter most. If the data takes days or weeks to be processed, the results may no longer be applicable to the situation, and even the most intricate data may not be worth it. The more a Sports Scientist can reduce clicks in their workflows, the more time-efficient a program can be. This is where someone with a skill set in data science is critical to the success of a High-Performance Department.
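Reducing clicks usually means scripting the repetitive steps between export and report. As a hedged sketch of that idea (the CSV layout, column names, baseline values, and 10% threshold below are all assumptions for illustration, not any vendor's actual export format), a few lines of Python can ingest a jump-test export and flag athletes whose result has dropped meaningfully from their baseline:

```python
import csv
import io

# Hypothetical CSV export from a force-plate system; columns are assumptions
RAW = """athlete,jump_height_cm
A. Smith,41.2
B. Jones,31.0
C. Lee,28.9
"""

# Hypothetical per-athlete baselines (cm) and a 10% drop threshold
BASELINE = {"A. Smith": 42.0, "B. Jones": 36.0, "C. Lee": 30.0}
THRESHOLD = 0.10

def flag_drops(raw: str, baseline: dict, threshold: float) -> list:
    """Return (athlete, percent_drop) for anyone below baseline by > threshold."""
    flagged = []
    for row in csv.DictReader(io.StringIO(raw)):
        name = row["athlete"]
        height = float(row["jump_height_cm"])
        base = baseline.get(name)
        if base and (base - height) / base > threshold:
            flagged.append((name, round(100 * (base - height) / base, 1)))
    return flagged

alerts = flag_drops(RAW, BASELINE, THRESHOLD)
```

The point is not this particular script but the workflow: the download-clean-compare-report loop runs in seconds instead of an afternoon of spreadsheet clicks, so the results reach coaches while they are still actionable.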

Actionable: Is the Data Making an Impact?


Sharing and communication of data is the foundation of the High-Performance Department. Testing information cannot be siloed if we truly want to provide the best product for our athletes. Strength & Conditioning, Sports Nutrition, Sports Medicine, Sports Psychology, Sport Coaches all need to have the information easily accessible and easily translatable.

Some of my most utilized reports have two to three metrics, with information such as time of practice and number of team plays. Overcomplicated reports, difficult-to-access information, and a lack of collaboration between departments make the data just as ineffective as if it had been collected poorly. Assisting other departments with the data they want to collect, and making it available to everyone involved in the health and wellness of the athlete, is crucial to creating a comprehensive program. This can be as simple as understanding which methods of communication work best for your staff, and as complex as getting everyone from the Strength Interns to the Football Coaches to speak the same language regarding test results.


For most programs, the hard question is: are you actually willing to use this information to make changes in the training program? This is something that not every staff is willing to say yes to, and it stalls the process before it even begins. You can collect the most intricate sprint data on the market, with the ability to create six different speed buckets, but if you run one speed day a week and have a team with a low sprint-technique training age, two coaches, and one run group, good luck implementing a Combine-style speed program. This piece requires critical reflection within each department to determine what we value in our program and where we are willing to make changes. Without implementing the protocols, we will never know whether our program was effective in training the intended adaptation.

To simplify:

  • Are you measuring what you think you are measuring?
  • Can you even logistically complete the testing in a large group setting?
  • Are you even doing anything with the data afterward?

If you can’t answer these three questions for a piece of technology or data, you have to take a hard look at why you are using it in the first place. We need to remember that fancy technology doesn’t mean results. The Sports Science Arms Race has little to do with how much tech you can post on Instagram, and more to do with how willing you are to ask difficult questions, iterate, innovate, adapt, and experiment to find the best method of increasing athlete health and performance.
