I've used them for different phases over the last ~two years with results that weren't astounding but weren't bad either, though my personal experience doesn't sway our analysis of the effectiveness of a strict 80/20 implementation.
We've been analyzing these all along and plan to share data on their effectiveness in the future. Lack of adherence was a major obstacle to getting reliable data last year, but with more data this year we hope to resolve that.
Keeping everything in perspective, there is a VERY small amount of research out there looking into polarized training, and there isn't a large, well-structured data pool available to effectively analyze its effectiveness, so this effort on our end is unprecedented in many ways. We want to make sure we do it right.
On that note, I wanted to mention something about sharing data on polarized plans. One thing I've learned, particularly over the past few years as we've been testing and iterating on AT, is how much more complicated data-sharing is than it seems at face value. It's very easy to take a cursory glance at data and mistake it for a comprehensive representation.
Additionally, there's plenty of opportunity to slice the data in whichever way confirms a narrative or bias. We have systems of checks and balances with fantastic people in place to make sure we never make that mistake. I'm sure our approach would infuriate a lot of marketers, but we aren't everybody else. Integrity is everything to us, because if athletes can't trust us, there's no reason for them to sign up.
So we plan to share data on these plans in the future, but only once we have solid data that has been responsibly pulled and interpreted.
Hope that helps provide insight for y'all.