Learning robust representations from unlabeled time series is crucial, and contrastive learning offers a promising avenue. However, existing contrastive learning approaches for time series often struggle to define meaningful similarities, overlooking inherent physical correlations and the diverse, sequence-varying non-stationarity of the data. This limits their representational quality and real-world adaptability. To address these limitations, we introduce AdaTS, a novel adaptive soft contrastive learning strategy. AdaTS offers a compute-efficient solution centered on dynamic instance-wise and temporal assignments to enhance time series representations, specifically by: (i) leveraging Time-Frequency Coherence for robust, physics-guided similarity measurement; (ii) preserving relative instance similarities through ordinal consistency learning; and (iii) adapting to sequence-specific non-stationarity via dynamic temporal assignments. AdaTS is designed as a pluggable module for standard contrastive frameworks, achieving accuracy improvements of up to 13.7% across diverse time series datasets and three state-of-the-art contrastive frameworks, while improving robustness under label scarcity. The code will be made publicly available upon acceptance.
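To make the core idea concrete, the sketch below illustrates one plausible instantiation of soft instance-wise contrastive weighting driven by time-frequency coherence: coherence between raw series supplies soft targets that reweight an InfoNCE-style loss over embedding similarities. This is a minimal, hypothetical example and not the authors' implementation; the function names (`coherence_similarity`, `soft_contrastive_loss`), the averaging of coherence over frequencies, and the PyTorch/SciPy stack are all assumptions for illustration.

```python
# Minimal sketch (not the paper's code): coherence-based soft targets
# plugged into a soft contrastive loss. Assumes NumPy, SciPy, and PyTorch.
import numpy as np
import torch
import torch.nn.functional as F
from scipy.signal import coherence


def coherence_similarity(batch, fs=1.0):
    """Pairwise similarity from magnitude-squared coherence.

    batch: (N, T) array of raw time series; returns an (N, N) matrix in [0, 1].
    """
    n, t = batch.shape
    sim = np.eye(n)
    for i in range(n):
        for j in range(i + 1, n):
            # Magnitude-squared coherence per frequency, averaged to a scalar.
            _, cxy = coherence(batch[i], batch[j], fs=fs, nperseg=min(64, t))
            sim[i, j] = sim[j, i] = cxy.mean()
    return sim


def soft_contrastive_loss(z1, z2, soft_targets, temperature=0.1):
    """Cross-entropy between embedding-similarity softmax and soft targets.

    z1, z2: (N, D) embeddings of two augmented views; soft_targets: (N, N).
    """
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature                    # view-to-view similarities
    targets = soft_targets / soft_targets.sum(dim=1, keepdim=True)
    return -(targets * F.log_softmax(logits, dim=1)).sum(dim=1).mean()


# Toy usage: raw series define the soft targets; embeddings would normally
# come from any contrastive encoder (random tensors stand in here).
raw = np.random.randn(8, 256)
targets = torch.tensor(coherence_similarity(raw), dtype=torch.float32)
z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
loss = soft_contrastive_loss(z1, z2, targets)
```

Because the soft targets depend only on the raw inputs, such a term could in principle be added to an existing contrastive framework without altering its encoder or augmentation pipeline, which is consistent with the pluggable-module design described above.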