EntroPE: Entropy-Guided Dynamic Patch Encoder for Time Series Forecasting
Abstract
EntroPE, a temporally informed framework using entropy-guided dynamic patching, enhances time series forecasting by preserving temporal coherence and improving accuracy and efficiency.
Transformer-based models have significantly advanced time series forecasting, with patch-based input strategies offering efficiency and improved long-horizon modeling. Yet existing approaches rely on temporally agnostic patch construction, where arbitrary starting positions and fixed lengths fracture temporal coherence by splitting natural transitions across boundaries. This naive segmentation often disrupts short-term dependencies and weakens representation learning. In response, we propose EntroPE (Entropy-Guided Dynamic Patch Encoder), a novel, temporally informed framework that detects transition points via conditional entropy and places patch boundaries at them dynamically. This preserves temporal structure while retaining the computational benefits of patching. EntroPE consists of two key modules: an Entropy-based Dynamic Patcher (EDP) that applies information-theoretic criteria to locate natural temporal shifts and determine patch boundaries, and an Adaptive Patch Encoder (APE) that employs pooling and cross-attention to capture intra-patch dependencies and produce fixed-size latent representations. These embeddings are then processed by a global transformer to model inter-patch dynamics. Experiments across long-term forecasting benchmarks demonstrate that EntroPE improves both accuracy and efficiency, establishing entropy-guided dynamic patching as a promising new paradigm for time series modeling. Code is available at: https://github.com/Sachithx/EntroPE.
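The entropy-guided boundary placement described above can be illustrated with a minimal sketch. This is not the paper's implementation: the quantization scheme, the bigram estimator of conditional entropy, and the threshold rule are all simplifying assumptions made for illustration. The idea shown is the core one: quantize the series into symbols, estimate the conditional entropy of the next symbol given the previous one, and start a new patch wherever that entropy spikes, i.e. at points that are hard to predict from the immediate past.

```python
import numpy as np

def entropy_patch_boundaries(x, n_bins=8, threshold=1.5):
    """Sketch of entropy-guided dynamic patching (illustrative, not the
    paper's EDP): place a patch boundary wherever the conditional entropy
    of the next quantized value, given the previous one, exceeds a threshold.
    """
    # Quantize the series into n_bins discrete symbols via quantile bins.
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    symbols = np.digitize(x, edges)  # values in {0, ..., n_bins - 1}

    # Estimate bigram transition counts P(next | prev), Laplace-smoothed.
    counts = np.ones((n_bins, n_bins))
    for prev, nxt in zip(symbols[:-1], symbols[1:]):
        counts[prev, nxt] += 1
    probs = counts / counts.sum(axis=1, keepdims=True)

    # Conditional entropy H(next | prev = s) in bits, for each symbol s.
    cond_entropy = -(probs * np.log2(probs)).sum(axis=1)

    # Start a new patch wherever the local entropy exceeds the threshold.
    boundaries = [0]
    for t in range(1, len(symbols)):
        if cond_entropy[symbols[t - 1]] > threshold:
            boundaries.append(t)
    return boundaries
```

The resulting boundary list yields variable-length patches: long patches in smooth, predictable stretches and short patches around transitions, which is exactly the behavior fixed-length patching cannot provide.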
Community
EntroPE! Transformer models are widely used for time series forecasting, but they rely on fixed patching strategies. These naive segmentations often break temporal coherence and reduce accuracy.
Unlike previous methods that treat patching as a preprocessing step, EntroPE integrates patching as a core architectural component. It uses entropy to detect natural breakpoints in the data, preserving temporal structure. We further revisit pooling and cross-attention to handle dynamic patches and generate fine-grained embeddings for transformer models.
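Because the patches above are variable-length, the encoder must map each one to a fixed-size embedding before the global transformer can consume them. A minimal way to do this, sketched below under assumed shapes and weight names (this is cross-attention pooling in general, not the paper's exact APE), is to let a fixed set of learned queries attend over the patch: the output shape then depends only on the number of queries, not on the patch length.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def encode_patch(patch, queries, Wk, Wv):
    """Cross-attention pooling (illustrative sketch): a fixed set of learned
    queries attends over a variable-length patch and returns a fixed-size
    embedding. `patch` is (T, d_in); `queries` is (q, d); Wk, Wv are (d_in, d).
    """
    d = queries.shape[-1]
    keys = patch @ Wk      # (T, d)
    values = patch @ Wv    # (T, d)
    # Scaled dot-product attention of queries over the patch timesteps.
    attn = softmax(queries @ keys.T / np.sqrt(d), axis=-1)  # (q, T)
    return attn @ values   # (q, d), independent of the patch length T
```

Feeding patches of length 3 and length 7 through `encode_patch` yields embeddings of identical shape, which is what lets entropy-placed, variable-length patches coexist with a standard transformer backbone.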
Across energy, weather, and finance benchmarks, EntroPE consistently delivers more accurate and efficient forecasts.
This is an automated message from the Librarian Bot. I found the following papers similar to this paper.
The following papers were recommended by the Semantic Scholar API
- T3Time: Tri-Modal Time Series Forecasting via Adaptive Multi-Head Alignment and Residual Fusion (2025)
- TimeMosaic: Temporal Heterogeneity Guided Time Series Forecasting via Adaptive Granularity Patch and Segment-wise Decoding (2025)
- Adapting LLMs to Time Series Forecasting via Temporal Heterogeneity Modeling and Semantic Alignment (2025)
- VARMA-Enhanced Transformer for Time Series Forecasting (2025)
- TimeExpert: Boosting Long Time Series Forecasting with Temporal Mix of Experts (2025)
- Super-Linear: A Lightweight Pretrained Mixture of Linear Experts for Time Series Forecasting (2025)
- Conv-like Scale-Fusion Time Series Transformer: A Multi-Scale Representation for Variable-Length Long Time Series (2025)