arxiv:2509.26157

EntroPE: Entropy-Guided Dynamic Patch Encoder for Time Series Forecasting

Published on Sep 30 · Submitted by Sachith Abeywickrama on Oct 1

Abstract

EntroPE, a temporally informed framework using entropy-guided dynamic patching, enhances time series forecasting by preserving temporal coherence and improving accuracy and efficiency.

AI-generated summary

Transformer-based models have significantly advanced time series forecasting, with patch-based input strategies offering efficiency and improved long-horizon modeling. Yet existing approaches rely on temporally agnostic patch construction, where arbitrary starting positions and fixed lengths fracture temporal coherence by splitting natural transitions across boundaries. This naive segmentation often disrupts short-term dependencies and weakens representation learning. In response, we propose EntroPE (Entropy-Guided Dynamic Patch Encoder), a novel, temporally informed framework that detects transition points via conditional entropy and dynamically places patch boundaries at those points. This preserves temporal structure while retaining the computational benefits of patching. EntroPE consists of two key modules: an Entropy-based Dynamic Patcher (EDP) that applies information-theoretic criteria to locate natural temporal shifts and determine patch boundaries, and an Adaptive Patch Encoder (APE) that employs pooling and cross-attention to capture intra-patch dependencies and produce fixed-size latent representations. These embeddings are then processed by a global transformer to model inter-patch dynamics. Experiments across long-term forecasting benchmarks demonstrate that EntroPE improves both accuracy and efficiency, establishing entropy-guided dynamic patching as a promising new paradigm for time series modeling. Code is available at: https://github.com/Sachithx/EntroPE.
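To illustrate the entropy-guided patching idea, here is a minimal sketch (not the authors' implementation; the function name `entropy_guided_boundaries`, the bin count, and the z-score threshold are all illustrative assumptions): the series is quantized into bins, next-step conditional probabilities are estimated from bin-transition counts, and a new patch opens wherever the per-step surprise spikes.

```python
# Minimal sketch of entropy-guided patch-boundary detection.
# Hypothetical names and heuristics; not the EntroPE codebase.
import numpy as np

def entropy_guided_boundaries(x, n_bins=16, z_thresh=1.0):
    # Quantize the series into equal-width bins.
    edges = np.linspace(x.min(), x.max(), n_bins + 1)[1:-1]
    q = np.digitize(x, edges)                    # shape (T,), values in [0, n_bins)

    # Estimate P(next bin | current bin) by counting transitions (Laplace smoothing).
    counts = np.ones((n_bins, n_bins))
    for a, b in zip(q[:-1], q[1:]):
        counts[a, b] += 1
    probs = counts / counts.sum(axis=1, keepdims=True)

    # Per-step surprise -log P(x_t | x_{t-1}); high surprise marks a natural transition.
    surprise = -np.log(probs[q[:-1], q[1:]])
    z = (surprise - surprise.mean()) / (surprise.std() + 1e-8)

    # Start a new patch right after every high-surprise step.
    boundaries = [0] + [t + 1 for t, s in enumerate(z) if s > z_thresh] + [len(x)]
    return sorted(set(boundaries))

# Example: boundaries cluster around the regime change at t = 100.
x = np.concatenate([np.sin(np.linspace(0, 6, 100)), np.random.randn(60)])
print(entropy_guided_boundaries(x))
```

This toy version uses a simple z-scored surprise threshold; the paper's EDP applies its own information-theoretic criterion to decide where patches should begin and end.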

Community

Paper author · Paper submitter

🌟 EntroPE! Transformer models are widely used for time series forecasting, but they rely on fixed patching strategies. These naive segmentations often break temporal coherence and reduce accuracy.
Unlike previous methods that treat patching as a preprocessing step, EntroPE integrates patching as a core architectural component. It uses entropy to detect natural breakpoints in the data, preserving temporal structure. We further revisit pooling and cross-attention to handle dynamic patches and generate fine-grained embeddings for transformer models.
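To make the cross-attention pooling idea concrete, here is a rough PyTorch sketch in which a small set of learned queries attends over each variable-length patch and returns a fixed-size latent. Class and argument names (`PatchToLatent`, `n_queries`) are illustrative assumptions, not the actual EntroPE API.

```python
# Rough sketch: variable-length patches -> fixed-size latents via learned-query
# cross-attention. Illustrative only; not taken from the EntroPE repository.
import torch
import torch.nn as nn

class PatchToLatent(nn.Module):
    def __init__(self, d_model=64, n_heads=4, n_queries=1):
        super().__init__()
        self.queries = nn.Parameter(torch.randn(n_queries, d_model))  # learned queries
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.proj = nn.Linear(1, d_model)  # lift scalar time steps to d_model

    def forward(self, patch):                        # patch: (length,) raw values
        kv = self.proj(patch.view(1, -1, 1))         # (1, length, d_model)
        q = self.queries.unsqueeze(0)                # (1, n_queries, d_model)
        latent, _ = self.attn(q, kv, kv)             # (1, n_queries, d_model)
        return latent.squeeze(0)                     # fixed size, regardless of length

encoder = PatchToLatent()
patches = [torch.randn(n) for n in (7, 23, 12)]      # variable-length patches
latents = torch.stack([encoder(p) for p in patches]) # (3, n_queries, d_model)
print(latents.shape)
```

The point of the design is that every patch, however long, maps to the same latent shape, so a downstream global transformer can model inter-patch dynamics over a regular sequence of embeddings.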

Across energy, weather, and finance benchmarks, EntroPE consistently delivers more accurate and efficient forecasts.

