TY - GEN
T1 - An Efficient Decomposition-Driven Linear Framework for Long-Term Time-Series Forecasting
AU - Chen, Zhihong
AU - Zhao, Yu
AU - Zou, Tao
AU - Ye, Junchen
AU - Du, Bowen
AU - Huang, Runhe
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Recent advances in long-term time-series forecasting have been predominantly driven by Transformer-based architectures, which demonstrate strong performance through powerful attention mechanisms and high model capacity. However, these models often suffer from substantial computational overhead, slow training and inference speeds, and complex feature extraction procedures, making them less suitable for real-time or resource-constrained scenarios. In this work, we revisit the potential of simple yet effective models and propose a novel forecasting framework (iLinear) based on multilayer perceptrons, which we refer to as a lightweight linear MLP model. Our model captures long-term temporal dependencies without relying on attention mechanisms by leveraging a hierarchical structure with linear projections and non-linear transformations. This design not only ensures fast iteration and efficient training but also maintains strong representational capacity for complex temporal dynamics. Comprehensive experiments conducted on several widely used time-series benchmark datasets demonstrate that our model consistently outperforms state-of-the-art Transformer-based methods in terms of forecasting accuracy, model size, and computational efficiency. These results highlight the feasibility of using simple neural architectures for long-term forecasting and suggest promising directions for future research in efficient time-series modeling.
AB - Recent advances in long-term time-series forecasting have been predominantly driven by Transformer-based architectures, which demonstrate strong performance through powerful attention mechanisms and high model capacity. However, these models often suffer from substantial computational overhead, slow training and inference speeds, and complex feature extraction procedures, making them less suitable for real-time or resource-constrained scenarios. In this work, we revisit the potential of simple yet effective models and propose a novel forecasting framework (iLinear) based on multilayer perceptrons, which we refer to as a lightweight linear MLP model. Our model captures long-term temporal dependencies without relying on attention mechanisms by leveraging a hierarchical structure with linear projections and non-linear transformations. This design not only ensures fast iteration and efficient training but also maintains strong representational capacity for complex temporal dynamics. Comprehensive experiments conducted on several widely used time-series benchmark datasets demonstrate that our model consistently outperforms state-of-the-art Transformer-based methods in terms of forecasting accuracy, model size, and computational efficiency. These results highlight the feasibility of using simple neural architectures for long-term forecasting and suggest promising directions for future research in efficient time-series modeling.
KW - lightweight model
KW - linear structure
KW - long-term time-series forecasting
UR - https://www.scopus.com/pages/publications/105032649007
U2 - 10.1109/CyberSciTech68397.2025.00056
DO - 10.1109/CyberSciTech68397.2025.00056
M3 - Conference contribution
AN - SCOPUS:105032649007
T3 - Proceedings - 2025 IEEE Cyber Science and Technology Congress, CyberSciTech 2025
SP - 369
EP - 376
BT - Proceedings - 2025 IEEE Cyber Science and Technology Congress, CyberSciTech 2025
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2025 IEEE Cyber Science and Technology Congress, CyberSciTech 2025
Y2 - 21 October 2025 through 24 October 2025
ER -