TFITS: Time Series Imputation via Dual-Perspective Fusion of Temporal and Feature Views
Keywords:
Multivariate time series, Missing value, Dual-Perspective, Feature maps

Abstract
In multivariate time series analysis, data completeness is crucial to the accuracy and reliability of downstream tasks. In practice, however, factors such as measurement errors and equipment failures often leave part of the data missing. This paper therefore focuses on handling missing values in multivariate time series efficiently and on mitigating the negative impact that incomplete data can have on subsequent tasks. Previous studies have typically taken a single perspective, considering the temporal dimension and the feature dimension separately and thereby overlooking the potential of fusing the two. We propose TFITS, a method for predicting missing values in multivariate time series that models the data from both the temporal and the feature perspective and uses a UNetFuse module to fuse the feature maps generated from these two views, providing a more comprehensive treatment of the missing-value problem. Experimental results demonstrate that TFITS not only achieves excellent imputation performance but also remains superior and stable across different missing rates.
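The dual-perspective idea in the abstract can be illustrated with a minimal, non-learned sketch: one estimate of each missing entry comes from the temporal view (the variable's own observed history) and one from the feature view (the other variables observed at the same time step), and the two estimates are then fused. This is only a conceptual stand-in: in TFITS both views are learned feature maps and the fusion is performed by the UNetFuse module, whereas here each view is a simple observed-value mean and the fusion is a plain average.

```python
import numpy as np

def dual_perspective_impute(x, mask):
    """Conceptual sketch of dual-perspective imputation (not the TFITS model).

    x    : (T, F) array of observations; missing entries may hold any value
    mask : (T, F) boolean array, True where the entry is observed
    """
    x = np.where(mask, x, 0.0)  # zero out missing entries before summing
    # Temporal view: estimate each entry from its variable's observed history
    # (per-column mean over observed time steps).
    col_counts = mask.sum(axis=0, keepdims=True).clip(min=1)
    temporal_view = (x.sum(axis=0, keepdims=True) / col_counts) * np.ones_like(x)
    # Feature view: estimate each entry from co-occurring variables
    # (per-row mean over observed features at that time step).
    row_counts = mask.sum(axis=1, keepdims=True).clip(min=1)
    feature_view = (x.sum(axis=1, keepdims=True) / row_counts) * np.ones_like(x)
    # Fusion: plain average as a stand-in for the learned UNetFuse combination.
    fused = 0.5 * (temporal_view + feature_view)
    # Keep observed values unchanged; fill only the missing positions.
    return np.where(mask, x, fused)
```

The point of the sketch is the structure, not the estimator: each missing entry receives information along both axes of the series, which a single-perspective method would discard.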