Export Publication

The publication can be exported in the following formats: APA (American Psychological Association), IEEE (Institute of Electrical and Electronics Engineers), BibTeX, and RIS.

Export Reference (APA)
Dos Santos, R. P., Matos-Carvalho, J., & Leithardt, V. (2025). Deep learning in time series forecasting with transformer models and RNNs. PeerJ Computer Science, 11. https://doi.org/10.7717/peerj-cs.3001
Export Reference (IEEE)
R. P. Dos Santos, J. Matos-Carvalho and V. Leithardt, "Deep learning in time series forecasting with transformer models and RNNs," PeerJ Computer Science, vol. 11, 2025, doi: 10.7717/peerj-cs.3001.
Export BibTeX
@article{pereira2025_1764926934220,
	author = "Dos Santos, R. P. and Matos-Carvalho, J. and Leithardt, V.",
	title = "Deep learning in time series forecasting with transformer models and RNNs",
	journal = "PeerJ Computer Science",
	year = "2025",
	volume = "11",
	doi = "10.7717/peerj-cs.3001",
	url = "https://peerj.com/computer-science/"
}
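To work with the BibTeX entry above programmatically, the quoted fields can be pulled out with a small regular-expression sketch. This is illustrative only (a real parser such as bibtexparser also handles brace-delimited values, escapes, and multiple entries); `parse_bibtex_fields` is a hypothetical helper name.

```python
import re

def parse_bibtex_fields(entry):
    # Entry type and citation key from the @type{key, ... header.
    kind, key = re.match(r'@(\w+)\{([^,]+),', entry).groups()
    # All name = "value" pairs (assumes double-quoted, quote-free values).
    fields = dict(re.findall(r'(\w+)\s*=\s*"([^"]*)"', entry))
    return kind, key, fields

entry = '''@article{pereira2025_1764926934220,
	author = "Dos Santos, R. P. and Matos-Carvalho, J. and Leithardt, V.",
	title = "Deep learning in time series forecasting with transformer models and RNNs",
	journal = "PeerJ Computer Science",
	year = "2025",
	volume = "11",
	doi = "10.7717/peerj-cs.3001"
}'''

kind, key, fields = parse_bibtex_fields(entry)
```

After parsing, `fields["doi"]` gives the DOI string and `key` the citation key used in LaTeX `\cite` commands.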
Export RIS
TY  - JOUR
TI  - Deep learning in time series forecasting with transformer models and RNNs
T2  - PeerJ Computer Science
VL  - 11
AU  - Dos Santos, R. P.
AU  - Matos-Carvalho, J.
AU  - Leithardt, V.
PY  - 2025
SN  - 2376-5992
DO  - 10.7717/peerj-cs.3001
UR  - https://peerj.com/computer-science/
AB  - Given the increasing need for accurate weather forecasts, the use of neural networks, especially transformer and recurrent neural networks (RNNs), has been highlighted for their ability to capture complex patterns in time series. This study examined 14 neural network models applied to forecast weather variables, evaluated using metrics such as median absolute error (MedianAbsE), mean absolute error (MeanAbsE), maximum absolute error (MaxAbsE), root mean squared percent error (RMSPE), and root mean square error (RMSE). Transformer-based models such as Informer, iTransformer, Former, and patch time series transformer (PatchTST) stood out for their accuracy in capturing long-term patterns, with Informer showing the best performance. In contrast, RNN models such as auto-temporal convolutional networks (TCN) and bidirectional TCN (BiTCN) were better suited to short-term forecasting, despite being more prone to significant errors. Using iTransformer, it was possible to achieve a MedianAbsE of 1.21, MeanAbsE of 1.24, MaxAbsE of 2.86, RMSPE of 0.66, and RMSE of 1.43. This study demonstrates the potential of neural networks, especially transformers, to improve accuracy, providing a practical and theoretical basis for selecting the most suitable models for predictive applications.
ER  -
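A RIS record like the one above is line-oriented: each line is a two-letter tag, two spaces, a hyphen, a space, then the value, with `ER` closing the record. A minimal parsing sketch (hypothetical `parse_ris` helper, not an official RIS library) that collects repeated tags such as `AU` into lists:

```python
def parse_ris(text):
    # Parse one RIS record into {tag: [values]}; repeated tags (e.g. AU)
    # accumulate, and the ER tag terminates the record.
    record = {}
    for line in text.splitlines():
        line = line.rstrip()
        tag = line[:2]
        if tag == "ER":
            break
        if line[2:6] != "  - ":
            continue  # skip blank or malformed lines
        record.setdefault(tag, []).append(line[6:].strip())
    return record

sample = """TY  - JOUR
TI  - Deep learning in time series forecasting with transformer models and RNNs
T2  - PeerJ Computer Science
VL  - 11
AU  - Dos Santos, R. P.
AU  - Matos-Carvalho, J.
AU  - Leithardt, V.
PY  - 2025
DO  - 10.7717/peerj-cs.3001
ER  -
"""

rec = parse_ris(sample)
```

Reference managers such as Zotero and EndNote import this tag structure directly; the sketch shows why the `ER  -` terminator matters when several records share one file.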