
A multi-head attention-based transformer model for traffic flow forecasting with a comparative analysis to recurrent neural networks

ISSN: 0957-4174; EISSN: 1873-6793; DOI: 10.1016/j.eswa.2022.117275

Digital Resources/Online E-Resources

  • Title:
    A multi-head attention-based transformer model for traffic flow forecasting with a comparative analysis to recurrent neural networks
  • Author: Selim Reza; Marta Campos Ferreira; José Joaquim M. Machado; João Manuel R. S. Tavares
  • Subjects: Engineering and technology; Technological sciences
  • Description: Traffic flow forecasting is an essential component of an intelligent transportation system for mitigating congestion. Recurrent neural networks, particularly gated recurrent units and long short-term memory networks, have been the state-of-the-art traffic flow forecasting models for the last few years. However, a more sophisticated and resilient model is necessary to effectively capture long-range correlations in the time-series sequence under analysis. The dominant performance of transformers in natural language processing, where they overcame the drawbacks of recurrent neural networks, suggests they can address this need and enable successful time-series forecasting. This article presents a multi-head attention-based transformer model for traffic flow forecasting, with a comparative analysis against a gated recurrent unit-based and a long short-term memory-based model on the PeMS dataset. The model uses 5 heads with 5 identical encoder and decoder layers and relies on square subsequent masking (a minimal sketch of this configuration follows the record below). The results demonstrate the promising performance of the transformer-based model in predicting long-term traffic flow patterns effectively after being fed a substantial amount of data. It also demonstrates its worth by improving the mean squared errors and mean absolute percentage errors by 1.25-47.8% and 32.4-83.8%, respectively, relative to the current baselines.
  • Creation Date: 2022-09
  • Language: English
  • Identifier: ISSN: 0957-4174
    EISSN: 1873-6793
    DOI: 10.1016/j.eswa.2022.117275
  • Source: Universidade do Porto Institutional Repository Open Access
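The abstract specifies 5 attention heads, 5 identical encoder and decoder layers, and square subsequent masking. The following PyTorch sketch is a rough illustration of that configuration only, not the authors' code: the embedding width d_model=40, the scalar input/output projections, and the sequence lengths are assumptions, and positional encoding is omitted for brevity.

```python
# Minimal sketch, assuming PyTorch and the hyperparameters named in the
# abstract (5 heads, 5 encoder layers, 5 decoder layers, square subsequent
# masking). d_model=40 is an illustrative choice divisible by the head count.
import torch
import torch.nn as nn

class TrafficTransformer(nn.Module):
    def __init__(self, d_model=40, nhead=5, num_layers=5):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)   # embed scalar flow values
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True,
        )
        self.output_proj = nn.Linear(d_model, 1)  # back to a scalar forecast

    def forward(self, src, tgt):
        # Square subsequent (causal) mask: each target step may only attend
        # to itself and earlier steps.
        tgt_mask = self.transformer.generate_square_subsequent_mask(tgt.size(1))
        out = self.transformer(
            self.input_proj(src), self.input_proj(tgt), tgt_mask=tgt_mask
        )
        return self.output_proj(out)

# Hypothetical usage: 12 past observations as encoder input, forecast 6 steps.
model = TrafficTransformer()
src = torch.randn(8, 12, 1)   # (batch, history length, 1 feature)
tgt = torch.randn(8, 6, 1)    # decoder input, e.g. shifted targets
print(model(src, tgt).shape)  # torch.Size([8, 6, 1])
```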
