Graph Sequence Neural Network with an Attention Mechanism for Traffic Speed Prediction

  • Beihang University

Research output: Contribution to journal › Article › peer-review

Abstract

Recent years have witnessed the emerging success of Graph Neural Networks (GNNs) for modeling graphical data. A GNN can model the spatial dependencies of nodes in a graph based on message passing through node aggregation. However, in many application scenarios, these spatial dependencies can change over time, and a basic GNN model cannot capture these changes. In this article, we propose a Graph Sequence neural network with an Attention mechanism (GSeqAtt) for processing graph sequences. More specifically, two attention mechanisms are combined: a horizontal mechanism and a vertical mechanism. GTransformer, which is a horizontal attention mechanism for handling time series, is used to capture the correlations between graphs in the input time sequence. The vertical attention mechanism, a Graph Network (GN) block structure with an attention mechanism (GNAtt), acts within the graph structure in each frame of the time series. Experiments show that our proposed model is able to handle information propagation for graph sequences accurately and efficiently. Moreover, results on real-world data from three road intersections show that our GSeqAtt outperforms state-of-the-art baselines on the traffic speed prediction task.
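The abstract describes two stacked attention mechanisms: a vertical one (GNAtt) that attends over graph neighbours within each frame, and a horizontal one (GTransformer) that attends over time for each node. The paper's exact architecture is not given here, so the following is only a minimal NumPy sketch of that idea, with all function names (`attention`, `gseqatt_step`) and tensor shapes being illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v, mask=None):
    # scaled dot-product attention; mask=False entries are excluded
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(q.shape[-1])
    if mask is not None:
        scores = np.where(mask, scores, -1e9)
    return softmax(scores) @ v

def gseqatt_step(x, adj):
    """x: (T, N, d) node features over T frames; adj: (N, N) adjacency
    with self-loops. Vertical: graph attention restricted to edges in
    each frame. Horizontal: temporal self-attention per node."""
    T, N, d = x.shape
    # vertical pass: each node attends only over its graph neighbours
    vert = np.stack([attention(f, f, f, mask=adj.astype(bool)) for f in x])
    # horizontal pass: each node attends over its own time series
    per_node = vert.transpose(1, 0, 2)          # (N, T, d)
    horiz = attention(per_node, per_node, per_node)
    return horiz.transpose(1, 0, 2)             # back to (T, N, d)
```

In the actual model these passes would be parameterised (learned projections, multiple heads, GN-block updates) rather than applied directly to raw features as above.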

Original language: English
Article number: 20
Journal: ACM Transactions on Intelligent Systems and Technology
Volume: 13
Issue number: 2
State: Published - Apr 2022

Keywords

  • Graph neural network
  • self-attention
  • traffic speed prediction

