
TASK Quarterly

Optimizing Graph Learning using Hierarchical Graph Adjacency Matrix (HGAM)

Abstract

Graph Neural Networks (GNNs) are increasingly adopted in modern, large-scale applications such as social network analysis, recommendation systems, and drug discovery. However, training GNNs can be computationally prohibitive, especially when the graph is large and complex, necessitating a mini-batching approach. In this paper, we propose a novel data structure, the Hierarchical Graph Adjacency Matrix (HGAM), that accelerates GNN training by avoiding redundant computations. With HGAM, GNN training is up to four times faster. On top of HGAM, we propose further optimizations that raise the overall speedup to up to 7.72 times for training 3-layer deep GNNs. We evaluate our techniques on three benchmark datasets (Reddit, ogbn-products, and ogbn-mag) and demonstrate that the proposed HGAM technique and related optimizations benefit GNN training across modern hardware platforms.
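The abstract describes HGAM only at a high level, and the paper's actual data layout is not reproduced here. As a purely illustrative sketch of the general idea of avoiding redundant work in mini-batch, multi-layer GNN training, one could cache already-expanded nodes so that overlapping layer-wise neighborhoods are not gathered repeatedly from the adjacency structure. The function name `expand_batch` and the use of a SciPy CSR matrix below are assumptions for illustration, not the paper's HGAM implementation.

```python
# Illustrative sketch only: NOT the paper's HGAM data structure.
# Assumption: the graph is stored as a SciPy CSR adjacency matrix, and we
# avoid re-gathering nodes that an earlier layer's frontier already covered.
import numpy as np
import scipy.sparse as sp

def expand_batch(csr_adj: sp.csr_matrix, seed_nodes, num_layers: int):
    """Return one node frontier per GNN layer, skipping already-seen nodes."""
    frontiers = [np.asarray(seed_nodes, dtype=np.int64)]
    seen = set(frontiers[0].tolist())  # cache of nodes already expanded
    for _ in range(num_layers):
        current = frontiers[-1]
        if len(current) == 0:
            frontiers.append(current)
            continue
        # gather neighbors of the current frontier directly from CSR arrays
        neigh = np.concatenate(
            [csr_adj.indices[csr_adj.indptr[v]:csr_adj.indptr[v + 1]]
             for v in current]
        )
        # keep only nodes not produced by any earlier frontier
        fresh = np.setdiff1d(np.unique(neigh),
                             np.fromiter(seen, dtype=np.int64))
        seen.update(fresh.tolist())
        frontiers.append(fresh)
    return frontiers

# Usage on a toy path graph 0-1-2-3: a 2-layer expansion from seed node 0
# yields frontiers [0], [1], [2]; node 0 is not re-gathered at layer 2.
rows = np.array([0, 1, 1, 2, 2, 3])
cols = np.array([1, 0, 2, 1, 3, 2])
A = sp.csr_matrix((np.ones(len(rows)), (rows, cols)), shape=(4, 4))
print(expand_batch(A, [0], num_layers=2))
```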

Keywords:

Data structure optimization, graph neural networks, graph learning, sparse data

Details

Issue: Vol. 28 No. 3 (2024)
Section: Research article
Published: 2025-11-19
DOI: https://doi.org/10.34808/tq2024/28.3/b
License:

Copyright (c) 2025 TASK Quarterly

This work is licensed under a Creative Commons Attribution 4.0 International License.

Authors
