GBGNN: Gradient Boosted Graph Neural Networks


Eunjo Jang, Ki Yong Lee, Journal of Information Processing Systems Vol. 20, No. 4, pp. 501-513, Aug. 2024  

https://doi.org/10.3745/JIPS.04.0315
Keywords: Ensemble Method, Gradient Boosting, Graph Neural Network
Abstract

In recent years, graph neural networks (GNNs) have been extensively used to analyze graph data across various domains because of their powerful ability to learn from complex graph-structured data. However, most recent research has focused on improving the performance of a single GNN with only two or three layers, because stacking layers deeply causes the over-smoothing problem, which significantly degrades GNN performance. Ensemble methods, by contrast, combine individual weak models to obtain better generalization performance. Among them, gradient boosting is a powerful supervised learning algorithm that iteratively adds new weak models in the direction that reduces the errors of the previously created ones; the resulting combination of weak models forms a strong model with better performance. Whereas most studies on GNNs have aimed to improve a single GNN, improving GNN performance by combining multiple GNNs remains largely unexplored. In this paper, we propose gradient boosted graph neural networks (GBGNN), which combine multiple shallow GNNs via gradient boosting. We use shallow GNNs as weak models and train each new weak model with the proposed gradient boosting-based loss function. Our empirical evaluations on three real-world datasets demonstrate that GBGNN performs much better than a single GNN. Specifically, in our experiments using graph convolutional network (GCN) and graph attention network (GAT) as weak models on the Cora dataset, GBGNN improves node classification accuracy by 12.3%p and 6.1%p over a single GCN and a single GAT, respectively.
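The boosting procedure described above can be illustrated with a minimal sketch. The code below is not the authors' implementation: it uses a one-layer linear "GCN" (symmetrically normalized adjacency times features, fit by least squares) as the weak model, and performs standard gradient boosting on the logistic loss for binary node classification on a hypothetical toy graph with two clusters. All names, the toy data, and the least-squares weak learner are illustrative assumptions.

```python
import numpy as np

def normalize_adj(A):
    """Symmetric normalization D^{-1/2} (A + I) D^{-1/2}, as in GCN."""
    A_hat = A + np.eye(len(A))
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def gbgnn_fit(A, X, y, n_rounds=20, lr=0.3):
    """Gradient boosting with shallow linear-GCN weak models (sketch).

    Each round fits a weak model to the negative gradient of the
    logistic loss (the residual y - p) and adds it to the ensemble.
    """
    Z = normalize_adj(A) @ X          # one round of neighborhood aggregation
    F = np.zeros(len(y))              # additive ensemble logits
    models = []
    for _ in range(n_rounds):
        p = 1.0 / (1.0 + np.exp(-F))  # current probability estimates
        residual = y - p              # negative gradient of logistic loss
        # Weak model: least-squares fit of a linear layer on aggregated features
        W, *_ = np.linalg.lstsq(Z, residual, rcond=None)
        F += lr * (Z @ W)             # add the new weak model's contribution
        models.append(W)
    return F, models

# Hypothetical toy graph: two fully connected 3-node clusters, no edges between.
block = np.ones((3, 3)) - np.eye(3)
A = np.block([[block, np.zeros((3, 3))],
              [np.zeros((3, 3)), block]])
X = np.array([[1., 0.]] * 3 + [[0., 1.]] * 3)   # cluster-indicator features
y = np.array([0., 0., 0., 1., 1., 1.])          # binary node labels

F, models = gbgnn_fit(A, X, y)
preds = (F > 0).astype(int)
```

On this separable toy data the boosted ensemble recovers the labels exactly; the point of the sketch is only the loop structure (fit weak model to residuals, accumulate), not the specific weak learner, which in GBGNN is a trained shallow GCN or GAT rather than a least-squares layer.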


Cite this article
[APA Style]
Jang, E., & Lee, K. Y. (2024). GBGNN: Gradient Boosted Graph Neural Networks. Journal of Information Processing Systems, 20(4), 501-513. DOI: 10.3745/JIPS.04.0315.

[IEEE Style]
E. Jang and K. Y. Lee, "GBGNN: Gradient Boosted Graph Neural Networks," Journal of Information Processing Systems, vol. 20, no. 4, pp. 501-513, 2024. DOI: 10.3745/JIPS.04.0315.

[ACM Style]
Eunjo Jang and Ki Yong Lee. 2024. GBGNN: Gradient Boosted Graph Neural Networks. Journal of Information Processing Systems, 20, 4, (2024), 501-513. DOI: 10.3745/JIPS.04.0315.