
Conference Paper: EEGNN: Edge Enhanced Graph Neural Network with a Bayesian Nonparametric Graph Model

Title: EEGNN: Edge Enhanced Graph Neural Network with a Bayesian Nonparametric Graph Model
Authors: Liu, Yirui; Qiao, Xinghao; Wang, Liying; Lam, Jessica
Issue Date: 2023
Citation: Proceedings of Machine Learning Research, 2023, v. 206, p. 2132-2146
Abstract: Training deep graph neural networks (GNNs) is challenging, as their performance may degrade as the number of hidden message-passing layers grows. The literature has focused on over-smoothing and under-reaching to explain the performance deterioration of deep GNNs. In this paper, we propose a new explanation for this deterioration: mis-simplification, that is, mistakenly simplifying graphs by preventing self-loops and forcing edges to be unweighted. We show that such simplification can reduce the potential of message-passing layers to capture the structural information of graphs. In view of this, we propose a new framework, the edge enhanced graph neural network (EEGNN). EEGNN uses the structural information extracted from the proposed Dirichlet mixture Poisson graph model (DMPGM), a Bayesian nonparametric model for graphs, to improve the performance of various deep message-passing GNNs. We also propose a Markov chain Monte Carlo (MCMC) inference framework for DMPGM. Experiments on different datasets show that our method achieves a considerable performance increase over baselines.
Persistent Identifier: http://hdl.handle.net/10722/336388
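
The mis-simplification argument in the abstract, that removing self-loops and collapsing edge multiplicities into unweighted edges discards structural signal message passing could exploit, can be illustrated with a toy propagation step. The sketch below is not the authors' EEGNN or DMPGM implementation; it is a minimal NumPy illustration with hypothetical names, contrasting one symmetric-normalized propagation step over a mis-simplified adjacency with the same step over the original multigraph, where self-loop and parallel-edge counts are kept as weights.

```python
# Minimal sketch (not the paper's code): contrast message passing on a
# "mis-simplified" graph (self-loops dropped, edges binarized) with
# message passing on the original multigraph, whose adjacency keeps
# self-loop and parallel-edge counts as weights. All names here are
# hypothetical illustrations.
import numpy as np

def propagate(adj: np.ndarray, x: np.ndarray) -> np.ndarray:
    """One symmetric-normalized propagation step: D^{-1/2} A D^{-1/2} x."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    nz = deg > 0
    d_inv_sqrt[nz] = deg[nz] ** -0.5
    return (adj * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]) @ x

# Weighted multigraph adjacency: entry (i, j) counts parallel edges
# between i and j; the diagonal counts self-loops.
multi_adj = np.array([[2.0, 3.0, 0.0],
                      [3.0, 0.0, 1.0],
                      [0.0, 1.0, 1.0]])

# Mis-simplified view: multiplicities binarized, self-loops removed.
simple_adj = (multi_adj > 0).astype(float)
np.fill_diagonal(simple_adj, 0.0)

x = np.eye(3)  # one-hot features, so the output rows expose the mixing weights
print("mis-simplified:\n", propagate(simple_adj, x))
print("edge-enhanced:\n", propagate(multi_adj, x))
```

On this toy multigraph the two outputs differ exactly where self-loops and parallel edges were discarded; recovering that kind of edge-count structure from data is the role the abstract assigns to DMPGM.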

 

DC Field | Value | Language
dc.contributor.author | Liu, Yirui | -
dc.contributor.author | Qiao, Xinghao | -
dc.contributor.author | Wang, Liying | -
dc.contributor.author | Lam, Jessica | -
dc.date.accessioned | 2024-01-15T08:26:25Z | -
dc.date.available | 2024-01-15T08:26:25Z | -
dc.date.issued | 2023 | -
dc.language | eng | -
dc.relation.ispartof | Proceedings of Machine Learning Research | -
dc.type | Conference_Paper | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.scopus | eid_2-s2.0-85165165285 | -
dc.identifier.volume | 206 | -
dc.identifier.spage | 2132 | -
dc.identifier.epage | 2146 | -
dc.identifier.eissn | 2640-3498 | -
