File Download
There are no files associated with this item.
Links for fulltext
(May Require Subscription)
- Publisher Website: 10.1145/3701551.3703536
- Scopus: eid_2-s2.0-105001668770
- WOS: WOS:001476971200059
Conference Paper: LightGNN: Simple Graph Neural Network for Recommendation
| Title | LightGNN: Simple Graph Neural Network for Recommendation |
|---|---|
| Authors | Chen, Guoxuan; Xia, Lianghao; Huang, Chao |
| Keywords | Graph Learning; Knowledge Distillation; Recommendation |
| Issue Date | 2025 |
| Citation | WSDM 2025 - Proceedings of the 18th ACM International Conference on Web Search and Data Mining, 2025, p. 549-558 |
| Abstract | Graph neural networks (GNNs) have demonstrated superior performance in collaborative recommendation through their ability to conduct high-order representation smoothing, effectively capturing structural information within users' interaction patterns. However, existing GNN paradigms face significant challenges in scalability and robustness when handling large-scale, noisy real-world datasets. To address these challenges, we present LightGNN, a lightweight and distillation-based GNN pruning framework designed to substantially reduce model complexity while preserving essential collaboration modeling capabilities. Our LightGNN framework introduces a computationally efficient pruning module that adaptively identifies and removes adverse edges and embedding entries for model compression. The framework is guided by a resource-friendly hierarchical knowledge distillation objective, whose intermediate layer augments the observed graph to maintain performance, particularly in high-rate compression scenarios. Extensive experiments on public datasets demonstrate LightGNN's effectiveness, significantly improving both computational efficiency and recommendation accuracy. Notably, LightGNN achieves an 80% reduction in edge count and 90% reduction in embedding entries while maintaining performance comparable to more complex state-of-the-art baselines. The implementation of our LightGNN model is available at the GitHub repository: https://github.com/HKUDS/LightGNN. |
| Persistent Identifier | http://hdl.handle.net/10722/355856 |
| ISI Accession Number ID | WOS:001476971200059 |
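The abstract describes pruning a recommendation graph by scoring and removing edges, targeting roughly an 80% edge-count reduction. The sketch below is only a minimal illustration of that idea under strong simplifying assumptions: it scores each user-item edge with a fixed "teacher" embedding dot product and keeps the top fraction of edges by score. The paper's actual pruning module is adaptive and distillation-guided; none of the names or choices here (`edge_scores`, `prune_edges`, dot-product scoring, `keep_ratio=0.2`) come from the LightGNN implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy user/item embeddings standing in for a trained "teacher" recommender
# (hypothetical stand-ins, not LightGNN's learned parameters).
n_users, n_items, dim = 8, 10, 4
teacher_user = rng.normal(size=(n_users, dim))
teacher_item = rng.normal(size=(n_items, dim))

# Observed interaction edges as (user, item) index pairs.
edges = np.array([(u, i) for u in range(n_users) for i in range(n_items)
                  if rng.random() < 0.5])

def edge_scores(user_emb, item_emb, edges):
    """Score each edge by the teacher's predicted affinity (dot product)."""
    return np.einsum("ij,ij->i", user_emb[edges[:, 0]], item_emb[edges[:, 1]])

def prune_edges(edges, scores, keep_ratio=0.2):
    """Keep only the top `keep_ratio` fraction of edges by score,
    mimicking the 80% edge-count reduction reported in the abstract."""
    k = max(1, int(len(edges) * keep_ratio))
    keep_idx = np.argsort(scores)[-k:]
    return edges[keep_idx]

scores = edge_scores(teacher_user, teacher_item, edges)
pruned = prune_edges(edges, scores, keep_ratio=0.2)
print(f"kept {len(pruned)} of {len(edges)} edges")
```

A student GNN would then be trained on the pruned graph while a knowledge-distillation loss encourages its predictions to match the teacher's; the static magnitude-based scoring here is just the simplest possible proxy for that learned, adaptive selection.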
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Chen, Guoxuan | - |
| dc.contributor.author | Xia, Lianghao | - |
| dc.contributor.author | Huang, Chao | - |
| dc.date.accessioned | 2025-05-19T05:45:46Z | - |
| dc.date.available | 2025-05-19T05:45:46Z | - |
| dc.date.issued | 2025 | - |
| dc.identifier.citation | WSDM 2025 - Proceedings of the 18th ACM International Conference on Web Search and Data Mining, 2025, p. 549-558 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/355856 | - |
| dc.description.abstract | Graph neural networks (GNNs) have demonstrated superior performance in collaborative recommendation through their ability to conduct high-order representation smoothing, effectively capturing structural information within users' interaction patterns. However, existing GNN paradigms face significant challenges in scalability and robustness when handling large-scale, noisy real-world datasets. To address these challenges, we present LightGNN, a lightweight and distillation-based GNN pruning framework designed to substantially reduce model complexity while preserving essential collaboration modeling capabilities. Our LightGNN framework introduces a computationally efficient pruning module that adaptively identifies and removes adverse edges and embedding entries for model compression. The framework is guided by a resource-friendly hierarchical knowledge distillation objective, whose intermediate layer augments the observed graph to maintain performance, particularly in high-rate compression scenarios. Extensive experiments on public datasets demonstrate LightGNN's effectiveness, significantly improving both computational efficiency and recommendation accuracy. Notably, LightGNN achieves an 80% reduction in edge count and 90% reduction in embedding entries while maintaining performance comparable to more complex state-of-the-art baselines. The implementation of our LightGNN model is available at the GitHub repository: https://github.com/HKUDS/LightGNN. | - |
| dc.language | eng | - |
| dc.relation.ispartof | WSDM 2025 - Proceedings of the 18th ACM International Conference on Web Search and Data Mining | - |
| dc.subject | Graph Learning | - |
| dc.subject | Knowledge Distillation | - |
| dc.subject | Recommendation | - |
| dc.title | LightGNN: Simple Graph Neural Network for Recommendation | - |
| dc.type | Conference_Paper | - |
| dc.description.nature | link_to_subscribed_fulltext | - |
| dc.identifier.doi | 10.1145/3701551.3703536 | - |
| dc.identifier.scopus | eid_2-s2.0-105001668770 | - |
| dc.identifier.spage | 549 | - |
| dc.identifier.epage | 558 | - |
| dc.identifier.isi | WOS:001476971200059 | - |
