Conference Paper: Graph Transformer for Recommendation

Title: Graph Transformer for Recommendation
Authors: Li, Chaoliu; Xia, Lianghao; Ren, Xubin; Ye, Yaowen; Xu, Yong; Huang, Chao
Keywords: Graph Transformer; Masked Autoencoder; Recommendation
Issue Date: 2023
Citation: SIGIR 2023 - Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2023, p. 1680-1689
Abstract: This paper presents a novel approach to representation learning in recommender systems by integrating generative self-supervised learning with a graph transformer architecture. We highlight the importance of high-quality data augmentation with relevant self-supervised pretext tasks for improving performance. Towards this end, we propose a new approach that automates the self-supervision augmentation process through rationale-aware generative SSL that distills informative user-item interaction patterns. Our proposed recommender, Graph TransFormer (GFormer), offers parameterized collaborative rationale discovery for selective augmentation while preserving global-aware user-item relationships. In GFormer, the rationale-aware SSL inspires graph collaborative filtering with task-adaptive invariant rationalization in the graph transformer. Experimental results show that GFormer consistently improves performance over baselines on different datasets. Several in-depth experiments further investigate the invariant rationale-aware augmentation from various aspects. The source code for this work is publicly available at: https://github.com/HKUDS/GFormer.
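As an illustration only (not the paper's actual implementation), the rationale-aware selective augmentation described in the abstract can be sketched as ranking user-item edges by a rationale score and masking the low-scoring ones as reconstruction targets for the generative (masked-autoencoder) pretext task. Here `score_fn` is a hypothetical stand-in for GFormer's learned rationale scorer:

```python
def rationale_mask(edges, score_fn, keep_ratio=0.5):
    """Rank user-item edges by a rationale score and split them:
    high-rationale edges stay in the input graph, low-rationale
    edges are masked and become reconstruction targets."""
    ranked = sorted(edges, key=score_fn, reverse=True)
    k = max(1, int(len(ranked) * keep_ratio))
    kept = ranked[:k]    # high-rationale edges kept as augmented input
    masked = ranked[k:]  # low-rationale edges to reconstruct
    return kept, masked

# Toy example: the third tuple element stands in for a learned score.
edges = [("u1", "i1", 0.9), ("u1", "i2", 0.2),
         ("u2", "i1", 0.7), ("u2", "i3", 0.4)]
kept, masked = rationale_mask(edges, score_fn=lambda e: e[2], keep_ratio=0.5)
```

In GFormer itself the scoring is parameterized and trained jointly with the recommendation task; this sketch only shows the selective-masking idea.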
Persistent Identifier: http://hdl.handle.net/10722/355945
ISI Accession Number ID: WOS:001118084001073

 

DC Field / Value
dc.contributor.author: Li, Chaoliu
dc.contributor.author: Xia, Lianghao
dc.contributor.author: Ren, Xubin
dc.contributor.author: Ye, Yaowen
dc.contributor.author: Xu, Yong
dc.contributor.author: Huang, Chao
dc.date.accessioned: 2025-05-19T05:46:49Z
dc.date.available: 2025-05-19T05:46:49Z
dc.date.issued: 2023
dc.identifier.citation: SIGIR 2023 - Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2023, p. 1680-1689
dc.identifier.uri: http://hdl.handle.net/10722/355945
dc.description.abstract: This paper presents a novel approach to representation learning in recommender systems by integrating generative self-supervised learning with a graph transformer architecture. We highlight the importance of high-quality data augmentation with relevant self-supervised pretext tasks for improving performance. Towards this end, we propose a new approach that automates the self-supervision augmentation process through rationale-aware generative SSL that distills informative user-item interaction patterns. Our proposed recommender, Graph TransFormer (GFormer), offers parameterized collaborative rationale discovery for selective augmentation while preserving global-aware user-item relationships. In GFormer, the rationale-aware SSL inspires graph collaborative filtering with task-adaptive invariant rationalization in the graph transformer. Experimental results show that GFormer consistently improves performance over baselines on different datasets. Several in-depth experiments further investigate the invariant rationale-aware augmentation from various aspects. The source code for this work is publicly available at: https://github.com/HKUDS/GFormer.
dc.language: eng
dc.relation.ispartof: SIGIR 2023 - Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval
dc.subject: Graph Transformer
dc.subject: Masked Autoencoder
dc.subject: Recommendation
dc.title: Graph Transformer for Recommendation
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1145/3539618.3591723
dc.identifier.scopus: eid_2-s2.0-85167676757
dc.identifier.spage: 1680
dc.identifier.epage: 1689
dc.identifier.isi: WOS:001118084001073
