Conference Paper: Quaternion knowledge graph embeddings

Title: Quaternion knowledge graph embeddings
Authors: Zhang, Shuai; Tay, Yi; Yao, Lina; Liu, Qi
Issue Date: 2019
Citation: 33rd Annual Conference on Neural Information Processing Systems (NeurIPS 2019), Vancouver, 8-14 December 2019. In Advances in Neural Information Processing Systems, 2019, v. 32
Abstract: In this work, we move beyond the traditional complex-valued representations, introducing more expressive hypercomplex representations to model entities and relations for knowledge graph embeddings. More specifically, quaternion embeddings, hypercomplex-valued embeddings with three imaginary components, are utilized to represent entities. Relations are modelled as rotations in the quaternion space. The advantages of the proposed approach are: (1) latent inter-dependencies (between all components) are aptly captured with the Hamilton product, encouraging a more compact interaction between entities and relations; (2) quaternions enable expressive rotation in four-dimensional space and have more degrees of freedom than rotation in the complex plane; (3) the proposed framework is a generalization of ComplEx to hypercomplex space while offering better geometrical interpretations, concurrently satisfying the key desiderata of relational representation learning (i.e., modeling symmetry, anti-symmetry and inversion). Experimental results demonstrate that our method achieves state-of-the-art performance on four well-established knowledge graph completion benchmarks.
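The core operation the abstract describes (relations as rotations applied via the Hamilton product) can be sketched as follows. This is a minimal illustration, not the paper's implementation: each entity and relation here is a single quaternion, whereas in the paper embeddings are k-dimensional quaternion vectors with the Hamilton product applied component-wise, and the function names are assumptions of this sketch.

```python
import numpy as np

def hamilton_product(q, p):
    """Hamilton product of two quaternions given as (a, b, c, d)
    for a + b*i + c*j + d*k. Note it is non-commutative."""
    a1, b1, c1, d1 = q
    a2, b2, c2, d2 = p
    return np.array([
        a1 * a2 - b1 * b2 - c1 * c2 - d1 * d2,  # real part
        a1 * b2 + b1 * a2 + c1 * d2 - d1 * c2,  # i component
        a1 * c2 - b1 * d2 + c1 * a2 + d1 * b2,  # j component
        a1 * d2 + b1 * c2 - c1 * b2 + d1 * a2,  # k component
    ])

def quate_score(head, rel, tail):
    """Illustrative scoring: normalize the relation to a unit quaternion
    (a pure rotation), rotate the head with the Hamilton product, then
    take the inner product with the tail."""
    rel = rel / np.linalg.norm(rel)
    rotated = hamilton_product(head, rel)
    return float(np.dot(rotated, tail))
```

Normalizing the relation to unit length is what makes the operation a rotation: multiplying by a unit quaternion preserves the norm of the head embedding, so the score depends only on relative orientation.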
Persistent Identifier: http://hdl.handle.net/10722/321897
ISSN: 1049-5258
2020 SCImago Journal Rankings: 1.399
ISI Accession Number ID: WOS:000534424302070

 

DC Field: Value
dc.contributor.author: Zhang, Shuai
dc.contributor.author: Tay, Yi
dc.contributor.author: Yao, Lina
dc.contributor.author: Liu, Qi
dc.date.accessioned: 2022-11-03T02:22:12Z
dc.date.available: 2022-11-03T02:22:12Z
dc.date.issued: 2019
dc.identifier.citation: 33rd Annual Conference on Neural Information Processing Systems (NeurIPS 2019), Vancouver, 8-14 December 2019. In Advances in Neural Information Processing Systems, 2019, v. 32
dc.identifier.issn: 1049-5258
dc.identifier.uri: http://hdl.handle.net/10722/321897
dc.description.abstract: In this work, we move beyond the traditional complex-valued representations, introducing more expressive hypercomplex representations to model entities and relations for knowledge graph embeddings. More specifically, quaternion embeddings, hypercomplex-valued embeddings with three imaginary components, are utilized to represent entities. Relations are modelled as rotations in the quaternion space. The advantages of the proposed approach are: (1) latent inter-dependencies (between all components) are aptly captured with the Hamilton product, encouraging a more compact interaction between entities and relations; (2) quaternions enable expressive rotation in four-dimensional space and have more degrees of freedom than rotation in the complex plane; (3) the proposed framework is a generalization of ComplEx to hypercomplex space while offering better geometrical interpretations, concurrently satisfying the key desiderata of relational representation learning (i.e., modeling symmetry, anti-symmetry and inversion). Experimental results demonstrate that our method achieves state-of-the-art performance on four well-established knowledge graph completion benchmarks.
dc.language: eng
dc.relation.ispartof: Advances in Neural Information Processing Systems
dc.title: Quaternion knowledge graph embeddings
dc.type: Conference_Paper
dc.description.nature: link_to_OA_fulltext
dc.identifier.scopus: eid_2-s2.0-85090173593
dc.identifier.volume: 32
dc.identifier.isi: WOS:000534424302070
