Conference Paper: Dynamic Representation Learning for Large-Scale Attributed Networks

Title: Dynamic Representation Learning for Large-Scale Attributed Networks
Authors: Liu, Zhijun; Huang, Chao; Yu, Yanwei; Song, Peng; Fan, Baode; Dong, Junyu
Keywords: dynamic networks; large-scale attributed networks; network representation learning; sparse random projection
Issue Date: 2020
Citation: International Conference on Information and Knowledge Management, Proceedings, 2020, p. 1005-1014
Abstract: Network embedding, which aims at learning low-dimensional representations of the nodes in a network, has drawn much attention for network mining tasks ranging from link prediction to node classification. In addition to topological information, networks carry rich attributes associated with their structure, which exert a large effect on network formation; hence, many efforts have been devoted to attributed network embedding. However, these methods are limited by their assumption of static network data: they account for neither evolving network structure nor changes in the associated attributes. Furthermore, scalability is a key concern when performing representation learning on large-scale networks with huge numbers of nodes and edges. In this work, we address these challenges by developing DRLAN, a Dynamic Representation Learning framework for large-scale Attributed Networks. DRLAN generalizes dynamic attributed network embedding from two perspectives. First, we develop an integrative learning framework with an offline batch embedding module that preserves both node and attribute proximities, and an online network embedding model that recursively updates the learned representation vectors. Second, we design a recursive pre-projection mechanism that efficiently models attribute correlations by exploiting the associative property of matrix multiplication. Finally, we perform extensive experiments on three real-world network datasets to show the superiority of DRLAN over state-of-the-art network embedding techniques in terms of both effectiveness and efficiency. The source code is available at: https://github.com/ZhijunLiu95/DRLAN.
Persistent Identifier: http://hdl.handle.net/10722/308831
ISI Accession Number ID: WOS:000749561301001
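The abstract's pre-projection idea rests on matrix associativity: because (A X) R = A (X R), the attribute matrix X can be projected through a sparse random matrix R once, after which structural updates only touch the much smaller pre-projected factor. The sketch below is an illustration of that generic trick, not DRLAN's actual algorithm; the matrices A (adjacency), X (node attributes), and R (an Achlioptas-style sparse random projection) are hypothetical stand-ins.

```python
import numpy as np
from scipy import sparse

rng = np.random.default_rng(0)
n, f, d = 1000, 500, 32  # nodes, attribute dims, embedding dims

# Toy sparse adjacency and node-attribute matrices (illustrative only).
A = sparse.random(n, n, density=0.01, format="csr", random_state=0)
X = sparse.random(n, f, density=0.05, format="csr", random_state=1)

# Sparse random projection: entries in {-1, 0, +1}, zero with
# probability 1 - 1/s, scaled by sqrt(s/d) (Achlioptas-style).
s = 3.0
R = rng.choice([-1.0, 0.0, 1.0], size=(f, d),
               p=[1 / (2 * s), 1 - 1 / s, 1 / (2 * s)]) * np.sqrt(s / d)

# Associativity: (A @ X) @ R == A @ (X @ R).  Pre-projecting X once
# means later recomputations multiply by an n-by-d factor instead of
# the full n-by-f attribute matrix.
XR = X @ R                # precompute once, shape (n, d)
emb_fast = A @ XR         # cheap path, reused as A evolves
emb_slow = (A @ X) @ R    # naive path, for comparison

assert np.allclose(emb_fast, emb_slow)
```

The two paths agree up to floating-point error; the payoff is that when the structure A changes but the attributes X do not, only the small `A @ XR` product needs recomputing.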

 

DC Field: Value
dc.contributor.author: Liu, Zhijun
dc.contributor.author: Huang, Chao
dc.contributor.author: Yu, Yanwei
dc.contributor.author: Song, Peng
dc.contributor.author: Fan, Baode
dc.contributor.author: Dong, Junyu
dc.date.accessioned: 2021-12-08T07:50:13Z
dc.date.available: 2021-12-08T07:50:13Z
dc.date.issued: 2020
dc.identifier.citation: International Conference on Information and Knowledge Management, Proceedings, 2020, p. 1005-1014
dc.identifier.uri: http://hdl.handle.net/10722/308831
dc.language: eng
dc.relation.ispartof: International Conference on Information and Knowledge Management, Proceedings
dc.subject: dynamic networks
dc.subject: large-scale attributed networks
dc.subject: network representation learning
dc.subject: sparse random projection
dc.title: Dynamic Representation Learning for Large-Scale Attributed Networks
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1145/3340531.3411945
dc.identifier.scopus: eid_2-s2.0-85095866400
dc.identifier.spage: 1005
dc.identifier.epage: 1014
dc.identifier.isi: WOS:000749561301001
