Dynamic Scale-free Graph Embedding via Self-attention

Authors

  • Dingyang Duan 1) Institute of Information Engineering, Chinese Academy of Sciences, Beijing, China 2) School of Cyber Security, University of Chinese Academy of Sciences, Beijing, China
  • Daren Zha 1) Institute of Information Engineering, Chinese Academy of Sciences, Beijing, China 2) School of Cyber Security, University of Chinese Academy of Sciences, Beijing, China
  • Xiao Yang 3) Aerospace Internet of Things Technology Co., Ltd, Beijing, China 4) China Aerospace Times Electronics Co., Ltd, Beijing, China
  • Jiahui Shen 1) Institute of Information Engineering, Chinese Academy of Sciences, Beijing, China 2) School of Cyber Security, University of Chinese Academy of Sciences, Beijing, China
  • Nan Mu 1) Institute of Information Engineering, Chinese Academy of Sciences, Beijing, China 2) School of Cyber Security, University of Chinese Academy of Sciences, Beijing, China

DOI:

https://doi.org/10.13052/jwe1540-9589.2214

Keywords:

Dynamic graphs, hyperbolic space, self-attention, representation learning

Abstract

Graph neural networks (GNNs) have recently become increasingly popular due to their ability to learn node representations in complex graphs. Existing graph representation learning methods mainly target static graphs in Euclidean space, whereas many graphs in practical applications are dynamic and evolve continuously over time. Recent work has also demonstrated that real-world graphs exhibit hierarchical properties, yet many methods do not account for these latent hierarchical structures. In this work, we propose DynHAT, a dynamic graph embedding model in hyperbolic space that leverages both hyperbolic geometry and a self-attention mechanism to learn node representations. More specifically, DynHAT captures hierarchical information by mapping the structural graph onto hyperbolic space, and captures time-varying dynamic evolution by flexibly weighting historical representations. Extensive experiments on three real-world datasets show that our model outperforms competing methods on a link prediction task when embedding dynamic graphs in hyperbolic space. In addition, our results show that embedding dynamic graphs in hyperbolic space remains competitive even at low embedding dimensions.
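The two ingredients described in the abstract can be illustrated with a minimal sketch: a standard exponential map that projects Euclidean (tangent) vectors onto the Poincaré ball, and a softmax-weighted fusion of a node's historical snapshot representations. This is an illustrative simplification, not the authors' actual DynHAT implementation; the function names `expmap0` and `temporal_self_attention` are hypothetical.

```python
import numpy as np

def expmap0(v, c=1.0):
    """Standard exponential map at the origin of the Poincare ball
    with curvature -c: maps a tangent vector into the open unit ball."""
    norm = np.linalg.norm(v, axis=-1, keepdims=True).clip(min=1e-9)
    return np.tanh(np.sqrt(c) * norm) * v / (np.sqrt(c) * norm)

def temporal_self_attention(H):
    """H: (T, d) historical representations of one node across T snapshots.
    Returns a (d,) vector: a softmax-weighted combination of the history,
    using the latest snapshot as the query (a simplified attention)."""
    d = H.shape[-1]
    scores = H @ H[-1] / np.sqrt(d)           # scaled dot-product scores
    w = np.exp(scores - scores.max())
    w /= w.sum()                              # softmax attention weights
    return w @ H                              # weighted history fusion

rng = np.random.default_rng(0)
H = np.stack([expmap0(rng.normal(size=8)) for _ in range(5)])  # 5 snapshots
z = temporal_self_attention(H)
print(z.shape)  # (8,)
```

Because the unit ball is convex, the convex combination `w @ H` of points inside the ball stays inside the ball; a faithful hyperbolic model would instead aggregate in the tangent space or via Möbius operations.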


Author Biographies

Dingyang Duan, 1) Institute of Information Engineering, Chinese Academy of Sciences, Beijing, China 2) School of Cyber Security, University of Chinese Academy of Sciences, Beijing, China

Dingyang Duan received his Master of Engineering degree from Beijing Jiaotong University in 2016. He is currently a Ph.D. candidate in the School of Cyber Security, University of Chinese Academy of Sciences, Beijing, China. He focuses on machine learning and deep learning. His current research interests are graph neural networks and knowledge graphs.

Daren Zha, 1) Institute of Information Engineering, Chinese Academy of Sciences, Beijing, China 2) School of Cyber Security, University of Chinese Academy of Sciences, Beijing, China

Daren Zha received his bachelor’s degree from the University of Science and Technology of China and his Ph.D. degree from the University of Chinese Academy of Sciences in 2010. He has led a number of projects under the national key R&D program on cyberspace security. He has published more than 20 papers in refereed journals and conferences. His current research interests include intelligent data processing for cyberspace security, big data storage, and cryptographic engineering.

Xiao Yang, 3) Aerospace Internet of Things Technology Co., Ltd, Beijing, China 4) China Aerospace Times Electronics Co., Ltd, Beijing, China

Xiao Yang received her Master of Engineering degree from Peking University in 2019. She focuses on machine learning and deep learning. Her current research interests are graph neural networks and computer vision.

Jiahui Shen, 1) Institute of Information Engineering, Chinese Academy of Sciences, Beijing, China 2) School of Cyber Security, University of Chinese Academy of Sciences, Beijing, China

Jiahui Shen received her Ph.D. degree from the University of Chinese Academy of Sciences in 2020. Her current research focuses on machine learning and knowledge graphs.

Nan Mu, 1) Institute of Information Engineering, Chinese Academy of Sciences, Beijing, China 2) School of Cyber Security, University of Chinese Academy of Sciences, Beijing, China

Nan Mu received his Ph.D. degree from the University of Chinese Academy of Sciences in 2021. His research interests include knowledge graphs and graph neural networks.

Published

2023-04-20

How to Cite

Duan, D., Zha, D., Yang, X., Shen, J., & Mu, N. (2023). Dynamic Scale-free Graph Embedding via Self-attention. Journal of Web Engineering, 22(01), 55–78. https://doi.org/10.13052/jwe1540-9589.2214

Issue

Section

ICWE2022