
OpenTLP

This repository is the open-source project of the survey paper "Temporal Link Prediction: A Unified Framework, Taxonomy, and Review", which has been accepted by ACM Computing Surveys. It refactors or implements some representative temporal link prediction (TLP, a.k.a. dynamic link prediction) methods (especially those that do not provide their source code) based on the unified encoder-decoder framework and terminologies (regarding task settings, taxonomy, etc.) introduced in our survey paper. In addition, this repository summarizes some other open-source projects regarding TLP. We will keep updating this repository to include other (SOTA or classic) TLP methods, task settings, dynamic graph datasets, etc.

Note that this repository is not the official implementation of the related methods. Some of the implemented TLP approaches may also need careful parameter tuning to achieve their best performance on different datasets and task settings.

Abstract

Dynamic graphs serve as a generic abstraction and description of the evolutionary behaviors of various complex systems (e.g., social networks and communication networks). Temporal link prediction (TLP) is a classic yet challenging inference task on dynamic graphs, which predicts possible future linkage based on historical topology. The predicted future topology can be used to support some advanced applications on real-world systems (e.g., resource pre-allocation) for better system performance. This survey provides a comprehensive review of existing TLP methods. Concretely, we first give the formal problem statements and preliminaries regarding data models, task settings, and learning paradigms that are commonly used in related research. A hierarchical fine-grained taxonomy is further introduced to categorize existing methods in terms of their data models, learning paradigms, and techniques. From a generic perspective, we propose a unified encoder-decoder framework to formulate all the methods reviewed, where different approaches only differ in terms of some components of the framework. Moreover, we envision serving the community with an open-source project OpenTLP that refactors or implements some representative TLP methods using the proposed unified framework and summarizes other public resources. As a conclusion, we finally discuss advanced topics in recent research and highlight possible future directions.

Citing

If you find this project useful for your research, please cite our survey paper.

@article{qin2023temporal,
  title={Temporal link prediction: A unified framework, taxonomy, and review},
  author={Qin, Meng and Yeung, Dit-Yan},
  journal={ACM Computing Surveys},
  volume={56},
  number={4},
  pages={1--40},
  year={2023},
  publisher={ACM New York, NY, USA}
}

If you have any questions regarding this repository, you can contact the author via [mengqin_az@foxmail.com].


Notations

  • ESSD - Evenly-Sampled Snapshot Sequence Description, a.k.a. Discrete-Time Dynamic Graph (DTDG) in other literature.

  • UESD - Unevenly-Sampled Edge Sequence Description, a.k.a. Continuous-Time Dynamic Graph (CTDG) in other literature.

(We argue that CTDG cannot precisely describe the corresponding data model. Although the time index associated with each edge is defined on a continuous domain in the second data model, the edge sequence used to encode the dynamic topology is still discrete. The term 'continuous description' may therefore be ambiguous in some cases, so we use ESSD & UESD instead of DTDG & CTDG. See the sketch after this list for a concrete comparison of the two data models.)

  • DI - Direct Inference.

  • OTI - Online Training & Inference.

  • OTOG - Offline Training & Online Generalization.

  • HQ - Able to Derive High-Quality Prediction Results for Weighted TLP.

  • D/L-Dep - the original version of a method does not support weighted TLP, but it can easily be extended to this setting by substituting an appropriate decoder or loss function.
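
To make the difference between ESSD and UESD concrete, the following minimal sketch (illustrative only; none of the names below come from the repository code) builds both representations for the same toy dynamic graph: ESSD stores a sequence of evenly-sampled adjacency matrices, while UESD keeps the raw timestamped (and possibly weighted) edge sequence.

```python
import numpy as np

# Toy dynamic graph with 4 nodes; each event is (source, target, weight, timestamp).
# All names below are illustrative and are not taken from the OpenTLP code base.
events = [(0, 1, 1.0, 0.3), (1, 2, 2.0, 1.2), (2, 3, 1.5, 1.8), (0, 3, 0.5, 2.6)]
num_nodes = 4

# UESD (a.k.a. CTDG): keep the raw, unevenly-sampled edge sequence ordered by time.
uesd = sorted(events, key=lambda e: e[3])

# ESSD (a.k.a. DTDG): aggregate edges into evenly-sampled snapshots of width `window`.
window = 1.0  # sampling interval of each snapshot
num_snapshots = int(np.ceil(max(t for *_, t in events) / window))
essd = np.zeros((num_snapshots, num_nodes, num_nodes))
for src, dst, w, t in events:
    k = min(int(t // window), num_snapshots - 1)  # snapshot index of this edge
    essd[k, src, dst] = w  # weighted adjacency matrix of the k-th snapshot

print(len(uesd), essd.shape)  # 4 timestamped edges vs. a (3, 4, 4) snapshot tensor
```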

Implemented or Refactored TLP Methods

We implement some representative TLP methods using MATLAB and PyTorch, with the corresponding code and data placed under the directories './Matlab' and './Python'.

For each method implemented in MATLAB, [method_name].m provides the functions that define the encoder and decoder, where the loss function is derived during the optimization of the encoder. [method_name]_demo.m demonstrates how to use these functions. To run the demonstration code, please first unzip data.zip under ./Matlab/data.

For each method implemented in PyTorch, a set of classes defining the encoder, decoder, and loss function is placed under the directory './Python/[method_name]'. Furthermore, [method_name].py demonstrates how to use these classes. To run the demonstration code, please first unzip data.zip under ./Python/data. In particular, these PyTorch implementations can be accelerated with GPUs.
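
As a rough sketch of how such an encoder-decoder organization can look for a snapshot-level (ESSD, OTOG) method, the toy PyTorch example below uses an RNN-based encoder that maps historical snapshots to node embeddings, an inner-product decoder that outputs the predicted next weighted adjacency matrix, and an MSE loss. All class names, shapes, and hyper-parameters are hypothetical and do not necessarily match the classes under './Python/[method_name]'.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of the unified encoder-decoder framework; the concrete
# encoders/decoders of each method in this repository differ from this toy example.
class ToyEncoder(nn.Module):
    """Encode a sequence of historical snapshots into node embeddings."""
    def __init__(self, num_nodes, emb_dim):
        super().__init__()
        self.rnn = nn.GRU(input_size=num_nodes, hidden_size=emb_dim, batch_first=True)

    def forward(self, snapshots):   # snapshots: (num_nodes, window_size, num_nodes)
        _, h = self.rnn(snapshots)  # each node's adjacency row sequence is one RNN input
        return h.squeeze(0)         # (num_nodes, emb_dim) node embeddings

class ToyDecoder(nn.Module):
    """Decode node embeddings into a predicted weighted adjacency matrix."""
    def forward(self, emb):
        return torch.relu(emb @ emb.t())  # inner-product decoder for weighted TLP

num_nodes, emb_dim = 32, 16
encoder, decoder = ToyEncoder(num_nodes, emb_dim), ToyDecoder()
hist = torch.rand(num_nodes, 5, num_nodes)   # 5 historical snapshots (toy random data)
target = torch.rand(num_nodes, num_nodes)    # ground-truth next snapshot (toy random data)
pred = decoder(encoder(hist))                # predicted next weighted adjacency matrix
loss = nn.functional.mse_loss(pred, target)  # loss for the weighted TLP setting
loss.backward()
```

Substituting the inner-product decoder and MSE loss with, e.g., a sigmoid-scored decoder and a binary cross-entropy loss is the kind of extension referred to by the D/L-Dep notation above.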

Details of the implemented TLP methods are summarized as follows.

| Methods | Publication Venues | Data Models | Paradigms | Level | Attributes | Weighted TLP | Language |
| --- | --- | --- | --- | --- | --- | --- | --- |
| CRJMF [1] | CIKM 2011 | ESSD | OTI | 1 | Static | Able | MATLAB |
| GrNMF [2] | Physica A 2018 | ESSD | OTI | 1 | N/A | Able | MATLAB |
| DeepEye [3] | BDMA 2018 | ESSD | OTI | 1 | N/A | Able | MATLAB |
| AM-NMF [4] | SIGCOMM 2018 NetAI WKSP | ESSD | OTI | 1 | N/A | Able | MATLAB |
| TMF [5] | WSDM 2017 | ESSD | OTI | 1 | N/A | Able | Python |
| LIST [6] | IJCAI 2017 | ESSD | OTI | 1 | N/A | Able | Python |
| Dyngraph2vec [7] | KBS 2020 | ESSD | OTOG | 1 | N/A | Able | Python |
| DDNE [8] | IEEE Access 2018 | ESSD | OTOG | 1 | N/A | D/L-Dep | Python |
| E-LSTM-D [9] | Trans on SMC: Sys 2019 | ESSD | OTOG | 1 | N/A | Able | Python |
| GCN-GAN [10] | InfoCom 2019 | ESSD | OTOG | 1 | N/A | Able (HQ) | Python |
| NetworkGAN [11] | Trans on Cyb 2019 | ESSD | OTOG | 1 | N/A | Able (HQ) | Python |
| STGSN [12] | KBS 2021 | ESSD | OTOG | 2 | Dynamic | Able | Python |

Other Open Source Projects Regarding TLP

| Methods | Publication Venues | Data Models | Paradigms | Level | Attributes | Weighted TLP |
| --- | --- | --- | --- | --- | --- | --- |
| TLSI [13] (Code) | TKDE 2016 | ESSD | OTI | 1 | N/A | Able |
| MLjFE [14] (Code) | Pattern Recognition 2022 | ESSD | OTI | 1 | N/A | Able |
| E-LSTM-D [9] (Code) | Trans on SMC: Sys 2019 | ESSD | OTOG | 1 | N/A | Able |
| EvolveGCN [15] (Code) | AAAI 2020 | ESSD | OTOG | 2 | Dynamic | D/L-Dep |
| CTDNE [16] (Code) | WWW 2018 Companion | UESD | OTOG | 1 | N/A | Unable |
| M2DNE [17] (Code) | CIKM 2019 | UESD | OTOG | 1 | N/A | Unable |
| DyRep [18] (Code) | ICLR 2019 | UESD | OTOG | 2 | Dynamic | Unable |
| TGAT [19] (Code) | ICLR 2020 | UESD | OTOG | 2 | Dynamic | Unable |
| CAW [20] (Code) | ICLR 2021 | UESD | OTOG | 2 | Dynamic | Unable |
| DySAT [21] (Code) | WSDM 2020 | ESSD | OTOG | 2 | Dynamic | Unable |
| TREND [22] (Code) | WWW 2022 | UESD | OTOG | 2 | Static | Unable |
| DyGNN [23] (Code) | SIGIR 2020 | UESD | OTOG | 1 | N/A | Unable |
| IDEA [24] (Code) | TKDE 2023 | ESSD | OTOG | 2 | Static | Able (HQ) |
| GSNOP [25] (Code) | WSDM 2023 | UESD | OTOG | 2 | Dynamic | Unable |
| TGN [26] (Code) | ICML 2020 WKSP | UESD | OTOG | 2 | Dynamic | Unable |
| MNCI [27] (Code) | SIGIR 2021 | UESD | OTOG | 2 | N/A | Unable |
| TDGNN [28] (Code) | WWW 2020 | UESD | OTOG | 1 | Static | Unable |
| HTNE [29] (Code) | KDD 2018 | UESD | OTI | 1 | N/A | Unable |
| SGR [30] (Code) | TKDE 2023 | ESSD | OTOG | 1 | N/A | Able |
| EdgeBank [31] (Code) | NIPS 2022 | UESD | OTOG | 2 | Dynamic | Unable |
| GraphMixer [32] (Code) | ICLR 2023 | UESD | OTOG | 2 | Dynamic | Unable |
| DyGFormer [33] (Code) | NIPS 2023 | UESD | OTOG | 2 | Dynamic | Unable |

Public Datasets for TLP

| Datasets | Scenarios | Nodes | Edges | Weighted Links | Min. Time Granularity | Data Models | Levels |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Social Evolution | Social network | Cell phones | Bluetooth signals, calls, or messages between cell phones | No | 1 min | UESD | 2 |
| CollegeMsg | Online social network | App users | Messages sent from a source user to a destination user | No | 1 sec | UESD | 2 |
| Wiki-Talk | Online social network | Wikipedia users | Relations in which a user edits another user's talk page | No | 1 sec | UESD | 2 |
| Enron | Email network | Email users | Emails from a source user to a destination user | No | 1 sec | UESD | 2 |
| Reddit-Hyperlink | Hyperlink network | Subreddits | Hyperlinks from one subreddit to another | No | 1 sec | UESD | 2 |
| DBLP | Paper collaboration network | Paper authors | Collaboration relations between authors | No | 1 day | UESD | 2 |
| AS-733 | BGP autonomous systems of the Internet | BGP routers | Who-talks-to-whom communication between routers | No | 1 day | ESSD | 2 |
| Bitcoin-Alpha | Bitcoin transaction network | Bitcoin users | Trust scores between users | Yes | 1 sec | UESD | 2 |
| Bitcoin-OTC | Bitcoin transaction network | Bitcoin users | Trust scores between users | Yes | 1 sec | UESD | 2 |
| UCSB-Mesh | Wireless mesh network | Wireless routers | Link quality (in terms of expected transmission time) between routers | Yes | 1 min | ESSD | 1 |
| NumFabric | (Simulated) data center network | Host servers | Traffic (in KB) between host servers | Yes | 1e-6 sec | UESD | 1 |
| UCSD-WTD | WiFi mobility network | Access points/PDA devices | Signal strength (in dBm) between access points and PDA devices | Yes | 1 sec | UESD | 2 |
| UNSW-IoT | IoT network | IoT devices | Traffic (in KB) between IoT devices | Yes | 1e-6 sec | UESD | 2 |
| WIDE | Internet backbone | Host servers/user devices | Traffic (in KB) between servers/devices | Yes | 1e-6 sec | UESD | 2 |

Other Related Survey Papers

Dynamic Network Embedding (DNE):

(Note that there remain some gaps between the research on DNE and TLP. On the one hand, some classic TLP approaches (e.g., neighbor similarity & graph summarization) are not based on the DNE framework. On the other hand, some DNE models are task-dependent, with specific model architectures and objectives designed only for TLP. Moreover, most task-independent DNE techniques can only support simple TLP settings based on some common but naive strategies, e.g., treating the prediction of unweighted links as binary edge classification. Existing survey papers on DNE lack detailed discussions regarding whether and how a DNE method can be used to handle different settings of TLP.)
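
For instance, the 'binary edge classification' strategy mentioned above typically scores a candidate node pair by combining two node embeddings produced by some DNE method and mapping the score to a link probability, whereas weighted TLP requires a regression-style decoder that outputs an edge weight. The sketch below (purely illustrative; the embeddings and decoders are hypothetical) contrasts the two settings.

```python
import numpy as np

rng = np.random.default_rng(0)
emb = rng.normal(size=(100, 16))  # node embeddings from some (hypothetical) DNE method

def predict_unweighted(u, v):
    """Unweighted TLP as binary edge classification: sigmoid of an inner product."""
    score = emb[u] @ emb[v]
    return 1.0 / (1.0 + np.exp(-score))  # probability that edge (u, v) appears

def predict_weighted(u, v):
    """Weighted TLP needs a regression-style decoder that outputs an edge weight."""
    return max(emb[u] @ emb[v], 0.0)  # non-negative predicted weight (toy choice)

print(predict_unweighted(3, 7), predict_weighted(3, 7))
```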

Temporal Link Prediction (TLP):

Advanced Applications Supported by TLP

  • Intrusion Detection, Threat Prediction, & Lateral Movement Inference for Cyber Security

  • Channel Allocation in Wireless IoT Networks

  • Burst Traffic Detection & Dynamic Routing in Optical Networks

  • Dynamic Routing & Topology Control in Mobile Ad Hoc Networks

  • Dynamics Simulation & Conformational Analysis of Molecules

  • Traffic Demand Prediction in Urban Computing

  • Log Anomaly Detection in Distributed Computing

  • Trust Evaluation in Bitcoin Transactions

  • Dynamic Scene Modeling

References

[1] Gao, Sheng, Ludovic Denoyer, and Patrick Gallinari. Temporal Link Prediction by Integrating Content and Structure Information. ACM CIKM, 2011.

[2] Ma, Xiaoke, Penggang Sun, and Yu Wang. Graph Regularized Nonnegative Matrix Factorization for Temporal Link Prediction in Dynamic Networks. Physica A: Statistical Mechanics & Its Applications 496 (2018): 121-136.

[3] Ahmed, Nahla Mohamed, et al. DeepEye: Link Prediction in Dynamic Networks Based on Non-Negative Matrix Factorization. Big Data Mining & Analytics 1.1 (2018): 19-33.

[4] Qin, Meng, et al. Adaptive Multiple Non-Negative Matrix Factorization for Temporal Link Prediction in Dynamic Networks. ACM SIGCOMM Workshop on Network Meets AI & ML, 2018.

[5] Yu, Wenchao, Charu C. Aggarwal, and Wei Wang. Temporally Factorized Network Modeling for Evolutionary Network Analysis. ACM WSDM, 2017.

[6] Yu, Wenchao, et al. Link Prediction with Spatial and Temporal Consistency in Dynamic Networks. IJCAI, 2017.

[7] Goyal, Palash, Sujit Rokka Chhetri, and Arquimedes Canedo. Dyngraph2vec: Capturing Network Dynamics Using Dynamic Graph Representation Learning. Knowledge-Based Systems 187 (2020): 104816.

[8] Li, Taisong, et al. Deep Dynamic Network Embedding for Link Prediction. IEEE Access 6 (2018): 29219-29230.

[9] Chen, Jinyin, et al. E-LSTM-D: A Deep Learning Framework for Dynamic Network Link Prediction. IEEE Transactions on Systems, Man, & Cybernetics: Systems 51.6 (2019): 3699-3712.

[10] Qin, Meng, et al. GCN-GAN: A Non-linear Temporal Link Prediction Model for Weighted Dynamic Networks. IEEE INFOCOM, 2019.

[11] Yang, Min, et al. An Advanced Deep Generative Framework for Temporal Link Prediction in Dynamic Networks. IEEE Transactions on Cybernetics 50.12 (2019): 4946-4957.

[12] Min, Shengjie, et al. STGSN—A Spatial–Temporal Graph Neural Network Framework for Time-Evolving Social Networks. Knowledge-Based Systems 214 (2021): 106746.

[13] Zhu, Linhong, et al. Scalable Temporal Latent Space Inference for Link Prediction in Dynamic Social Networks. IEEE Transactions on Knowledge & Data Engineering (TKDE) 28.10 (2016): 2765-2777.

[14] Ma, Xiaoke, et al. Joint Multi-Label Learning and Feature Extraction for Temporal Link Prediction. Pattern Recognition 121 (2022): 108216.

[15] Pareja, Aldo, et al. EvolveGCN: Evolving Graph Convolutional Networks for Dynamic Graphs. AAAI, 2020.

[16] Nguyen, Giang Hoang, et al. Continuous-Time Dynamic Network Embeddings. ACM Companion Proceedings of WWW, 2018.

[17] Lu, Yuanfu, et al. Temporal Network Embedding with Micro- and Macro-Dynamics. ACM CIKM, 2019.

[18] Trivedi, Rakshit, et al. DyRep: Learning Representations over Dynamic Graphs. ICLR, 2019.

[19] Xu, Da, et al. Inductive Representation Learning on Temporal Graphs. ICLR, 2020.

[20] Wang, Yanbang, et al. Inductive Representation Learning in Temporal Networks via Causal Anonymous Walks. ICLR, 2021.

[21] Sankar, Aravind, et al. DySAT: Deep Neural Representation Learning on Dynamic Graphs via Self-Attention Networks. ACM WSDM, 2020.

[22] Wen, Zhihao, and Yuan Fang. TREND: TempoRal Event and Node Dynamics for Graph Representation Learning. ACM WWW, 2022.

[23] Ma, Yao, et al. Streaming Graph Neural Networks. ACM SIGIR, 2020.

[24] Qin, Meng, et al. High-Quality Temporal Link Prediction for Weighted Dynamic Graphs Via Inductive Embedding Aggregation. IEEE TKDE, 2023.

[25] Luo, Linhao, Gholamreza Haffari, and Shirui Pan. Graph Sequential Neural ODE Process for Link Prediction on Dynamic and Sparse Graphs. ACM WSDM, 2023.

[26] Rossi, Emanuele, et al. Temporal Graph Networks for Deep Learning on Dynamic Graphs. ICML Workshop on Graph Representation Learning, 2020.

[27] Liu, Meng, and Yong Liu. Inductive Representation Learning in Temporal Networks via Mining Neighborhood and Community Influences. ACM SIGIR, 2021.

[28] Qu, Liang, et al. Continuous-Time Link Prediction via Temporal Dependent Graph Neural Network. ACM WWW, 2020.

[29] Zuo, Yuan, et al. Embedding Temporal Network via Neighborhood Formation. ACM KDD, 2018.

[30] Yin, Yanting, et al. Super Resolution Graph With Conditional Normalizing Flows for Temporal Link Prediction. IEEE TKDE, 2023.

[31] Poursafaei, Farimah, et al. Towards Better Evaluation for Dynamic Link Prediction. NIPS, 2022.

[32] Cong, Weilin, et al. Do We Really Need Complicated Model Architectures for Temporal Networks? ICLR, 2023.

[33] Yu, Le, et al. Towards Better Dynamic Graph Learning: New Architecture and Unified Library. NIPS, 2023.
