288@2022@IJCAI

Total: 1

#1 MERIT: Learning Multi-level Representations on Temporal Graphs

Authors: Binbin Hu; Zhengwei Wu; Jun Zhou; Ziqi Liu; Zhigang Huangfu; Zhiqiang Zhang; Chaochao Chen

Recently, representation learning on temporal graphs has drawn increasing attention; it aims to learn temporal patterns that characterize the evolving nature of dynamic graphs in real-world applications. Despite their effectiveness, existing methods commonly ignore the individual- and combinatorial-level patterns derived from different types of interactions (e.g., user-item), which are at the heart of representation learning on temporal graphs. To fill this gap, we propose MERIT, a novel multi-level graph attention network for inductive representation learning on temporal graphs. We adaptively embed the original timestamps into a higher-dimensional continuous space to learn individual-level periodicity through a Personalized Time Encoding (PTE) module. Furthermore, we equip MERIT with a Continuous time and Context aware Attention (Coco-Attention) mechanism, which chronologically locates the most relevant neighbors by jointly capturing multi-level context on temporal graphs. Finally, MERIT performs multiple aggregations and propagations to explore and exploit high-order structural information for downstream tasks. Extensive experiments on four public datasets demonstrate the effectiveness of MERIT on both (inductive / transductive) link prediction and node classification tasks.
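
The abstract mentions two reusable ingredients: encoding raw timestamps into a higher-dimensional continuous space, and attending over a node's temporal neighbors using both their content and the encoded time gaps. The sketch below only illustrates these generic ideas and is not the authors' MERIT implementation; every class, function, and parameter name (TimeEncoding, TemporalGraphAttention, feat_dim, time_dim, etc.) is a hypothetical stand-in.

```python
# Minimal illustrative sketch (not the authors' code): a learnable time
# encoding plus a temporal attention aggregation over neighbors.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TimeEncoding(nn.Module):
    """Maps a scalar time gap t to a d-dimensional vector cos(t * w + b)
    with learnable frequencies w and phases b (a common functional
    time-encoding scheme)."""

    def __init__(self, dim: int):
        super().__init__()
        self.w = nn.Parameter(torch.randn(dim))   # learnable frequencies
        self.b = nn.Parameter(torch.zeros(dim))   # learnable phases

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        # t: [..., 1] time gaps; output: [..., dim]
        return torch.cos(t * self.w + self.b)


class TemporalGraphAttention(nn.Module):
    """Aggregates neighbor features with attention scores that depend on
    both neighbor content and the encoded time gap to the target node."""

    def __init__(self, feat_dim: int, time_dim: int, hidden_dim: int):
        super().__init__()
        self.time_enc = TimeEncoding(time_dim)
        self.q = nn.Linear(feat_dim + time_dim, hidden_dim)
        self.k = nn.Linear(feat_dim + time_dim, hidden_dim)
        self.v = nn.Linear(feat_dim + time_dim, hidden_dim)

    def forward(self, node_feat, neigh_feat, neigh_dt):
        # node_feat:  [B, F]      target node features
        # neigh_feat: [B, N, F]   temporal neighbor features
        # neigh_dt:   [B, N, 1]   time gap between target and each neighbor
        zero_dt = torch.zeros(node_feat.size(0), 1, device=node_feat.device)
        q_in = torch.cat([node_feat, self.time_enc(zero_dt)], dim=-1)    # [B, F+T]
        k_in = torch.cat([neigh_feat, self.time_enc(neigh_dt)], dim=-1)  # [B, N, F+T]

        q = self.q(q_in).unsqueeze(1)                    # [B, 1, H]
        k = self.k(k_in)                                 # [B, N, H]
        v = self.v(k_in)                                 # [B, N, H]

        scores = (q * k).sum(-1) / k.size(-1) ** 0.5     # [B, N]
        attn = F.softmax(scores, dim=-1).unsqueeze(-1)   # [B, N, 1]
        return (attn * v).sum(1)                         # [B, H] aggregated embedding
```

Stacking several such layers would propagate information over multi-hop temporal neighborhoods, which is in the spirit of the "multiple aggregations and propagations" the abstract refers to, though the actual multi-level design of MERIT is described only in the paper itself.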