Multi-Scale Contrastive Siamese Networks for Self-Supervised Graph Representation Learning

Abstract

Graph representation learning plays a vital role in processing graph-structured data. However, most existing graph representation learning methods rely heavily on label information. To overcome this problem, inspired by the recent success of graph contrastive learning and Siamese networks in visual representation learning, we propose a novel self-supervised approach to learn node representations by enhancing Siamese self-distillation with multi-scale contrastive learning. Specifically, we first generate two augmented views of the input graph from local and global perspectives. We then employ two objectives, cross-view and cross-network contrastive learning, to maximize the agreement between node representations across the different views and networks. To demonstrate the effectiveness of our approach, we perform empirical experiments on five real-world datasets. Our method not only achieves new state-of-the-art results but also surpasses several semi-supervised counterparts by large margins.
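The multi-scale objective can be illustrated with a minimal sketch (not the authors' implementation): assuming node embeddings `h1_online` and `h2_online` from an online encoder, and `h1_target` and `h2_target` from a momentum-updated target encoder over the two augmented views, an InfoNCE-style loss combines a cross-network term (online vs. target across views) with a cross-view term (the two views within the online network). The variable names, temperature, and exact loss form here are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(anchor, positive, temperature=0.5):
    """InfoNCE-style loss: for each node, the same node in the other
    view/network is the positive; all other nodes act as negatives."""
    a = F.normalize(anchor, dim=1)
    p = F.normalize(positive, dim=1)
    logits = a @ p.t() / temperature                        # [N, N] similarity matrix
    labels = torch.arange(a.size(0), device=anchor.device)  # positives on the diagonal
    return F.cross_entropy(logits, labels)

def multi_scale_loss(h1_online, h2_online, h1_target, h2_target):
    # Cross-network term: online embeddings of one view against
    # target (momentum) embeddings of the other view; the target
    # branch is detached so gradients flow only through the online encoder.
    l_cn = 0.5 * (contrastive_loss(h1_online, h2_target.detach())
                  + contrastive_loss(h2_online, h1_target.detach()))
    # Cross-view term: the two augmented views within the online network.
    l_cv = 0.5 * (contrastive_loss(h1_online, h2_online)
                  + contrastive_loss(h2_online, h1_online))
    return l_cn + l_cv

# Toy usage: random embeddings for 4 nodes with 8-dimensional features.
torch.manual_seed(0)
views = [torch.randn(4, 8) for _ in range(4)]
print(multi_scale_loss(*views))
```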

Publication
International Joint Conference on Artificial Intelligence, IJCAI-21
Ming Jin
Lecturer of AI

My research interests include machine learning and artificial intelligence.

Yizhen Zheng
PhD Student @ Monash (07/2021-)

My research interests include machine learning and artificial intelligence.

Shirui Pan
Professor and ARC Future Fellow

My research interests include data mining, machine learning, and graph analysis.