Generating Compressed Counterfactual Hard Negative Samples for Graph Contrastive Learning

Abstract
Graph contrastive learning (GCL) relies on high-quality positive and negative samples to learn the structural semantics of the input graph. Previous approaches typically draw negative samples from the same training batch or from an unrelated external graph, but such sampling is prone to producing false negatives. To address this limitation, this paper introduces a novel method called CGC, which uses a counterfactual mechanism to generate hard negative samples that are similar to the positive samples yet carry different semantics. CGC, however, incurs high space complexity because the augmented graphs must be stored. To overcome this challenge, the paper further proposes an extension of CGC, called CCGC (compressed CGC), which adds a compression module based on knowledge distillation to compress the features of the augmented graphs. The effectiveness of the proposed methods is demonstrated on multiple datasets, where they outperform both traditional unsupervised graph learning methods and state-of-the-art GCL methods. Supplementary experiments compare CGC and CCGC at different compression ratios.
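As context for the mechanism described in the abstract, below is a minimal, hypothetical PyTorch sketch of an InfoNCE-style contrastive objective whose negative set is enlarged with generated hard negatives, together with a feature-distillation term standing in for a compression module of the kind CCGC describes. It is not the authors' implementation: the interpolation-based `counterfactual_hard_negatives` generator, the `alpha` and `temperature` values, and the use of an MSE distillation loss are all illustrative assumptions, since the paper's concrete formulation is not given here.

```python
# Minimal sketch (assumptions, not the paper's implementation):
# an InfoNCE-style graph contrastive loss whose negative set is enlarged with
# generated hard negatives, plus a feature-distillation term that stands in
# for a compression module trained on augmented-graph features.
import torch
import torch.nn.functional as F


def counterfactual_hard_negatives(pos, other, alpha: float = 0.7):
    """Hypothetical hard-negative generator: interpolate each positive
    embedding with a shuffled embedding from the batch, so the result stays
    close to the positive while its semantics come from a different node."""
    perm = torch.randperm(other.size(0))
    return alpha * pos + (1.0 - alpha) * other[perm]


def contrastive_loss(anchor, positive, hard_neg, temperature: float = 0.5):
    """InfoNCE over in-batch negatives plus the generated hard negatives."""
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    hard_neg = F.normalize(hard_neg, dim=-1)

    pos_sim = (anchor * positive).sum(dim=-1) / temperature   # (N,)
    batch_sim = anchor @ positive.t() / temperature           # (N, N)
    hard_sim = anchor @ hard_neg.t() / temperature            # (N, N)

    logits = torch.cat([batch_sim, hard_sim], dim=1)          # (N, 2N)
    return (torch.logsumexp(logits, dim=1) - pos_sim).mean()


def distillation_loss(student_feat, teacher_feat):
    """Feature-level knowledge distillation: a compressed (student) encoding
    of the augmented graph is regressed onto the full (teacher) encoding."""
    return F.mse_loss(student_feat, teacher_feat.detach())


if __name__ == "__main__":
    N, d = 32, 128
    z_anchor = torch.randn(N, d)   # node embeddings of the original graph
    z_pos = torch.randn(N, d)      # embeddings from an augmented view
    z_student = torch.randn(N, d)  # output of a (hypothetical) compressed encoder

    z_hard = counterfactual_hard_negatives(z_pos, z_anchor)
    loss = contrastive_loss(z_anchor, z_pos, z_hard) \
        + 0.1 * distillation_loss(z_student, z_pos)
    print(float(loss))
```

In this sketch the generated hard negatives only enter the denominator of the InfoNCE objective, which is one common way to make the anchor discriminate against samples that lie close to the positive; how CGC actually constructs and weights its counterfactual negatives is detailed in the paper itself.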
| Field | Value |
|---|---|
| Original language | English |
| Journal | CAAI Transactions on Intelligence Technology |
| DOIs | |
| State | Accepted/In press - 2026 |
| Externally published | Yes |
Keywords
- artificial neural networks
- data mining
- graph contrastive learning