[Neural Networks] Enhancing signed graph neural networks through curriculum-based training
Date: 2025-08-20

Zeyu Zhang, Lu Li, Xingyu Ji, Kaiqi Zhao, Xiaofeng Zhu, Philip S. Yu, Jiawei Li*, Maojun Wang*

Neural Networks, Volume 193, January 2026, 107975

Abstract

Signed graphs are powerful models for representing complex relations with both positive and negative connections. Recently, Signed Graph Neural Networks (SGNNs) have emerged as potent tools for analyzing such graphs. To our knowledge, no prior research has been conducted on devising a training plan specifically for SGNNs. The prevailing training approach feeds samples (edges) to models in a random order, resulting in equal contributions from each sample during the training process, but fails to account for varying learning difficulties based on the graph's structure. We contend that SGNNs can benefit from a curriculum that progresses from easy to difficult, similar to human learning. The main challenge is evaluating the difficulty of edges in a signed graph. We address this by theoretically analyzing the difficulty of SGNNs in learning adequate representations for edges in unbalanced cycles and propose a lightweight difficulty measurer. This forms the basis for our innovative Curriculum representation learning framework for Signed Graphs, referred to as CSG. The process involves using the measurer to assign difficulty scores to training samples, adjusting their order using a scheduler, and training the SGNN model accordingly. We empirically evaluate our approach on six real-world signed graph datasets. Our method demonstrates remarkable results, enhancing the accuracy of popular SGNN models by up to 23.7% and reducing standard deviation by 8.4%, improving model stability. Our implementation is available in PyTorch (https://github.com/Alex-Zeyu/CSG).
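The abstract's pipeline — score each training edge with a difficulty measurer, then let a scheduler admit progressively harder edges during training — can be sketched in a few lines of Python. This is a minimal illustration, not the paper's method: the difficulty scores here are supplied externally (the paper derives them from unbalanced cycles), and the linear pacing function and names such as `linear_pacing` and `curriculum_batches` are hypothetical stand-ins.

```python
def linear_pacing(epoch, total_epochs, start_frac=0.25):
    """Fraction of the easy-to-hard-sorted edges admitted at `epoch`.

    Grows linearly from `start_frac` at epoch 0 to 1.0 at the last epoch.
    A linear schedule is only one common choice for curriculum pacing.
    """
    frac = start_frac + (1.0 - start_frac) * epoch / max(1, total_epochs - 1)
    return min(1.0, frac)


def curriculum_batches(edges, difficulty, total_epochs):
    """Yield, per epoch, the subset of edges the scheduler admits.

    `difficulty` maps an edge to a score from some measurer; the
    paper's cycle-based measurer would plug in here.
    """
    ordered = sorted(edges, key=difficulty)  # easy -> hard
    for epoch in range(total_epochs):
        k = max(1, int(linear_pacing(epoch, total_epochs) * len(ordered)))
        yield epoch, ordered[:k]


# Toy signed graph: (u, v, sign) edges with made-up difficulty scores.
edges = [("a", "b", +1), ("b", "c", -1), ("a", "c", -1), ("c", "d", +1)]
scores = dict(zip(edges, [0.1, 0.9, 0.5, 0.3]))

for epoch, subset in curriculum_batches(edges, scores.get, total_epochs=3):
    # Each epoch an SGNN would train on `subset` instead of all edges.
    print(f"epoch {epoch}: {len(subset)} of {len(edges)} edges")
```

With three epochs and four edges this admits 1, then 2, then all 4 edges, mirroring the easy-to-difficult progression the abstract describes.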

Keywords

Graph neural networks; Signed graph representation learning; Curriculum learning

Original article: https://doi.org/10.1016/j.neunet.2025.107975