TY - JOUR
T1 - Structural Entropy Guided Meta-Learning for Few-Shot Node Classification
AU - Chen, Xiang
AU - Yue, Kun
AU - Liu, Daliang
AU - Liu, Wenjie
AU - Duan, Liang
AU - Li, Angsheng
N1 - Publisher Copyright:
© 1989-2012 IEEE.
PY - 2026
Y1 - 2026
N2 - Few-shot node classification (FSNC) is a challenging task in graph analysis, where the goal is to classify unlabeled nodes in a graph using only a few labeled nodes as references. To tackle the label shortage problem, many meta-learning methods have been proposed to extract meta-knowledge from base classes with abundant labeled nodes and transfer the learned knowledge to classify nodes from novel classes. However, the theoretical foundation of meta-knowledge remains unexplored, and existing solutions often struggle when dealing with complex or noisy graphs. To address these issues, we propose a novel and effective meta-learning framework for FSNC based on structural information theory. First, we introduce the concept of minimal sufficient meta-knowledge, a theoretical principle inherited from information bottleneck, which optimally balances the expressiveness and robustness of the learned meta-knowledge. Guided by this principle, we develop a meta-learning model, named SE-FSNC, that extracts the minimal sufficient meta-knowledge using an encoding tree derived from the input graph with minimal structural entropy. We then propose an effective algorithm to train SE-FSNC by incorporating the encoding tree with graph contrastive learning. Extensive experiments on several datasets demonstrate the superiority of our model compared with other state-of-the-art methods.
AB - Few-shot node classification (FSNC) is a challenging task in graph analysis, where the goal is to classify unlabeled nodes in a graph using only a few labeled nodes as references. To tackle the label shortage problem, many meta-learning methods have been proposed to extract meta-knowledge from base classes with abundant labeled nodes and transfer the learned knowledge to classify nodes from novel classes. However, the theoretical foundation of meta-knowledge remains unexplored, and existing solutions often struggle when dealing with complex or noisy graphs. To address these issues, we propose a novel and effective meta-learning framework for FSNC based on structural information theory. First, we introduce the concept of minimal sufficient meta-knowledge, a theoretical principle inherited from information bottleneck, which optimally balances the expressiveness and robustness of the learned meta-knowledge. Guided by this principle, we develop a meta-learning model, named SE-FSNC, that extracts the minimal sufficient meta-knowledge using an encoding tree derived from the input graph with minimal structural entropy. We then propose an effective algorithm to train SE-FSNC by incorporating the encoding tree with graph contrastive learning. Extensive experiments on several datasets demonstrate the superiority of our model compared with other state-of-the-art methods.
KW - Few-shot node classification
KW - encoding tree
KW - information bottleneck
KW - meta-learning
KW - structural entropy
UR - https://www.scopus.com/pages/publications/105028396277
U2 - 10.1109/TKDE.2026.3656714
DO - 10.1109/TKDE.2026.3656714
M3 - Article
AN - SCOPUS:105028396277
SN - 1041-4347
JO - IEEE Transactions on Knowledge and Data Engineering
JF - IEEE Transactions on Knowledge and Data Engineering
ER -