TY - GEN
T1 - Contrasting Transformer and Hypergraph Network for Cooperative Sequential Recommendation
AU - Wu, Tongyu
AU - Qu, Jianfeng
AU - Wang, Deqing
AU - Cui, Zhiming
AU - Liu, Guanfeng
AU - Zhao, Pengpeng
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2025.
PY - 2025
Y1 - 2025
N2 - Recently, the transformer has been widely used for sequential recommendation due to its superior sequence modeling and information sensing capabilities. Meanwhile, some studies capture high-order cooperative signals between sequences through graph structures. However, ordinary graph structures are insufficient to capture nonlinear high-order cooperative signals, and no detailed studies have examined how to balance sequence-level information against global graph-level high-order information in sequential recommendation. To address these challenges, we propose Contrasting Transformer and Hypergraph Network for Cooperative Sequential Recommendation (THCSRec), a model that coordinates sequence-level information with global graph-level information. Specifically, our model uses a transformer network to capture the information of the sequence itself and a hypergraph neural network to capture global graph-level high-order information. Furthermore, the two networks cooperate through a contrastive learning task that maximizes their mutual information. Finally, the representations of the two networks are aggregated for prediction. In experiments on three real-world datasets, extensive evaluation and ablation studies verify the effectiveness of THCSRec, which surpasses existing state-of-the-art baselines. (Our code is available at https://github.com/Elina-wu/THCSRec)
AB - Recently, the transformer has been widely used for sequential recommendation due to its superior sequence modeling and information sensing capabilities. Meanwhile, some studies capture high-order cooperative signals between sequences through graph structures. However, ordinary graph structures are insufficient to capture nonlinear high-order cooperative signals, and no detailed studies have examined how to balance sequence-level information against global graph-level high-order information in sequential recommendation. To address these challenges, we propose Contrasting Transformer and Hypergraph Network for Cooperative Sequential Recommendation (THCSRec), a model that coordinates sequence-level information with global graph-level information. Specifically, our model uses a transformer network to capture the information of the sequence itself and a hypergraph neural network to capture global graph-level high-order information. Furthermore, the two networks cooperate through a contrastive learning task that maximizes their mutual information. Finally, the representations of the two networks are aggregated for prediction. In experiments on three real-world datasets, extensive evaluation and ablation studies verify the effectiveness of THCSRec, which surpasses existing state-of-the-art baselines. (Our code is available at https://github.com/Elina-wu/THCSRec)
KW - Contrastive Learning
KW - Hypergraph Learning
KW - Sequential Recommendation
UR - https://www.scopus.com/pages/publications/85218460815
U2 - 10.1007/978-981-97-5555-4_6
DO - 10.1007/978-981-97-5555-4_6
M3 - Conference contribution
AN - SCOPUS:85218460815
SN - 9789819755547
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 83
EP - 98
BT - Database Systems for Advanced Applications - 29th International Conference, DASFAA 2024
A2 - Onizuka, Makoto
A2 - Lee, Jae-Gil
A2 - Tong, Yongxin
A2 - Xiao, Chuan
A2 - Ishikawa, Yoshiharu
A2 - Lu, Kejing
A2 - Amer-Yahia, Sihem
A2 - Jagadish, H.V.
PB - Springer Science and Business Media Deutschland GmbH
T2 - 29th International Conference on Database Systems for Advanced Applications, DASFAA 2024
Y2 - 2 July 2024 through 5 July 2024
ER -