
Self-supervised graph learning with target-adaptive masking for session-based recommendation

[Chinese title: 融合自监督图学习与目标自适应屏蔽的会话型推荐方法]

Authors: Yitong WANG; Fei CAI; Zhiqiang PAN; Chengyu SONG

Affiliation: Science and Technology on Information Systems Engineering Laboratory, National University of Defense Technology, Changsha 410073, China

Published in: Frontiers of Information Technology & Electronic Engineering (信息与电子工程前沿(英文版))

Year/Volume/Issue: 2023, Vol. 24, No. 1

Pages: 73-87


Subject classification: 12 [Management Science]; 1201 [Management Science and Engineering]; 081104 [Engineering - Pattern Recognition and Intelligent Systems]; 08 [Engineering]; 0835 [Engineering - Software Engineering]; 0811 [Engineering - Control Science and Engineering]; 0812 [Engineering - Computer Science and Technology]

Keywords: Session-based recommendation; Self-supervised learning; Graph neural networks; Target-adaptive

Abstract: Session-based recommendation aims to predict the next item based on a user's limited interactions within a short period. Current approaches use mainly recurrent neural networks (RNNs) or graph neural networks (GNNs) to model the sequential patterns or the transition relationships between items. However, such models either ignore the over-smoothing issue of GNNs, or directly use cross-entropy loss with a softmax layer for model optimization, which easily results in the over-fitting problem. To tackle the above issues, we propose a self-supervised graph learning with target-adaptive masking (SGL-TM) method. Specifically, we first construct a global graph based on all involved sessions and subsequently capture the self-supervised signals from the global connections between items, which helps supervise the model in generating accurate representations of items in the ongoing session. After that, we calculate the main supervised loss by comparing the ground truth with the predicted scores of items adjusted by our designed target-adaptive masking module. Finally, we combine the main supervised component with the auxiliary self-supervision module to obtain the final loss for optimizing the model parameters. The experimental results from two benchmark datasets, Gowalla and Diginetica, indicate that SGL-TM can outperform state-of-the-art baselines in terms of Recall@20 and MRR@20, especially in short sessions.
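The abstract describes a two-part objective: a main supervised loss computed on item scores adjusted by a masking module, plus a weighted auxiliary self-supervised loss. The sketch below is a minimal illustration of that structure, not the paper's exact design: the masking rule (suppressing the hardest non-target items before the softmax) and the InfoNCE-style auxiliary loss are assumptions chosen for illustration, and the names `masked_cross_entropy`, `info_nce`, `n_mask`, and the weight `0.1` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def masked_cross_entropy(scores, target, n_mask=2):
    """Cross-entropy where the n_mask highest-scoring non-target items
    are excluded from the softmax. This is an illustrative stand-in for
    the paper's target-adaptive masking, not its actual rule."""
    s = scores.astype(float).copy()
    order = np.argsort(-s)                       # highest scores first
    hard_negatives = [i for i in order if i != target][:n_mask]
    s[hard_negatives] = -np.inf                  # drop them from the softmax
    return -np.log(softmax(s)[target] + 1e-12)

def info_nce(anchor, positive, negatives, tau=0.2):
    """Simple InfoNCE contrastive loss over item embeddings, standing in
    for the auxiliary self-supervision on the global graph."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    logits = np.array([cos(anchor, positive)] +
                      [cos(anchor, n) for n in negatives]) / tau
    return -np.log(softmax(logits)[0] + 1e-12)

# Combined objective: main supervised loss + weighted auxiliary loss.
scores = rng.normal(size=10)                     # predicted item scores
emb = rng.normal(size=(5, 8))                    # toy item embeddings
loss = masked_cross_entropy(scores, target=3) + 0.1 * info_nce(emb[0], emb[1], emb[2:])
```

Because masking removes competing probability mass from the softmax, the masked cross-entropy is never larger than the plain cross-entropy on the same scores, which is one intuition for why such masking can ease the over-fitting pressure of a full-softmax objective.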
