[Conference] Advanced Data Mining and Applications  Haibo Liu, Di Zhang, Liang Wang, Xin Song

Abstract: Knowledge Distillation (KD) as a model compression technique has been widely used in Graph Neural Networks (GNNs). Recent research has demonstrated that GNNs with shallow or deep hidden layers have different representational abi…

Authors: Haibo Liu, Di Zhang, Liang Wang, Xin Song
Proceedings Title: Advanced Data Mining and Applications
Publication Year: 2023
Conference: International Conference on Advanced Data Mining and Applications
Volume/Pages: Part 4 / 502-516    Start Page/Total Pages: 502 / 15
Location: Shenyang (CN)    Conference Year/Edition: 2023 / 19th
Keywords: Knowledge Distillation, Multi-Teacher Model, Local Semantics, GNN, Adaptive Weights
Call Number: P2400415