TY - GEN
T1 - Graph and Question Interaction Aware Graph2Seq Model for Knowledge Base Question Generation
AU - Li, Chen
AU - Bai, Jun
AU - Wang, Chuanarui
AU - Hu, Yuanhao
AU - Rong, Wenge
AU - Xiong, Zhang
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Knowledge Base Question Generation (KBQG) is an essential natural language processing task. Taking a knowledge graph and answer entities as input, KBQG aims to generate the corresponding natural language question. Recently, Graph2Seq has been proposed to encode the knowledge graph and has achieved remarkable results, but one important challenge remains: the graph encoding lacks interaction with the target question. To address this challenge, we propose a graph and question interaction enhanced Graph2Seq model, in which we design an encoder-decoder parallel enhancement mechanism and apply knowledge distillation to both the intermediate representation and the prediction distribution, incorporating knowledge of the target question into the graph representation. Experiments conducted on a KBQG benchmark dataset show the promising potential of the proposed method.
AB - Knowledge Base Question Generation (KBQG) is an essential natural language processing task. Taking a knowledge graph and answer entities as input, KBQG aims to generate the corresponding natural language question. Recently, Graph2Seq has been proposed to encode the knowledge graph and has achieved remarkable results, but one important challenge remains: the graph encoding lacks interaction with the target question. To address this challenge, we propose a graph and question interaction enhanced Graph2Seq model, in which we design an encoder-decoder parallel enhancement mechanism and apply knowledge distillation to both the intermediate representation and the prediction distribution, incorporating knowledge of the target question into the graph representation. Experiments conducted on a KBQG benchmark dataset show the promising potential of the proposed method.
KW - Graph and Question Interaction
KW - Knowledge Distillation
KW - Knowledge Graph
KW - Question Generation
UR - https://www.scopus.com/pages/publications/85140733443
U2 - 10.1109/IJCNN55064.2022.9892028
DO - 10.1109/IJCNN55064.2022.9892028
M3 - Conference contribution
AN - SCOPUS:85140733443
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - 2022 International Joint Conference on Neural Networks, IJCNN 2022 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2022 International Joint Conference on Neural Networks, IJCNN 2022
Y2 - 18 July 2022 through 23 July 2022
ER -