TY - GEN
T1 - MultiE
T2 - 27th ACM International Conference on Information and Knowledge Management, CIKM 2018
AU - Zhang, Zhao
AU - Zhuang, Fuzhen
AU - Niu, Zheng Yu
AU - Wang, Deqing
AU - He, Qing
N1 - Publisher Copyright:
© 2018 Association for Computing Machinery.
PY - 2018/10/17
Y1 - 2018/10/17
N2 - Completing knowledge bases (KBs) with missing facts is of great importance, since most existing KBs are far from complete. To this end, many knowledge base completion (KBC) methods have been proposed. However, most existing methods embed each relation into a vector separately, while ignoring the correlations among different relations. Actually, in large-scale KBs, there always exist some relations that are semantically related, and we believe this can help to facilitate the knowledge sharing when learning the embedding of related relations simultaneously. Along this line, we propose a novel KBC model by Multi-Task Embedding, named MultiE. In this model, semantically related relations are first clustered into the same group, and then learning the embedding of each relation can leverage the knowledge among different relations. Moreover, we propose a three-layer network to predict the missing values of incomplete knowledge triples. Finally, experiments on three popular benchmarks FB15k, FB15k-237 and WN18 are conducted to demonstrate the effectiveness of MultiE against some state-of-the-art baseline competitors.
AB - Completing knowledge bases (KBs) with missing facts is of great importance, since most existing KBs are far from complete. To this end, many knowledge base completion (KBC) methods have been proposed. However, most existing methods embed each relation into a vector separately, while ignoring the correlations among different relations. Actually, in large-scale KBs, there always exist some relations that are semantically related, and we believe this can help to facilitate the knowledge sharing when learning the embedding of related relations simultaneously. Along this line, we propose a novel KBC model by Multi-Task Embedding, named MultiE. In this model, semantically related relations are first clustered into the same group, and then learning the embedding of each relation can leverage the knowledge among different relations. Moreover, we propose a three-layer network to predict the missing values of incomplete knowledge triples. Finally, experiments on three popular benchmarks FB15k, FB15k-237 and WN18 are conducted to demonstrate the effectiveness of MultiE against some state-of-the-art baseline competitors.
KW - Embedding
KW - Knowledge Base Completion
KW - Multi-Task Learning
UR - https://www.scopus.com/pages/publications/85058037703
U2 - 10.1145/3269206.3269295
DO - 10.1145/3269206.3269295
M3 - Conference contribution
AN - SCOPUS:85058037703
T3 - International Conference on Information and Knowledge Management, Proceedings
SP - 1715
EP - 1718
BT - CIKM 2018 - Proceedings of the 27th ACM International Conference on Information and Knowledge Management
A2 - Paton, Norman
A2 - Candan, Selcuk
A2 - Wang, Haixun
A2 - Allan, James
A2 - Agrawal, Rakesh
A2 - Labrinidis, Alexandros
A2 - Cuzzocrea, Alfredo
A2 - Zaki, Mohammed
A2 - Srivastava, Divesh
A2 - Broder, Andrei
A2 - Schuster, Assaf
PB - Association for Computing Machinery
Y2 - 22 October 2018 through 26 October 2018
ER -