TY - GEN
T1 - Compressing Knowledge Graph Embedding with Relational Graph Auto-encoder
AU - Zhang, Shiyu
AU - Zhang, Zhao
AU - Zhuang, Fuzhen
AU - Shi, Zhiping
AU - Han, Xu
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/7
Y1 - 2020/7
N2 - Knowledge graphs (KGs) are extremely useful resources for a variety of applications. However, with the large and steadily growing sizes of modern KGs, knowledge graph embeddings (KGE), which represent entities and relations in KGs as 32-bit floating-point vectors, become increasingly expensive in terms of memory. To this end, in this paper, we propose a general framework to compress the embeddings from real-valued vectors to binary ones while preserving the inherent information of KGs. Specifically, the proposed framework utilizes relational graph auto-encoders as well as the Gumbel-Softmax trick to obtain the compressed representations. Our framework can be applied to a number of existing KGE models. In particular, we extend the state-of-the-art models TransE, DistMult, and ConvE in this paper. Finally, extensive experiments show that the proposed method reduces the memory size of the embeddings by 92% while incurring a loss of no more than 5% on the knowledge graph completion task.
AB - Knowledge graphs (KGs) are extremely useful resources for a variety of applications. However, with the large and steadily growing sizes of modern KGs, knowledge graph embeddings (KGE), which represent entities and relations in KGs as 32-bit floating-point vectors, become increasingly expensive in terms of memory. To this end, in this paper, we propose a general framework to compress the embeddings from real-valued vectors to binary ones while preserving the inherent information of KGs. Specifically, the proposed framework utilizes relational graph auto-encoders as well as the Gumbel-Softmax trick to obtain the compressed representations. Our framework can be applied to a number of existing KGE models. In particular, we extend the state-of-the-art models TransE, DistMult, and ConvE in this paper. Finally, extensive experiments show that the proposed method reduces the memory size of the embeddings by 92% while incurring a loss of no more than 5% on the knowledge graph completion task.
KW - Compression
KW - Graph autoencoders
KW - Knowledge graph completion
KW - Knowledge graph embedding
UR - https://www.scopus.com/pages/publications/85090426329
U2 - 10.1109/ICEIEC49280.2020.9152323
DO - 10.1109/ICEIEC49280.2020.9152323
M3 - Conference contribution
AN - SCOPUS:85090426329
T3 - ICEIEC 2020 - Proceedings of 2020 IEEE 10th International Conference on Electronics Information and Emergency Communication
SP - 366
EP - 370
BT - ICEIEC 2020 - Proceedings of 2020 IEEE 10th International Conference on Electronics Information and Emergency Communication
A2 - Li, Wenzheng
A2 - Zhang, Xuefei
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 10th IEEE International Conference on Electronics Information and Emergency Communication, ICEIEC 2020
Y2 - 17 July 2020 through 19 July 2020
ER -