A Transformational Biencoder with In-Domain Negative Sampling for Zero-Shot Entity Linking

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Recent interest in entity linking has focused on the zero-shot scenario, where at test time the entity mention to be labelled is never seen during training, or may belong to a domain different from the source domain. Current work leveraging pre-trained BERT makes the implicit assumption that BERT bridges the gap between the source and target domain distributions. However, fine-tuned BERT underperforms considerably in the zero-shot setting when applied to a different domain. We address this problem by proposing a Transformational Biencoder that incorporates a transformation into BERT to perform a zero-shot transfer from the source domain during training. Like previous work, we rely on negative entities to encourage our model to discriminate the golden entities during training. To generate these negative entities, we propose a simple but effective strategy that takes the domain of the golden entity into account. Our experimental results on the benchmark dataset Zeshel show the effectiveness of our approach, which achieves a new state of the art.
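The abstract's negative-sampling strategy restricts negatives to the golden entity's own domain, so the model must separate the gold entity from its hardest, same-domain neighbours. A minimal sketch of that idea (assumed from the abstract, not the authors' released code; entity records, field names, and the uniform-sampling choice are illustrative):

```python
import random

def sample_in_domain_negatives(gold_entity, entity_pool, k, rng=random):
    """Return up to k negative entities sharing the gold entity's domain.

    Negatives come from the candidate pool restricted to the gold
    entity's domain, excluding the gold entity itself.
    """
    same_domain = [e for e in entity_pool
                   if e["domain"] == gold_entity["domain"]
                   and e["id"] != gold_entity["id"]]
    return rng.sample(same_domain, min(k, len(same_domain)))

# Toy candidate pool with two domains (illustrative data only).
pool = [
    {"id": 1, "domain": "fantasy",  "name": "Elf"},
    {"id": 2, "domain": "fantasy",  "name": "Orc"},
    {"id": 3, "domain": "fantasy",  "name": "Dwarf"},
    {"id": 4, "domain": "military", "name": "Tank"},
]
gold = pool[0]
negatives = sample_in_domain_negatives(gold, pool, k=2)
```

In a full training loop these in-domain negatives would be encoded alongside the gold entity by the biencoder and contrasted against the mention encoding; the sketch covers only the sampling step.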

Original language: English
Title of host publication: ACL 2022 - 60th Annual Meeting of the Association for Computational Linguistics, Findings of ACL 2022
Editors: Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Publisher: Association for Computational Linguistics (ACL)
Pages: 1449-1458
Number of pages: 10
ISBN (Electronic): 9781955917254
DOIs
State: Published - 2022
Event: Findings of the Association for Computational Linguistics: ACL 2022 - Dublin, Ireland
Duration: 22 May 2022 - 27 May 2022

Publication series

Name: Proceedings of the Annual Meeting of the Association for Computational Linguistics
ISSN (Print): 0736-587X

Conference

Conference: Findings of the Association for Computational Linguistics: ACL 2022
Country/Territory: Ireland
City: Dublin
Period: 22/05/22 - 27/05/22

