Regularizing Deep Text Models by Encouraging Competition

  • Jiaran Li*
  • Richong Zhang
  • Yuan Tian

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The difficulty of acquiring large amounts of labelled training data and the demand for complex neural network models in text learning make developing effective regularization techniques an important research topic. In this paper, we present a novel regularization scheme for supervised text learning, Competitive Word Dropout (CWD). Experiments on three different natural language learning tasks demonstrate that CWD significantly outperforms standard regularization schemes such as weight decay and dropout. The CWD scheme has a further unique advantage: it can be interpreted semantically.
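The abstract does not detail the CWD mechanism itself, but the "word dropout" keyword refers to a standard baseline worth illustrating: randomly replacing input tokens with an unknown-word symbol during training so the model cannot over-rely on any single word. Below is a minimal PyTorch sketch of this plain (non-competitive) word dropout; the function name, drop probability, and the unk_id/pad_id conventions are illustrative assumptions, not the paper's implementation.

    import torch

    def word_dropout(token_ids: torch.Tensor, drop_prob: float = 0.1,
                     unk_id: int = 0, pad_id: int = 1) -> torch.Tensor:
        """Plain word dropout (not the paper's competitive variant):
        randomly replace non-padding tokens with <unk> during training.

        token_ids: (batch, seq_len) tensor of vocabulary indices.
        """
        if drop_prob <= 0.0:
            return token_ids
        # Bernoulli mask over positions; padding positions are never dropped.
        drop_mask = torch.rand_like(token_ids, dtype=torch.float) < drop_prob
        drop_mask &= token_ids != pad_id
        return token_ids.masked_fill(drop_mask, unk_id)

    # Usage: apply before the embedding lookup, at training time only.
    tokens = torch.tensor([[5, 8, 13, 1, 1]])  # 1 = <pad>
    noisy = word_dropout(tokens, drop_prob=0.3)

Like standard dropout, this is disabled at inference time; it acts as input-level noise rather than hidden-unit noise, which is what makes word-level dropout schemes amenable to semantic interpretation.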

Original language: English
Title of host publication: Knowledge Graph and Semantic Computing
Subtitle of host publication: Knowledge Graph Empowers the Digital Economy - 7th China Conference, CCKS 2022, Revised Selected Papers
Editors: Maosong Sun, Bin Xu, Guilin Qi, Kang Liu, Yubo Chen, Jiadong Ren, Yansong Feng, Yongbin Liu
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 161-173
Number of pages: 13
ISBN (Print): 9789811975950
DOIs
State: Published - 2022
Event: 7th China Conference on Knowledge Graph and Semantic Computing, CCKS 2022 - Qinhuangdao, China
Duration: 24 Aug 2022 – 27 Aug 2022

Publication series

Name: Communications in Computer and Information Science
Volume: 1669 CCIS
ISSN (Print): 1865-0929
ISSN (Electronic): 1865-0937

Conference

Conference: 7th China Conference on Knowledge Graph and Semantic Computing, CCKS 2022
Country/Territory: China
City: Qinhuangdao
Period: 24/08/22 – 27/08/22

Keywords

  • Deep learning
  • Regularization
  • Text learning
  • Word dropout
  • Word embedding
