
An Improved Template Representation-based Transformer for Abstractive Text Summarization

  • Jiaming Sun
  • Yunli Wang
  • Zhoujun Li*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Text summarization plays an important role in many NLP applications, and combining templates with generation methods is an effective way to approach abstractive summarization. However, existing template-enhanced generation approaches use templates naively and mainly adopt RNN-based Seq2Seq models, so they cannot make full use of the valid information in the templates and suffer from noise in the templates. To mitigate these problems, we propose a new abstractive summarization model, the Summarization Transformer with Template-aware Representation (STTR), which uses a template-aware document encoding module and a document representation shifting loss to preserve the template's useful information while filtering out its noise. Experiments on the Gigaword and LCSTS datasets show that our method outperforms baseline models and achieves a new state of the art.
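The abstract names two components without detailing them: a template-aware document encoding module and a document representation shifting loss. A minimal sketch of how such components could work, assuming a cross-attention fusion of template and document encodings and an L2 form of the shifting loss (the function names, the residual fusion, and the exact loss form are illustrative assumptions, not the paper's actual formulation):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def template_aware_encoding(doc, tmpl):
    """Fuse template information into the document encoding:
    each document position attends over the template positions,
    and the attended template vectors are added residually.
    (Illustrative; STTR's exact fusion mechanism may differ.)"""
    scores = doc @ tmpl.T / np.sqrt(doc.shape[-1])  # (Ld, Lt) attention logits
    attn = softmax(scores, axis=-1)                 # attention weights over template
    return doc + attn @ tmpl                        # residual template fusion

def shifting_loss(doc_repr, tmpl_repr):
    """Hypothetical L2 'representation shifting' penalty pulling the
    pooled document representation toward the template's."""
    return float(np.mean((doc_repr - tmpl_repr) ** 2))

# Toy encodings standing in for Transformer encoder outputs.
rng = np.random.default_rng(0)
doc = rng.standard_normal((6, 8))   # 6 document tokens, hidden dim 8
tmpl = rng.standard_normal((4, 8))  # 4 template tokens, hidden dim 8

fused = template_aware_encoding(doc, tmpl)
loss = shifting_loss(fused.mean(axis=0), tmpl.mean(axis=0))
print(fused.shape, loss)
```

The sketch only shows the shape of the idea: the auxiliary loss regularizes the fused document representation toward the template's, which is one plausible way to "preserve useful information and filter the noise of the template" as the abstract describes.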

Original language: English
Title of host publication: 2020 International Joint Conference on Neural Networks, IJCNN 2020 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728169262
State: Published - Jul 2020
Event: 2020 International Joint Conference on Neural Networks, IJCNN 2020 - Virtual, Glasgow, United Kingdom
Duration: 19 Jul 2020 - 24 Jul 2020

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks

Conference

Conference: 2020 International Joint Conference on Neural Networks, IJCNN 2020
Country/Territory: United Kingdom
City: Virtual, Glasgow
Period: 19/07/20 - 24/07/20
