
Syntax Aware LSTM Model for Semantic Role Labeling

  • Feng Qian
  • Lei Sha
  • Baobao Chang
  • Lu Chen Liu
  • Ming Zhang*
  *Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review

Abstract

In the Semantic Role Labeling (SRL) task, the tree-structured dependency relation is rich in syntax information, but it is not well handled by existing models. In this paper, we propose Syntax Aware Long Short Time Memory (SA-LSTM). The structure of SA-LSTM changes according to the dependency structure of each sentence, so that SA-LSTM can model the whole tree structure of the dependency relation in an architecture engineering way. Experiments demonstrate that on Chinese Proposition Bank (CPB) 1.0, SA-LSTM improves F1 by 2.06% over an ordinary bi-LSTM with feature-engineered dependency relation information, and gives a state-of-the-art F1 of 79.92%. On the English CoNLL 2005 dataset, SA-LSTM brings improvement (2.1%) to the bi-LSTM model and also brings slight improvement (0.3%) when added to the state-of-the-art model.
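The abstract describes an LSTM whose recurrence follows each sentence's dependency tree: at each word, hidden states of dependency-related earlier words feed into the cell alongside the usual previous hidden state. The sketch below is a hypothetical, simplified reading of that idea, not the paper's exact architecture: a single extra "syntax gate" injects the hidden state of the word's dependency head. All class, parameter, and gate names here are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SALSTMCell:
    """Hypothetical sketch of a syntax-aware LSTM step.

    Besides x_t and h_{t-1}, each step receives h_head, the hidden
    state of the current word's dependency head (when that head
    precedes the word), mixed into the cell state via an extra gate.
    """
    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        z = input_size + 2 * hidden_size  # [x_t ; h_prev ; h_head]
        self.hidden_size = hidden_size
        # one weight matrix/bias per gate: input, forget, output,
        # candidate, and the illustrative syntax gate "s"
        self.W = {g: rng.normal(0.0, 0.1, (hidden_size, z)) for g in "ifogs"}
        self.b = {g: np.zeros(hidden_size) for g in "ifogs"}

    def step(self, x_t, h_prev, c_prev, h_head):
        v = np.concatenate([x_t, h_prev, h_head])
        i = sigmoid(self.W["i"] @ v + self.b["i"])
        f = sigmoid(self.W["f"] @ v + self.b["f"])
        o = sigmoid(self.W["o"] @ v + self.b["o"])
        g = np.tanh(self.W["g"] @ v + self.b["g"])
        s = sigmoid(self.W["s"] @ v + self.b["s"])     # syntax gate
        c = f * c_prev + i * g + s * np.tanh(h_head)   # inject head state
        h = o * np.tanh(c)
        return h, c

def run(cell, xs, heads):
    """heads[t] = index of word t's dependency head, or -1 for the root.

    Only heads that precede the current word contribute; otherwise a
    zero vector is used, keeping the recurrence strictly left-to-right.
    """
    H = cell.hidden_size
    hs, h, c = [], np.zeros(H), np.zeros(H)
    for t, x in enumerate(xs):
        j = heads[t]
        h_head = hs[j] if 0 <= j < t else np.zeros(H)
        h, c = cell.step(x, h, c, h_head)
        hs.append(h)
    return np.stack(hs)
```

For a five-word toy sentence, `run(cell, xs, heads=[-1, 0, 1, 1, 3])` produces one hidden vector per word, with each step conditioned on its head's state; in the paper this tree-conditioned encoding feeds a role classifier over the sentence.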

Original language: English
Host publication: EMNLP 2017 - Conference on Empirical Methods in Natural Language Processing, Proceedings of the 2nd Workshop on Structured Prediction
Publisher: Association for Computational Linguistics (ACL)
Pages: 27-32
Number of pages: 6
ISBN (electronic): 9781945626937
Publication status: Published - 2017
Externally published: Yes
Event: 2nd Workshop on Structured Prediction for Natural Language Processing, SPNLP 2017, held in conjunction with the Conference on Empirical Methods in Natural Language Processing, EMNLP 2017 - Copenhagen, Denmark
Duration: 9 Sep 2017 → 11 Sep 2017

Publication series

Name: EMNLP 2017 - Conference on Empirical Methods in Natural Language Processing, Proceedings of the 2nd Workshop on Structured Prediction

Conference

Conference: 2nd Workshop on Structured Prediction for Natural Language Processing, SPNLP 2017, held in conjunction with the Conference on Empirical Methods in Natural Language Processing, EMNLP 2017
Country/Territory: Denmark
City: Copenhagen
Period: 9/09/17 → 11/09/17
