
Triplex transfer learning: Exploiting both shared and distinct concepts for text classification

  • Fuzhen Zhuang
  • Ping Luo
  • Changying Du
  • Qing He
  • Zhongzhi Shi
  • Hui Xiong

Research output: Contribution to journal › Article › peer-review

Abstract

Transfer learning addresses scenarios where the test data from target domains and the training data from source domains are drawn from similar but different distributions over the raw features. Along this line, some recent studies revealed that high-level concepts, such as word clusters, can help model the differences between data distributions and thus are more appropriate for classification. In other words, these methods assume that all the data domains share the same set of concepts, which are used as the bridge for knowledge transfer. However, in addition to these shared concepts, each domain may have its own distinct concepts. In light of this, we systematically analyze the high-level concepts and propose a general transfer learning framework based on nonnegative matrix trifactorization, which allows exploring both shared and distinct concepts across all the domains simultaneously. Since this model provides more flexibility in fitting the data, it can lead to better classification accuracy. Moreover, we propose to regularize the manifold structure in the target domains to improve the prediction performance. To solve the proposed optimization problem, we also develop an iterative algorithm and theoretically analyze its convergence properties. Finally, extensive experiments show that the proposed model outperforms the baseline methods by a significant margin. In particular, our method works much better on the more challenging tasks where distinct concepts are present in the data.
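The nonnegative matrix trifactorization underlying the framework decomposes a term-document matrix X into three nonnegative factors, X ≈ F S Gᵀ, where F clusters words into concepts, G clusters documents into classes, and S links the two. The sketch below is a minimal, generic illustration of this trifactorization using standard multiplicative updates; it is not the authors' triplex model (it has no shared/distinct concept split, cross-domain coupling, or manifold regularizer), and all function and variable names are our own.

```python
import numpy as np

def nmtf(X, k_words, k_docs, n_iter=200, seed=0, eps=1e-9):
    """Generic nonnegative matrix trifactorization X ~ F @ S @ G.T
    via multiplicative updates (illustrative sketch only).

    X       : nonnegative (m words x n documents) data matrix
    k_words : number of word concepts (rows of S)
    k_docs  : number of document clusters (columns of S)
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    # Random nonnegative initialization keeps all updates nonnegative.
    F = rng.random((m, k_words))
    S = rng.random((k_words, k_docs))
    G = rng.random((n, k_docs))
    for _ in range(n_iter):
        # Update document-cluster factor G.
        FS = F @ S
        G *= (X.T @ FS) / (G @ FS.T @ FS + eps)
        # Update word-concept factor F.
        SG = S @ G.T
        F *= (X @ SG.T) / (F @ SG @ SG.T + eps)
        # Update association matrix S linking concepts to clusters.
        S *= (F.T @ X @ G) / (F.T @ F @ S @ (G.T @ G) + eps)
    return F, S, G

# Usage: factorize a small random nonnegative matrix.
X = np.abs(np.random.default_rng(1).random((20, 15)))
F, S, G = nmtf(X, k_words=4, k_docs=3)
approx = F @ S @ G.T  # low-rank nonnegative reconstruction of X
```

In a transfer-learning setting along the lines of the paper, the word-concept factor F would be partially tied across source and target domains (shared concepts) while other columns remain domain-specific (distinct concepts).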

Original language: English
Article number: 6606822
Pages (from-to): 1191-1203
Number of pages: 13
Journal: IEEE Transactions on Cybernetics
Volume: 44
Issue number: 7
DOI
Publication status: Published - Jul 2014
Published externally: Yes
