TY - GEN
T1 - Triplex transfer learning
T2 - 6th ACM International Conference on Web Search and Data Mining, WSDM 2013
AU - Zhuang, Fuzhen
AU - Luo, Ping
AU - Du, Changying
AU - He, Qing
AU - Shi, Zhongzhi
PY - 2013
Y1 - 2013
N2 - Transfer learning addresses learning scenarios in which the test data from target domains and the training data from source domains are drawn from similar but different distributions with respect to the raw features. Some recent studies argued that high-level concepts (e.g., word clusters) can help model the difference between data distributions, and thus are more appropriate for classification. Specifically, these methods assume that all the data domains share the same set of concepts, which are used as the bridge for knowledge transfer. However, besides these shared concepts, each domain may have its own distinct concepts. To address this point, we propose a general transfer learning framework based on non-negative matrix tri-factorization, which explores both shared and distinct concepts among all the domains simultaneously. Since this model provides more flexibility in fitting the data, it may lead to better classification accuracy. To solve the proposed optimization problem, we develop an iterative algorithm and theoretically analyze its convergence. Finally, extensive experiments show the significant superiority of our model over the baseline methods. In particular, we show that our method works much better on the more challenging tasks where distinct concepts may exist.
KW - common concept
KW - distinct concept
KW - distribution mismatch
KW - non-negative matrix tri-factorization
KW - triplex transfer learning
UR - https://www.scopus.com/pages/publications/84874262351
U2 - 10.1145/2433396.2433449
DO - 10.1145/2433396.2433449
M3 - Conference contribution
AN - SCOPUS:84874262351
SN - 9781450318693
T3 - WSDM 2013 - Proceedings of the 6th ACM International Conference on Web Search and Data Mining
SP - 425
EP - 434
BT - WSDM 2013 - Proceedings of the 6th ACM International Conference on Web Search and Data Mining
Y2 - 4 February 2013 through 8 February 2013
ER -