Authors
Shai Ben-David, John Blitzer, Koby Crammer, Fernando Pereira
Publication date
2007/12
Journal
Advances in neural information processing systems
Volume
19
Pages
137
Publisher
MIT Press
Description
Discriminative learning methods for classification perform well when training and test data are drawn from the same distribution. In many situations, though, we have labeled training data for a source domain, and we wish to learn a classifier which performs well on a target domain with a different distribution. Under what conditions can we adapt a classifier trained on the source domain for use in the target domain? Intuitively, a good feature representation is a crucial factor in the success of domain adaptation. We formalize this intuition theoretically with a generalization bound for domain adaptation. Our theory illustrates the tradeoffs inherent in designing a representation for domain adaptation and gives a new justification for a recently proposed model. It also points toward a promising new model for domain adaptation: one which explicitly minimizes the difference between the source and target domains, while at the same time maximizing the margin of the training set.
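As a reading aid, here is a minimal sketch of the qualitative form such a generalization bound takes; the symbols (epsilon_S and epsilon_T for source and target error, d_H for the divergence between the induced source and target distributions, lambda for the error of the best joint hypothesis) are notational assumptions rather than quotations from the paper, and finite-sample complexity terms are omitted:

\varepsilon_T(h) \;\le\; \varepsilon_S(h) \;+\; d_{\mathcal{H}}\!\left(\tilde{\mathcal{D}}_S, \tilde{\mathcal{D}}_T\right) \;+\; \lambda,
\qquad
\lambda \;=\; \min_{h' \in \mathcal{H}} \left[ \varepsilon_S(h') + \varepsilon_T(h') \right]

Read this way, the first term rewards fitting the source data, the second rewards a representation under which source and target distributions look similar, and the third is small only if a single hypothesis can do well on both domains, which is the tradeoff the abstract describes.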
Total citations
[Yearly citation histogram, 2007–2024; per-year counts not recoverable]
Scholar articles
S Ben-David, J Blitzer, K Crammer, F Pereira - Advances in neural information processing systems, 2006