Authors
Xiaoxu Li, Dongliang Chang, Tao Tian, Jie Cao
Publication date
2019/2/5
Journal
IEEE Access
Volume
7
Pages
19572-19578
Publisher
IEEE
Description
Softmax cross-entropy loss with L2 regularization is commonly adopted in the machine learning and neural network communities. The traditional softmax cross-entropy loss simply focuses on fitting the training data accurately and does not explicitly encourage a large decision margin for classification, so several loss functions have been proposed to improve generalization performance by addressing this problem. However, these loss functions make model optimization more difficult. Inspired by regularized logistic regression, in which the regularization term adjusts the width of the decision margin and which can be seen as an approximation of the support vector machine, we propose a large-margin regularization method for the softmax cross-entropy loss. The advantages of the proposed loss are twofold: the first is improved generalization performance, and the …
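To make the baseline concrete, below is a minimal sketch, assuming PyTorch, of the loss the abstract starts from: softmax cross-entropy plus an explicit L2 penalty on the weights. The function name and the lambda_l2 hyperparameter are hypothetical, and the paper's proposed large-margin regularizer is not reproduced here, since the truncated abstract does not fully specify it.

# A minimal sketch, assuming PyTorch, of the baseline the abstract
# describes: softmax cross-entropy plus an explicit L2 penalty.
# l2_regularized_cross_entropy and lambda_l2 are hypothetical names;
# this is NOT the paper's proposed large-margin regularizer.
import torch
import torch.nn.functional as F

def l2_regularized_cross_entropy(logits, targets, weights, lambda_l2=1e-4):
    # Cross-entropy fits the training data; the L2 term shrinks ||W||.
    # For a linear classifier the geometric margin scales as 1/||W||,
    # so the penalty controls margin width -- the regularized-logistic-
    # regression intuition the abstract builds on.
    ce = F.cross_entropy(logits, targets)
    l2 = sum(w.pow(2).sum() for w in weights)
    return ce + lambda_l2 * l2

# Toy usage: a linear model with 10 features and 5 classes.
torch.manual_seed(0)
W = torch.randn(10, 5, requires_grad=True)
x = torch.randn(32, 10)
y = torch.randint(0, 5, (32,))
loss = l2_regularized_cross_entropy(x @ W, y, [W])
loss.backward()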
Total citations
[Per-year citation histogram, 2019–2024; individual counts garbled in extraction]
Scholar articles