170904 Training Deep Neural Networks on Noisy Labels with Bootstrapping - Notes (TBC)


References
* What is the difference between the detection, recognition and identification of things?
* A Friendly Introduction to Cross-Entropy Loss
* The cross-entropy error function in neural networks
* Origins: traditional activation functions, research on the firing frequency of brain neurons, sparse activation
* Cross-entropy loss explanation
* [Machine Learning] An Introduction to Softmax Regression
* Visualizing weights and convolutions
* CNN (Keras example) on MNIST (from Keras's examples) with convolution visualization

The cross-entropy loss does not depend on the values of the incorrect-class probabilities. This is a key feature of multiclass log loss: it rewards or penalises only the probability assigned to the correct class. The loss value is independent of how the remaining probability mass is split among the incorrect classes.
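A minimal sketch illustrating this property: two predictions assign the same probability to the true class but split the leftover mass differently among the wrong classes, yet their cross-entropy losses are identical. The helper function name is my own choice for illustration.

```python
import math

def cross_entropy(probs, correct_idx):
    """Multiclass log loss for a single example: -log(p of the true class)."""
    return -math.log(probs[correct_idx])

# Both predictions give the true class (index 0) probability 0.7,
# but distribute the remaining 0.3 differently over the wrong classes.
p1 = [0.7, 0.2, 0.1]
p2 = [0.7, 0.15, 0.15]

loss1 = cross_entropy(p1, 0)
loss2 = cross_entropy(p2, 0)

# Identical losses: only the true-class probability enters the formula.
assert abs(loss1 - loss2) < 1e-12
print(loss1)  # -log(0.7) ≈ 0.3567
```

This is exactly why the gradient of softmax cross-entropy with respect to the logits takes the simple form `p - y`: the incorrect-class probabilities only matter through the normalisation inside the softmax, not through the loss itself.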

