Correlations of cross-entropy loss in machine learning
Abstract
Cross-entropy loss is crucial in training many deep neural networks. In this context, we show several novel and strong correlations among related divergence functions. In particular, we demonstrate that, in some circumstances, (a) cross-entropy is almost perfectly correlated with the little-known triangular divergence, and (b) cross-entropy is strongly correlated with the Euclidean distance over the logits from which the softmax is derived. The consequences of these observations are as follows. First, triangular divergence may be used as a cheaper alternative to cross-entropy. Second, logits can be used as features in a Euclidean space that is strongly synergistic with the classification process. This justifies the use of Euclidean distance over logits as a measure of similarity in cases where the network is trained using softmax and cross-entropy. We establish these correlations via empirical observation, supported by a mathematical explanation encompassing a number of strongly related divergence functions.
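The quantities compared in the abstract are simple to compute. The following is a minimal Python sketch, not from the paper, that evaluates cross-entropy H(p, q) = -Σ p_i log q_i, triangular divergence TD(p, q) = Σ (p_i - q_i)² / (p_i + q_i), and Euclidean distance over logits, then reports their Pearson correlations. The random-logit data and sample sizes here are purely illustrative assumptions; the strong correlations reported in the paper are observed under its own conditions (e.g., networks trained with softmax and cross-entropy), which random inputs will not necessarily reproduce.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    """Numerically stable softmax over the last axis."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(p, q, eps=1e-12):
    """Cross-entropy H(p, q) = -sum_i p_i * log(q_i)."""
    return -(p * np.log(q + eps)).sum(axis=-1)

def triangular_divergence(p, q):
    """Triangular divergence: sum_i (p_i - q_i)^2 / (p_i + q_i)."""
    num = (p - q) ** 2
    den = p + q
    return np.where(den > 0, num / den, 0.0).sum(axis=-1)

# Hypothetical data: random pairs of 10-dimensional logit vectors.
logits_a = rng.normal(size=(10_000, 10))
logits_b = rng.normal(size=(10_000, 10))
p, q = softmax(logits_a), softmax(logits_b)

ce = cross_entropy(p, q)
td = triangular_divergence(p, q)
l2 = np.linalg.norm(logits_a - logits_b, axis=-1)  # Euclidean distance over logits

# Pearson correlations corresponding to claims (a) and (b) in the abstract.
print("corr(CE, triangular divergence):", np.corrcoef(ce, td)[0, 1])
print("corr(CE, L2 over logits):       ", np.corrcoef(ce, l2)[0, 1])
```

Note that triangular divergence avoids the logarithm entirely, which is the basis of the paper's claim that it can serve as a cheaper alternative to cross-entropy.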
Citation
Connor, R., Dearle, A., Claydon, B., & Vadicamo, L. (2024). 'Correlations of cross-entropy loss in machine learning', Entropy, vol. 26, no. 6, 491. https://doi.org/10.3390/e26060491
Publication
Entropy
Status
Peer reviewed
ISSN
1099-4300
Type
Journal article