Pairwise Gaussian Loss for Convolutional Neural Networks


Disclaimer: The code links provided for this paper are external. Science Nest takes no responsibility for the accuracy, legality, or content of these links. By downloading the code, you agree to comply with the terms of use set out by its author(s).

Please contact us if a link here is broken.

Authors: Yuxiang Qin, Chungang Yan, Guanjun Liu, Zhenchuan Li, Changjun Jiang
Journal/Conference Name: IEEE Transactions on Industrial Informatics
Paper Abstract: Convolutional neural networks (CNNs) have demonstrated great competence in feature representation and have thereby achieved good performance on many classification tasks. Cross-entropy loss combined with softmax is arguably the most commonly used loss function in CNNs (generally called the softmax loss). However, the softmax loss can result in a weakly discriminative feature representation, since it focuses on interclass separability rather than intraclass compactness. This article proposes a pairwise Gaussian loss (PGL) for CNNs that addresses intraclass compactness by heavily penalizing similar sample pairs that lie a relatively large distance apart, while still ensuring good interclass separability. Experiments show that CNNs trained with PGL obtain better classification performance than those trained with the softmax loss and with other losses commonly used in CNNs. Our experiments also show that PGL converges stably under stochastic gradient descent optimization and generalizes well across different CNN structures.
Date of Publication: 2020
Code Programming Language: Python
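The abstract describes PGL only qualitatively, and the paper's exact formulation is not reproduced on this page. As a rough illustration of the idea (not the authors' method), the NumPy sketch below implements a generic pairwise Gaussian-style loss under assumed choices: Gaussian similarity exp(-d²/2σ²) over pairwise feature distances, penalizing same-class pairs that are far apart and different-class pairs that are close. The function name and the `sigma` parameter are hypothetical, introduced here for illustration.

```python
import numpy as np

def pairwise_gaussian_loss(features, labels, sigma=1.0):
    """Illustrative pairwise Gaussian-style loss (not the paper's exact formula).

    Same-class pairs are penalized when far apart (1 - Gaussian similarity);
    different-class pairs are penalized when close (Gaussian similarity).
    """
    features = np.asarray(features, dtype=float)
    labels = np.asarray(labels)
    n = features.shape[0]

    # Squared pairwise Euclidean distances via the expansion ||a-b||^2.
    sq_norms = np.sum(features ** 2, axis=1)
    d2 = sq_norms[:, None] + sq_norms[None, :] - 2.0 * features @ features.T
    d2 = np.maximum(d2, 0.0)  # guard against tiny negatives from round-off

    # Gaussian similarity in [0, 1]: 1 for identical points, -> 0 when far apart.
    sim = np.exp(-d2 / (2.0 * sigma ** 2))

    same = labels[:, None] == labels[None, :]
    iu = np.triu_indices(n, k=1)  # each unordered pair once, no self-pairs
    same, sim = same[iu], sim[iu]

    intra = np.sum(1.0 - sim[same])   # similar pairs far apart -> large penalty
    inter = np.sum(sim[~same])        # dissimilar pairs too close -> large penalty
    return (intra + inter) / max(len(sim), 1)
```

On tightly clustered classes the loss is near zero, while interleaved classes (same-class pairs far apart, different-class pairs close together) drive it toward one; in a CNN this term would be combined with the softmax loss during training.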

Copyright Researcher 2022