ConvNets with Smooth Adaptive Activation Functions for Regression
Within Neural Networks (NN), the parameters of Adaptive Activation Functions (AAF) control the shapes of the activation functions. These parameters are trained along with the other parameters in the NN. AAFs have improved the performance of Convolutional Neural Networks (CNN) in multiple classification tasks. In this paper, we propose and apply AAFs on CNNs for regression tasks. We argue that applying AAFs in the regression (second-to-last) layer of a NN can significantly decrease the bias of the regression NN. However, using existing AAFs may lead to overfitting. To address this problem, we propose a Smooth Adaptive Activation Function (SAAF) with a piecewise polynomial form that can approximate any continuous function to an arbitrary degree of accuracy, while having a bounded Lipschitz constant for given bounded model parameters. As a result, NNs with SAAFs can avoid overfitting by simply regularizing the model parameters. We empirically evaluated CNNs with SAAFs and achieved state-of-the-art results on age and pose estimation datasets.
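To make the idea concrete, below is a minimal NumPy sketch of the simplest (piecewise-linear) special case of such an activation: a trainable linear combination of hinge basis functions anchored at fixed breakpoints. The function names, breakpoint placement, and basis choice are illustrative assumptions, not the paper's exact construction (the paper uses higher-order polynomial pieces); the point it demonstrates is that the Lipschitz constant of the learned activation is bounded by the magnitude of the trainable coefficients, so ordinary parameter regularization bounds the activation's slope.

```python
import numpy as np

def adaptive_activation(x, coeffs, breakpoints):
    """Piecewise-linear adaptive activation (illustrative sketch).

    Output is sum_k coeffs[k] * max(0, x - breakpoints[k]).
    The slope on any interval is a partial sum of `coeffs`, so the
    Lipschitz constant is at most sum(|coeffs|): penalizing the
    coefficients (e.g. with an L1/L2 term) directly bounds how
    steep the learned activation can get.
    """
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    for c, b in zip(coeffs, breakpoints):
        out += c * np.maximum(0.0, x - b)  # hinge basis at breakpoint b
    return out

# With one coefficient and a breakpoint at 0, this reduces to a scaled ReLU.
y = adaptive_activation([-1.0, 0.0, 2.0], coeffs=[1.0], breakpoints=[0.0])
```

In a network, `coeffs` would be trained jointly with the other weights, and the regularizer on `coeffs` plays the role of the smoothness constraint described above.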
- Selected publications:
- Le Hou, Dimitris Samaras, Tahsin M. Kurc, Yi Gao, and Joel H. Saltz. ConvNets with Smooth Adaptive Activation Functions for Regression. International Conference on Artificial Intelligence and Statistics (AISTATS), 2017.