Paper Title
DiNET - A New Approach to Mitigating Vanishing Gradients with a Dense-Inception Network
Abstract
In classic Inception networks, auxiliary classifiers are added to the bottom or middle of the deep convolutional neural network to prevent the lower layers of the model from dying out during training. While this approach is effective at mitigating the vanishing gradient problem, it sacrifices efficiency. We develop a new model architecture, which we call DiNET, that combines the DenseBlock and the classical Inception module to investigate the trade-offs between model efficiency and the effectiveness of mitigating vanishing gradients. Our results show that DiNET1 achieves a modest performance improvement while reducing the number of model parameters. In future work, later versions of DiNET will be developed to shed more light on how strategically positioning more than one DiNET module (DiNETx) affects the trade-off between model efficiency and effectiveness at mitigating vanishing gradients.
Keywords - Deep Neural Network, Convolutional Neural Network, Image Classification, Machine Learning.
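To make the idea in the abstract concrete, the following is a minimal PyTorch sketch of what a Dense-Inception module could look like: a classic Inception-style multi-branch block whose output is densely concatenated with its own input, as in a DenseBlock, so that shallow layers receive a short gradient path without auxiliary classifiers. The class name DiNETModule, the branch widths, and the layer choices are illustrative assumptions, not the paper's exact configuration.

import torch
import torch.nn as nn

class DiNETModule(nn.Module):
    """Hypothetical Dense-Inception block: Inception branches + dense concat."""

    def __init__(self, in_channels, branch_channels=32):
        super().__init__()
        # Classic Inception branches: 1x1, 3x3, 5x5 convolutions and pooling.
        self.b1 = nn.Conv2d(in_channels, branch_channels, kernel_size=1)
        self.b3 = nn.Sequential(
            nn.Conv2d(in_channels, branch_channels, kernel_size=1),
            nn.Conv2d(branch_channels, branch_channels, kernel_size=3, padding=1),
        )
        self.b5 = nn.Sequential(
            nn.Conv2d(in_channels, branch_channels, kernel_size=1),
            nn.Conv2d(branch_channels, branch_channels, kernel_size=5, padding=2),
        )
        self.bp = nn.Sequential(
            nn.MaxPool2d(kernel_size=3, stride=1, padding=1),
            nn.Conv2d(in_channels, branch_channels, kernel_size=1),
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        branches = [self.b1(x), self.b3(x), self.b5(x), self.bp(x)]
        # Dense connectivity: concatenate the input with all branch outputs,
        # giving earlier layers a direct route for gradients to flow back.
        return self.act(torch.cat([x] + branches, dim=1))

# Usage: a 64-channel input yields 64 + 4 * 32 = 192 output channels.
module = DiNETModule(in_channels=64)
out = module(torch.randn(1, 64, 32, 32))  # -> torch.Size([1, 192, 32, 32])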