Skin Cancer Detection using Capsule Net

Abstract
In recent decades, numerous investigations have concentrated on the early detection of skin cancer using non-invasive or minimally invasive strategies in lieu of conventional excisional biopsy. Early, fast, and effective detection of skin cancer is crucial, as it makes treatment easier and more likely to succeed. Researchers are proficient at pre-processing skin images but struggle to identify efficient classifiers for skin cancer because of the wide variety of lesion sizes, colors, and shapes; no single classifier is sufficient for the task. Recently, convolutional neural networks (CNNs) have played an important role in deep learning, and classification using CNNs has proven successful in various fields. However, CNNs do not take important spatial relations between features into consideration, and they lack rotational invariance. A CNN classifies an image whenever certain features are present in the test data, ignoring their relative spatial arrangement, which results in false positives; the lack of rotational invariance causes rotated objects to be assigned to other classes, leading to false negatives. The capsule network (CapsNet) is designed to overcome these problems. Capsule networks replace pooling with capsules, trading mere translational invariance for equivariance, and use layer-wise squashing and dynamic routing. Unlike the scalar-output feature detectors of CNNs, the capsule network uses vector-output capsules, and it replaces max-pooling with routing by agreement. These mechanisms help avoid both false positives and false negatives. The capsule network architecture consists of several convolution layers with one capsule layer as the final layer. Hence, in the proposed work, skin cancer classification is performed with a CapsNet architecture that works well with high-dimensional hyperspectral images of skin.

Index Terms - Computer Aided Diagnosis, Skin Cancer Detection, Capsule Network, CNN.
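The layer-wise squashing mentioned above can be sketched as follows; this is a minimal NumPy illustration of the standard squashing non-linearity (the function name, `axis` handling, and `eps` stabilizer are illustrative, not from this paper):

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Squashing non-linearity used in capsule networks:
    # v = (||s||^2 / (1 + ||s||^2)) * (s / ||s||)
    # Short input vectors shrink toward length 0; long ones
    # approach (but never reach) unit length, so the capsule's
    # output length can act as a probability of entity presence.
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    scale = sq_norm / (1.0 + sq_norm)
    return scale * s / np.sqrt(sq_norm + eps)

# A capsule output with ||s|| = 5 is squashed to length 25/26 ~ 0.96.
v = squash(np.array([3.0, 4.0]))
```

The direction of the vector is preserved, so the capsule's instantiation parameters (pose, scale, etc.) survive the non-linearity while its length becomes interpretable as a presence probability.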