$\color{black}\rule{365px}{3px}$
“ImageNet Classification with Deep Convolutional Neural Networks” - 2012
Link to Paper: https://proceedings.neurips.cc/paper/2012/hash/c399862d3b9d6b76c8436e924a68c45b-Abstract.html
$\color{black}\rule{365px}{3px}$
Contributions
Won 1st place in the ImageNet Large Scale Visual Recognition Challenge (ILSVRC) 2012, with a top-5 test error of 15.3% versus 26.2% for the second-best entry.
Activation function
Popularized the ReLU activation in deep CNNs (ReLU itself predates the paper). Because ReLU does not saturate the way tanh and sigmoid do, training converges much faster: in the paper's CIFAR-10 experiment, a CNN with ReLU reached a 25% training error rate about six times faster than the same network with tanh.
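As a quick illustration (mine, not from the paper), a minimal NumPy sketch of the two activations:

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x). Non-saturating for positive inputs, so the
    # gradient stays at 1 there instead of shrinking toward 0.
    return np.maximum(0.0, x)

def tanh(x):
    # Saturating activation: the gradient (1 - tanh(x)^2) vanishes
    # as |x| grows, which slows gradient-based training.
    return np.tanh(x)

x = np.linspace(-3.0, 3.0, 7)
print(relu(x))  # [0. 0. 0. 0. 1. 2. 3.]
print(tanh(x))  # values squashed into (-1, 1)
```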
Overfitting
Data augmentation: horizontal flips and random 224×224 crops (taken from 256×256 images), plus PCA-based jittering of the RGB channel intensities, to enlarge the effective training set (see the sketch after this list).
Dropout: rate of 0.5, applied in the first two fully connected layers.
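A minimal PyTorch/torchvision sketch of both countermeasures (my approximation, not the paper's code; in particular, `ColorJitter` stands in for the paper's PCA-based RGB jitter):

```python
import torch
import torchvision.transforms as T

# Training-time augmentation in the spirit of AlexNet: random crops,
# horizontal flips, and color jitter (a stand-in for PCA lighting noise).
train_transform = T.Compose([
    T.Resize(256),
    T.RandomCrop(224),
    T.RandomHorizontalFlip(p=0.5),
    T.ColorJitter(brightness=0.4, contrast=0.4, saturation=0.4),
    T.ToTensor(),
])

# AlexNet-style classifier head: dropout with rate 0.5 precedes each of
# the first two fully connected layers (input is the flattened 256x6x6
# conv output, output is 1000 ImageNet classes).
classifier = torch.nn.Sequential(
    torch.nn.Dropout(p=0.5),
    torch.nn.Linear(256 * 6 * 6, 4096),
    torch.nn.ReLU(inplace=True),
    torch.nn.Dropout(p=0.5),
    torch.nn.Linear(4096, 4096),
    torch.nn.ReLU(inplace=True),
    torch.nn.Linear(4096, 1000),
)
```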
Local Response Normalization
Normalizes each activation across neighboring channels at the same spatial position, encouraging competition between feature maps; the paper reports it reduces top-1 and top-5 error rates by 1.4% and 1.2%.
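The paper's normalization, with hyperparameters $k=2$, $n=5$, $\alpha=10^{-4}$, $\beta=0.75$:

$$b^i_{x,y} = a^i_{x,y} \Big/ \left(k + \alpha \sum_{j=\max(0,\,i-n/2)}^{\min(N-1,\,i+n/2)} \left(a^j_{x,y}\right)^2 \right)^{\beta}$$

A short PyTorch sketch (the input shape is just an example, matching AlexNet's first conv layer output):

```python
import torch

# Local Response Normalization across channels. Note: PyTorch's
# LocalResponseNorm scales alpha by 1/size internally (alpha/n in its
# formula), so alpha = 5 * 1e-4 here corresponds to the paper's α = 1e-4.
lrn = torch.nn.LocalResponseNorm(size=5, alpha=5e-4, beta=0.75, k=2.0)

x = torch.randn(1, 96, 55, 55)  # e.g. conv1 output: 96 channels, 55x55
y = lrn(x)
print(y.shape)  # torch.Size([1, 96, 55, 55])
```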
$\color{black}\rule{365px}{3px}$