[Google_Bootcamp_Day19]
Convolutional Neural Network 3
1 * 1 convolutions
- Network in network
- Use a 1 * 1 convolution when you want to shrink the number of channels
- acts like a fully-connected layer applied to the channels at each spatial position
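A minimal tf.keras sketch (my own illustration, not from the course) of a 1 * 1 convolution shrinking a 28 * 28 * 192 volume down to 32 channels:

```python
import tensorflow as tf

# Illustrative input volume: 28 x 28 spatial positions, 192 channels
inputs = tf.keras.Input(shape=(28, 28, 192))

# A 1 * 1 convolution with 32 filters acts like a small fully-connected
# layer over the 192 channels at each position, shrinking 192 -> 32
x = tf.keras.layers.Conv2D(filters=32, kernel_size=1, activation="relu")(inputs)

model = tf.keras.Model(inputs, x)
model.summary()  # output shape: (None, 28, 28, 32)
```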
Inception Network
- Idea : instead of picking one filter size, let the network apply them all (1 * 1, 3 * 3, 5 * 5, and pooling) and concatenate the outputs, so it learns whatever combination of these filter sizes it wants to use
- Problem : computational cost (the larger convolutions, e.g. 5 * 5 over many channels, are expensive)
- Solution : use a 1 * 1 convolution as a bottleneck layer to reduce the number of channels before the expensive convolution
Inception Module
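A rough sketch of one inception module with 1 * 1 bottleneck layers (the filter counts loosely follow the familiar GoogLeNet pattern, but the shapes and numbers here are illustrative assumptions):

```python
import tensorflow as tf
from tensorflow.keras import layers

def inception_module(x, f1, f3_reduce, f3, f5_reduce, f5, pool_proj):
    # 1 * 1 branch
    b1 = layers.Conv2D(f1, 1, padding="same", activation="relu")(x)

    # 1 * 1 bottleneck followed by a 3 * 3 convolution
    b2 = layers.Conv2D(f3_reduce, 1, padding="same", activation="relu")(x)
    b2 = layers.Conv2D(f3, 3, padding="same", activation="relu")(b2)

    # 1 * 1 bottleneck followed by a 5 * 5 convolution
    b3 = layers.Conv2D(f5_reduce, 1, padding="same", activation="relu")(x)
    b3 = layers.Conv2D(f5, 5, padding="same", activation="relu")(b3)

    # 3 * 3 max pooling followed by a 1 * 1 projection
    b4 = layers.MaxPooling2D(3, strides=1, padding="same")(x)
    b4 = layers.Conv2D(pool_proj, 1, padding="same", activation="relu")(b4)

    # concatenate all branches along the channel axis
    return layers.Concatenate(axis=-1)([b1, b2, b3, b4])

inputs = tf.keras.Input(shape=(28, 28, 192))
out = inception_module(inputs, 64, 96, 128, 16, 32, 32)  # -> 28 x 28 x 256
model = tf.keras.Model(inputs, out)
```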
Practical Advice
- Use open-source implementations
- Use Transfer Learning
- If dataset = small, freeze the pre-trained model and train only the last few layers that you want to target
- If dataset = large, freeze fewer layers and train the others
- If dataset = very large, initialize weights with the pre-trained weights, then fine-tune the whole pre-trained model
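A minimal transfer-learning sketch for the small-dataset case (ResNet50, the input size, and the number of classes are my own assumptions for illustration):

```python
import tensorflow as tf

# Pre-trained base kept frozen; only the new head is trained
base = tf.keras.applications.ResNet50(include_top=False, weights="imagenet",
                                      input_shape=(224, 224, 3), pooling="avg")
base.trainable = False  # freeze all pre-trained layers

num_classes = 5  # placeholder for the target task
outputs = tf.keras.layers.Dense(num_classes, activation="softmax")(base.output)
model = tf.keras.Model(base.input, outputs)
model.compile(optimizer="adam", loss="categorical_crossentropy")

# With a very large dataset, set base.trainable = True instead and
# fine-tune the whole network starting from the pre-trained weights.
```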
- Data Augmentation
- Mirroring on the vertical axis
- Random cropping
- Color Shifting
- Rotation
- Shearing
- Local Warping
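Several of these augmentations can be applied on the fly with Keras preprocessing layers; a small sketch (parameter values are placeholders, and shearing / local warping would need extra tooling not shown here):

```python
import tensorflow as tf
from tensorflow.keras import layers

# Augmentation pipeline applied during training; values are illustrative
augment = tf.keras.Sequential([
    layers.RandomFlip("horizontal"),           # mirroring on the vertical axis
    layers.RandomCrop(height=200, width=200),  # random cropping
    layers.RandomContrast(0.2),                # a simple form of color shifting
    layers.RandomRotation(0.05),               # small random rotations
])

images = tf.random.uniform((8, 224, 224, 3))   # fake batch for illustration
augmented = augment(images, training=True)     # shape: (8, 200, 200, 3)
```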
[source] https://www.coursera.org/learn/convolutional-neural-networks