Question:
Activation function and Pooling
Author: Christian N

Answer:
• An activation function (ReLU) is applied to the feature map
• A (max) pooling layer groups together a selection of pixels and selects the max
• This identifies the area of the image where the filter found the best match
• It makes the network robust against small shifts in the image
• It also reduces the size of the input while keeping the important information
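The two steps above can be sketched in plain NumPy: ReLU is applied element-wise to the feature map, then a 2x2 max pooling groups non-overlapping blocks of pixels and keeps the strongest response, halving each spatial dimension. The function names and the 2x2 window size here are illustrative choices, not part of the original answer.

```python
import numpy as np

def relu(x):
    # ReLU zeroes out negative activations in the feature map
    return np.maximum(x, 0)

def max_pool_2x2(feature_map):
    # Group non-overlapping 2x2 pixel blocks and keep the max of each,
    # halving the spatial size while keeping the important information
    h, w = feature_map.shape
    h2, w2 = h // 2, w // 2
    blocks = feature_map[:h2 * 2, :w2 * 2].reshape(h2, 2, w2, 2)
    return blocks.max(axis=(1, 3))

fmap = np.array([
    [ 1., -2.,  0.,  3.],
    [-1.,  5., -4.,  2.],
    [ 0., -3.,  6., -1.],
    [ 2.,  1., -2.,  0.],
])
pooled = max_pool_2x2(relu(fmap))
print(pooled)  # -> [[5. 3.]
               #     [2. 6.]]
```

Because only the block maximum survives, shifting the strong activation by one pixel within its 2x2 window leaves the pooled output unchanged, which is the shift-robustness the answer describes.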