Why is ReLU used only on hidden layers specifically?

By Mahesh Pardeshi, 6 months ago

Use of the ReLU function

Tags: Deep learning, Activation function, ReLU
1 Answer
Shashankshnau1993@gmail.com

ReLU, defined as max(0, x), is used to extract features from the data. This is why it appears in the hidden layers, where the network learns which characteristics of the input are important, for example the ones that let the model classify correctly. Because ReLU is unbounded above, its outputs cannot be read as probabilities, so it is not suitable for the output. In the final fully connected (FC) layer, it is time to decide the output, so we usually use sigmoid or softmax, which squash values into the range 0 to 1 and can be interpreted as probabilities, giving an interpretable result.
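
As a minimal sketch of this split, assuming a toy two-layer classifier in plain NumPy with made-up layer sizes (4 inputs, 8 hidden units, 3 classes), not any particular framework's API:

    import numpy as np

    def relu(x):
        # ReLU: pass positive values through, zero out negatives.
        return np.maximum(0.0, x)

    def softmax(x):
        # Numerically stable softmax: outputs are non-negative and sum to 1.
        e = np.exp(x - np.max(x))
        return e / e.sum()

    # Hypothetical random weights, for illustration only.
    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(size=(8, 4)), np.zeros(8)
    W2, b2 = rng.normal(size=(3, 8)), np.zeros(3)

    x = rng.normal(size=4)      # one input example
    h = relu(W1 @ x + b1)       # hidden layer: ReLU extracts features
    p = softmax(W2 @ h + b2)    # output layer: softmax yields class probabilities

    print(p, p.sum())           # the entries of p sum to 1.0

Swapping ReLU into the output layer would still run, but the result would no longer be a probability distribution over the classes.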

