Why is ReLU used only on hidden layers specifically?

By Mahesh Pardeshi, 4 months ago

Use of the ReLU function

Tags: Deep learning, Activation function, ReLU
1 Answer
Shashankshnau1993@gmail.com

ReLU, defined as f(x) = max(0, x), supplies the non-linearity that lets hidden layers learn useful feature representations from the data. That is why it is used in the hidden layers, where the network is learning which characteristics of the input matter for the task, such as classification. In the final fully connected (FC) layer it is time to produce the output, so we usually use sigmoid or softmax instead, since they map scores to values between 0 and 1 that can be read as probabilities, giving an interpretable result.
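
For illustration, here is a minimal Keras sketch of that layout (the layer sizes, input shape, and 10-class setup are my own assumptions, not from the answer above): ReLU on the hidden layers to learn features, softmax on the output layer to produce class probabilities.

from tensorflow import keras

# Hidden layers use ReLU (max(0, x)) to learn non-linear features;
# the output layer uses softmax to turn raw scores into class probabilities.
model = keras.Sequential([
    keras.Input(shape=(784,)),                     # e.g. a flattened 28x28 image (assumed)
    keras.layers.Dense(128, activation="relu"),    # hidden layer: feature learning
    keras.layers.Dense(64, activation="relu"),     # hidden layer: more non-linearity
    keras.layers.Dense(10, activation="softmax"),  # output: probabilities over 10 classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()

For binary classification, a single output unit with a sigmoid activation would play the same role as the softmax layer here.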

