
Kajal Pawar

10 months ago

Tanh is also known as the **hyperbolic tangent function**. The
curves of the tanh and sigmoid functions are fairly similar, as we can
see from the image below.

hyperbolic tangent function

Let's compare the two. When the input is very large or very small,
the output of both functions is almost flat and the gradient is small, which is not conducive
to weight updates. The difference lies in the output interval.

The output interval of tanh is **(-1, 1)**, and the whole function is
**zero-centered**, which is better than sigmoid.
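In fact, tanh is just a rescaled and shifted sigmoid, via the identity tanh(z) = 2·sigmoid(2z) − 1, which is exactly why its output interval stretches from (0, 1) to (-1, 1). A minimal check of this identity (the `sigmoid` helper here is defined for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# tanh is a scaled, shifted sigmoid: tanh(z) = 2*sigmoid(2z) - 1,
# which maps sigmoid's (0, 1) output interval onto (-1, 1).
z = np.linspace(-3, 3, 7)
print(np.allclose(np.tanh(z), 2 * sigmoid(2 * z) - 1))  # True
```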

In general, for binary classification problems, the tanh function is
used in the hidden layers and the sigmoid function in the output
layer. However, this is not fixed: the activation function to
use must be chosen according to the specific problem, or determined by
experiment.
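That common arrangement can be sketched as a tiny forward pass. This is an illustrative sketch only; the layer sizes, weight names (`W1`, `W2`, etc.), and random initialization are assumptions, not part of any particular library:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical sizes for illustration: 3 inputs, 4 hidden units, 1 output.
W1 = rng.normal(size=(3, 4))
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))
b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)   # tanh in the hidden layer
    y = sigmoid(h @ W2 + b2)   # sigmoid in the output layer
    return y

x = rng.normal(size=(5, 3))    # a batch of 5 examples
probs = forward(x)
print(probs.shape)             # each output lies in (0, 1)
```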

The equation of the tanh function is given by:

$$\tanh(z) = \frac{e^{z} - e^{-z}}{e^{z} + e^{-z}}$$

The graph of the tanh function and its derivative can be shown as:

graph of the tanh function and its derivative

Writing a tanh function and its derivative is quite easy: we simply
define a function for each formula. It is implemented as shown below:

```
import numpy as np

def tanh_function(z):
    return (np.exp(z) - np.exp(-z)) / (np.exp(z) + np.exp(-z))
```

```
def tanh_prime_function(z):
    return 1 - np.power(tanh_function(z), 2)
```
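As a quick sanity check on the derivative formula, it can be compared against a central finite-difference approximation (the function names and test point here follow the definitions above; the step size `eps` is an arbitrary choice):

```python
import numpy as np

def tanh_function(z):
    return (np.exp(z) - np.exp(-z)) / (np.exp(z) + np.exp(-z))

def tanh_prime_function(z):
    return 1 - np.power(tanh_function(z), 2)

z = 0.5
eps = 1e-6
# Central difference: (f(z+eps) - f(z-eps)) / (2*eps) approximates f'(z)
numeric = (tanh_function(z + eps) - tanh_function(z - eps)) / (2 * eps)
analytic = tanh_prime_function(z)
print(abs(numeric - analytic) < 1e-6)  # True: the formula matches
```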

- Near the origin, tanh is quite similar to the identity function Y = X, so when activations are small a layer behaves almost linearly, which makes the training process relatively easier. Like sigmoid, however, tanh requires computing exponentials, which makes the neural network computationally heavier.

- The sigmoid function ranges from 0 to 1, but there might be a case where we would like the output of the artificial neuron to take a negative sign. This is where tanh (the hyperbolic tangent function) becomes very useful. Tanh is almost identical to sigmoid; the main difference is that its output varies from -1 to +1 and the function is centered at zero.

Most of the time, the tanh function is used in the hidden layers of a
neural network. Because its values lie between -1 and 1, the mean of a
hidden layer's outputs comes out to be 0 or very close to 0. Tanh
therefore helps center the data by bringing the mean close to 0, which
makes learning for the next layer much easier. So, the tanh function
is useful.
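The zero-centering effect is easy to demonstrate numerically. In this sketch the pre-activations are assumed to be roughly symmetric around 0, as is typical after input normalization:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Symmetric pre-activations, as is typical after normalization.
z = rng.normal(size=100_000)

# tanh keeps the activations roughly zero-mean,
# while sigmoid shifts their mean toward 0.5.
print(np.tanh(z).mean())   # close to 0
print(sigmoid(z).mean())   # close to 0.5
```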

- For the tanh activation function, the gradient is stronger than for the sigmoid function.

- Tanh also has the vanishing gradient problem similar to the sigmoid function.
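The vanishing-gradient behavior in the last bullet is visible directly from the derivative 1 - tanh(z)², which collapses toward 0 as |z| grows:

```python
import numpy as np

# The derivative 1 - tanh(z)**2 shrinks rapidly for large |z|,
# so neurons driven into saturation pass back almost no gradient.
for z in [0.0, 2.0, 5.0, 10.0]:
    grad = 1 - np.tanh(z) ** 2
    print(z, grad)
```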

```
# import libraries
import matplotlib.pyplot as plt
import numpy as np

# creating a tanh function and its derivative
def tanh(x):
    a = (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))
    da = 1 - a**2
    return a, da

b = np.arange(-4, 4, 0.01)

# Setup centered axes
fig, ax = plt.subplots(figsize=(9, 5))
ax.spines['left'].set_position('center')
ax.spines['bottom'].set_position('center')
ax.spines['right'].set_color('none')
ax.spines['top'].set_color('none')
ax.xaxis.set_ticks_position('bottom')
ax.yaxis.set_ticks_position('left')

# Create and show plot
ax.plot(b, tanh(b)[0], color="#307EC7", linewidth=3, label="tanh")
ax.plot(b, tanh(b)[1], color="#9621E2", linewidth=3, label="derivative")
ax.legend(loc="upper right", frameon=False)
fig.show()
```

The plot shown below is the output of the above code, which plots
tanh and its derivative.

output of the above code which plots the tanh function and its derivative

I hope you enjoyed reading this article and now know about the
**tanh activation function and its implementation using Python.**

For more such blogs/courses on data science, machine
learning, artificial intelligence and emerging new technologies do visit us at InsideAIML.

Thanks for reading…

Happy Learning…
