Non-Linearity

Non-linearity matters in deep learning because it lets the network learn complex patterns. The usual weighted-sum computation is a linear function, so if that is all a neuron does, the network can do little more than linear regression. The true pattern in the data, however, may not be linear.
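A minimal NumPy sketch makes this concrete (the shapes and random weights here are arbitrary, purely for illustration): stacking two weighted-sum layers with no activation in between collapses into a single linear map, so extra layers add no expressive power.

```python
import numpy as np

# Two "layers" that only compute weighted sums (no activation function).
# Weights are random placeholders, not a trained model.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)

def two_linear_layers(x):
    h = W1 @ x + b1      # first weighted sum
    return W2 @ h + b2   # second weighted sum

# The composition is itself one linear map W x + b:
W = W2 @ W1
b = W2 @ b1 + b2

x = rng.normal(size=3)
print(np.allclose(two_linear_layers(x), W @ x + b))  # True: no depth gained
```

Inserting a non-linear activation between the two layers breaks this collapse, which is exactly why activation functions are needed.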

Non-linearity is introduced through activation functions such as sigmoid and ReLU. Commonly used ones include: