Question (medium)

Which activation function is zero-centered and ranges between \(-1\) and \(1\)?

Hint

Sigmoid vs Tanh: Sigmoid outputs values between \(0\) and \(1\), while Tanh outputs values between \(-1\) and \(1\) and is zero-centered, which often leads to faster convergence in neural networks.
Updated On: Mar 16, 2026
  • (A) Sigmoid
  • (B) ReLU
  • (C) Tanh
  • (D) Softmax

The correct option is (C): Tanh.

Solution and Explanation

Step 1: Understanding the Question:
The question asks us to identify a specific activation function based on two properties: its output range is \((-1, 1)\) and its output is centered around zero.
Step 2: Detailed Explanation:
Let's examine the properties of each activation function listed in the options:

Sigmoid: The sigmoid function, \( \sigma(x) = \frac{1}{1+e^{-x}} \), has an output range of \( (0, 1) \). It is not zero-centered; its outputs are always positive.

ReLU (Rectified Linear Unit): The ReLU function, \( f(x) = \max(0, x) \), has an output range of \( [0, \infty) \). It is not zero-centered.

Tanh (Hyperbolic Tangent): The Tanh function, \( \tanh(x) = \frac{e^x - e^{-x}}{e^x + e^{-x}} \), has an output range of \( (-1, 1) \). Also, \( \tanh(0) = 0 \), which means its output is centered around zero. This matches the question's criteria.
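The two functions are directly related by the identity \( \tanh(x) = 2\sigma(2x) - 1 \), which is why Tanh's \((-1, 1)\) range follows from Sigmoid's \((0, 1)\) range. A minimal sketch (standard library only; the helper name `sigmoid` is our own) checks this identity and Tanh's odd symmetry numerically:

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid: output in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

for x in (-3.0, -0.5, 0.0, 0.5, 3.0):
    # Tanh is a rescaled, shifted sigmoid: tanh(x) = 2*sigmoid(2x) - 1
    assert abs(math.tanh(x) - (2.0 * sigmoid(2.0 * x) - 1.0)) < 1e-12
    # Zero-centered: tanh is an odd function, so tanh(-x) = -tanh(x)
    assert abs(math.tanh(-x) + math.tanh(x)) < 1e-12
```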

Softmax: The Softmax function is typically used in the output layer for multi-class classification. It converts a vector of numbers into a probability distribution, where each value is in the range \( (0, 1) \) and the sum of all values is 1. It is not zero-centered in the way Tanh is.
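The range claims above can be verified directly. Below is a hedged sketch using only the Python standard library (the function names are our own, not from any particular framework), evaluating each activation on a few sample inputs:

```python
import math

def sigmoid(x):
    """Sigmoid: output always in (0, 1), never negative."""
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    """ReLU: output in [0, inf); negative inputs are clipped to 0."""
    return max(0.0, x)

def softmax(xs):
    """Softmax: maps a vector to a probability distribution summing to 1."""
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
print([round(sigmoid(x), 3) for x in xs])    # all positive, sigmoid(0) = 0.5
print([round(math.tanh(x), 3) for x in xs])  # in (-1, 1), symmetric about 0
print([relu(x) for x in xs])                 # negatives clipped to 0.0
print([round(p, 3) for p in softmax(xs)])    # non-negative, sums to 1
```

Only `math.tanh` produces negative outputs for negative inputs while staying inside \((-1, 1)\), matching the question's criteria.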

Step 3: Final Answer:
The Tanh function is the only one among the options that is both zero-centered and has an output range between \(-1\) and \(1\).