Question (Medium)

Consider the two neural networks (NNs) shown in Figures 1 and 2, with ReLU activation (ReLU(z) = max{0, z} for all z ∈ R, where R denotes the set of real numbers). The connections and their corresponding weights are shown in the figures. The biases at every neuron are set to 0. For what values of p, q, r in Figure 2 are the two NNs equivalent, when x1, x2, x3 are positive?
Figure 1 and Figure 2 (network diagrams not reproduced)

Updated On: Nov 25, 2025
  • (A) p = 36, q = 24, r = 24
  • (B) p = 24, q = 24, r = 36
  • (C) p = 18, q = 36, r = 24
  • (D) p = 36, q = 36, r = 36

The correct option is (A).

Solution and Explanation

The correct choice is (A): p = 36, q = 24, r = 24. Since x1, x2, x3 are positive and the connection weights shown in Figure 1 are positive, every pre-activation in the network is positive, so each ReLU acts as the identity. The two-layer network in Figure 1 therefore collapses to a single linear map, and its effective weight on each input is the sum, over the hidden neurons, of (input-to-hidden weight) × (hidden-to-output weight). Evaluating these sums from the weights in Figure 1 gives p = 36, q = 24, and r = 24, matching the single-layer network of Figure 2.
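The collapse argument can be checked numerically. The sketch below uses hypothetical weights (the actual figure weights are not reproduced on this page) chosen so that the composed layer yields p = 36, q = 24, r = 24, and verifies that for positive inputs the deep ReLU network equals the single linear layer W2·W1:

```python
import numpy as np

# Hypothetical weights (assumption: not the actual Figure 1 values),
# chosen so the composition gives p = 36, q = 24, r = 24.
W1 = np.array([[3.0, 2.0, 2.0],    # inputs -> hidden neuron 1
               [3.0, 2.0, 2.0]])   # inputs -> hidden neuron 2
w2 = np.array([6.0, 6.0])          # hidden neurons -> output

def relu(z):
    return np.maximum(0.0, z)

def deep_net(x):
    # Figure-1-style network: linear, ReLU, linear, ReLU
    return relu(w2 @ relu(W1 @ x))

# With positive inputs and nonnegative weights, every pre-activation is
# nonnegative, so each ReLU is the identity and the network is linear
# with effective weights w2 @ W1 = (p, q, r).
composed = w2 @ W1
print(composed)  # effective single-layer weights

x = np.array([1.0, 2.0, 3.0])  # any positive input
print(deep_net(x), composed @ x)  # the two outputs agree
```

The key design point: equivalence only holds on the positive orthant with nonnegative weights; if any pre-activation could go negative, the ReLU would clip it and the composition would no longer be a single linear layer.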
