Question (Medium)

Consider a long straight wire of circular cross-section (radius \( a \)) carrying a steady current \( I \), distributed uniformly across the cross-section. The distances from the centre of the cross-section at which the magnetic field (inside the wire, outside the wire) equals half of the maximum magnetic field produced anywhere by the wire are:

Hint

For a uniformly current-carrying wire, the magnetic field inside increases linearly with the radial distance, and outside it decreases with distance. Use Ampère’s Law to derive such relationships.
Updated On: Jan 31, 2026
  • \( \left[ \frac{a}{2}, 3a \right] \)
  • \( \left[ \frac{a}{2}, 2a \right] \)
  • \( \left[ \frac{a}{4}, 2a \right] \)
  • \( \left[ \frac{a}{4}, \frac{3a}{2} \right] \)

The Correct Option is B

Solution and Explanation

By Ampère's law, the magnetic field inside the wire (\( r \le a \)) is
\[ B_{\text{in}}(r) = \frac{\mu_0 I r}{2\pi a^2}, \]
which increases linearly with \( r \) and reaches its maximum at the surface:
\[ B_{\max} = \frac{\mu_0 I}{2\pi a}. \]
Outside the wire (\( r \ge a \)),
\[ B_{\text{out}}(r) = \frac{\mu_0 I}{2\pi r}, \]
which decreases with distance.

Setting \( B_{\text{in}}(r) = \frac{1}{2} B_{\max} \) gives \( \frac{r}{a^2} = \frac{1}{2a} \), so \( r = \frac{a}{2} \) inside the wire. Setting \( B_{\text{out}}(r) = \frac{1}{2} B_{\max} \) gives \( \frac{1}{r} = \frac{1}{2a} \), so \( r = 2a \) outside the wire.

Consequently, the required pair of distances is \( \left[ \frac{a}{2}, 2a \right] \).
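The half-maximum radii can be checked numerically. The sketch below works in units where \( \mu_0 I / 2\pi = 1 \) and \( a = 1 \) (only ratios of distances matter, so this choice is harmless):

```python
# Numeric check of the half-maximum radii for a uniformly
# current-carrying straight wire (Ampere's law), in units
# where mu0*I/(2*pi) = 1 and the wire radius a = 1.

a = 1.0

def B(r):
    """Field magnitude at radial distance r from the axis."""
    if r <= a:
        return r / a**2   # inside:  B = (mu0 I / 2 pi a^2) * r
    return 1.0 / r        # outside: B = mu0 I / (2 pi r)

B_max = B(a)              # maximum field is at the surface r = a
half = B_max / 2.0

# Invert each branch at B = B_max / 2
r_inside = half * a**2    # from B = r/a^2  ->  r = B * a^2
r_outside = 1.0 / half    # from B = 1/r    ->  r = 1/B

print(r_inside, r_outside)  # -> 0.5 2.0, i.e. a/2 and 2a
```

The printed pair \( (a/2,\ 2a) \) confirms option B.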