Step 1: Understanding the Concept:
The "least count" of a measuring instrument is the smallest measurement that can be accurately made with that instrument. A micrometer (or micrometer screw gauge) is a precision instrument used to measure small dimensions with high accuracy. We need to find its standard least count.
Step 2: Key Formula or Approach:
The least count of a micrometer is calculated by the formula:
\[ \text{Least Count} = \frac{\text{Pitch of the screw}}{\text{Number of divisions on the circular (thimble) scale}} \]
- The pitch is the distance the screw moves forward in one complete rotation. For a standard metric micrometer, this is typically 0.5 mm.
- The circular scale (or thimble) is the rotating part with markings. For a standard metric micrometer, it has 50 divisions.
Step 3: Detailed Explanation:
Using the standard values for a metric micrometer:
- Pitch = 0.5 mm
- Number of divisions on circular scale = 50
Let's calculate the least count:
\[ \text{Least Count} = \frac{0.5 \text{ mm}}{50} = 0.01 \text{ mm} \]
This means that rotating the thimble by one division advances the screw by 0.01 mm, which is the smallest length the instrument can resolve.
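The calculation above can be sketched in a few lines of Python. The 0.5 mm pitch and 50 divisions are the standard values quoted above; the main-scale and thimble readings in the second part are a made-up example of how the least count is used in an actual measurement.

```python
# Least count = pitch / number of circular (thimble) scale divisions.
def least_count(pitch_mm: float, divisions: int) -> float:
    """Return the least count in mm for the given pitch and divisions."""
    return pitch_mm / divisions

lc = least_count(0.5, 50)
print(f"Least count: {lc:.2f} mm")

# Hypothetical reading: main scale shows 5.5 mm, thimble shows 28 divisions.
main_scale_mm = 5.5
thimble_divisions = 28
reading = main_scale_mm + thimble_divisions * lc
print(f"Reading: {reading:.2f} mm")
```

Any measurement taken with the instrument is a multiple of this least count added to the main-scale reading, which is why the least count sets the resolution.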
For comparison, the least count of a standard vernier caliper is typically 0.02 mm, and that of a meter scale is 1 mm, making the micrometer the most precise of the three.
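The comparison above can be tabulated directly; the three least-count values are those quoted in the text.

```python
# Least counts quoted above, in mm; sorted from finest to coarsest resolution.
instruments = {
    "micrometer": 0.01,
    "vernier caliper": 0.02,
    "meter scale": 1.0,
}
for name, lc in sorted(instruments.items(), key=lambda kv: kv[1]):
    print(f"{name}: {lc} mm")
```

A smaller least count means a finer resolution, so the micrometer comes first in the sorted output.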
Step 4: Final Answer:
The least count of a standard micrometer is 0.01 mm. Therefore, option (A) is the correct answer.