To solve this problem, we need to find the shortest distance from the point (5, 5) to a circle of unit radius that touches the two lines parallel to the coordinate axes passing through the point (2, 3). Below is a step-by-step explanation:
The two lines that are parallel to the coordinate axes and pass through the point (2, 3) are:

\(x = 2\) and \(y = 3\)
The circle of unit radius (radius = 1) touches both of these lines and lies on the side toward the origin. Since the circle is tangent to both lines, its center must be 1 unit away from each line, and because it lies on the origin's side, the center is 1 unit below \(y = 3\) and 1 unit to the left of \(x = 2\). Therefore, the center of the circle is:
\((2 - 1, 3 - 1) = (1, 2)\)
With the circle centered at (1, 2) and having a radius of 1, we now need to find the distance from the point (5, 5) to the center (1, 2) of this circle.
The distance \(d\) between two points \((x_1, y_1)\) and \((x_2, y_2)\) is given by the formula:
\(d = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}\)
Substituting the given points, we have:
\(d = \sqrt{(5 - 1)^2 + (5 - 2)^2} = \sqrt{4^2 + 3^2} = \sqrt{16 + 9} = \sqrt{25} = 5\)
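The distance computation above can be checked with a short sketch in Python (the points and the expected value are taken directly from the worked example):

```python
import math

# Distance from the external point (5, 5) to the circle's center (1, 2):
# sqrt((5 - 1)^2 + (5 - 2)^2) = sqrt(16 + 9) = sqrt(25)
d = math.hypot(5 - 1, 5 - 2)
print(d)  # 5.0
```

`math.hypot(dx, dy)` computes \(\sqrt{dx^2 + dy^2}\) directly, which avoids writing the squares and square root by hand.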
The shortest distance from the point (5, 5) to the circle is the distance from the point (5, 5) to the center (1, 2), minus the radius of the circle:
\(\text{Shortest Distance} = 5 - 1 = 4\)
Thus, the shortest distance from the point (5, 5) to the circle is 4.
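The full argument can be verified end to end with a small sketch: compute the center-to-point distance, then subtract the radius. Since the point (5, 5) lies outside the circle (distance 5 > radius 1), this difference is the shortest distance to the circle.

```python
import math

center = (1, 2)   # center of the unit circle found above
radius = 1
point = (5, 5)    # external point

# Shortest distance from an external point to a circle:
# distance to the center, minus the radius.
d = math.hypot(point[0] - center[0], point[1] - center[1])
shortest = d - radius
print(shortest)  # 4.0
```

This confirms the answer of 4 obtained above.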