Let \(\mathbb{R}\) and \(\mathbb{R}^3\) denote the set of real numbers and the three-dimensional vector space over it, respectively. The value of \(\alpha\) for which the set of vectors:
\[
\{[2 \,\, -3 \,\, \alpha], \, [3 \,\, -1 \,\, 3], \, [1 \,\, -5 \,\, 7]\}
\]
does not form a basis of \(\mathbb{R}^3\) is \(\_\_\_\_\).
Hint: To check whether vectors form a basis of \( \mathbb{R}^3 \), compute the determinant of the matrix they form. A zero determinant indicates linear dependence.
Step 1: Basis condition in \( \mathbb{R}^3 \).
A set of three vectors forms a basis for \( \mathbb{R}^3 \) if and only if the vectors are linearly independent. Equivalently, the determinant of the matrix having these vectors as rows must be non-zero.
Step 2: Form the matrix.
Arrange the given vectors as rows (or columns) of a matrix:
\[
A =
\begin{bmatrix}
2 & -3 & \alpha \\
3 & -1 & 3 \\
1 & -5 & 7
\end{bmatrix}.
\]
Step 3: Compute the determinant.
Expanding \( \det(A) \) along the first row:
\[
\det(A) = 2\big[(-1)(7) - (3)(-5)\big] - (-3)\big[(3)(7) - (3)(1)\big] + \alpha\big[(3)(-5) - (-1)(1)\big] = 2(8) + 3(18) - 14\alpha = 70 - 14\alpha.
\]
Step 4: Determine when the vectors are dependent.
The vectors are linearly dependent exactly when the determinant is zero:
\[
70 - 14\alpha = 0.
\]
Solving for \(\alpha\):
\[
\alpha = 5.
\]
The vectors fail to form a basis of \( \mathbb{R}^3 \) when \( \alpha = 5 \).
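As a quick sanity check (not part of the original solution), the determinant \( 70 - 14\alpha \) can be verified numerically by cofactor expansion along the first row; the helper names `det3` and `det_for` are illustrative choices, not from the source:

```python
# Hypothetical verification of the solution above: compute the 3x3
# determinant by cofactor expansion along the first row and confirm
# that it vanishes exactly at alpha = 5.

def det3(m):
    """Determinant of a 3x3 matrix given as nested lists (first-row expansion)."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def det_for(alpha):
    """det(A) for the matrix in Step 2 with the given alpha."""
    return det3([[2, -3, alpha], [3, -1, 3], [1, -5, 7]])

print(det_for(5))  # 0  -> rows dependent, not a basis
print(det_for(0))  # 70 -> rows independent, a basis
```

For any \(\alpha\), `det_for(alpha)` returns \( 70 - 14\alpha \), so \(\alpha = 5\) is the only value at which the set fails to be a basis.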