The entropy of a source is a measure of the uncertainty or information content associated with a random variable. For a source transmitting symbols from an alphabet of size \( n \), the entropy is maximized when all symbols are equally likely, and this maximum entropy \( H_{\text{max}} \) is given by:
\( H_{\text{max}} = \log_2 n \)
In this problem, the alphabet size is 16. Substituting this value into the formula:
\( H_{\text{max}} = \log_2 16 \)
Since \( 16 = 2^4 \), the expression simplifies to:

\( H_{\text{max}} = \log_2 2^4 = 4 \)
The computed maximum entropy is 4 bits, which falls within the expected range (4, 4), confirming the result.
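As a quick numerical check, the sketch below (function names are illustrative) computes the maximum entropy directly via \( \log_2 n \) and also via the entropy of a uniform distribution over 16 symbols, which should agree since the uniform distribution attains the maximum:

```python
import math

def max_entropy(n):
    """Maximum entropy in bits for an alphabet of n symbols."""
    return math.log2(n)

def uniform_entropy(n):
    """Entropy H = -sum(p * log2 p) with p = 1/n for each symbol."""
    p = 1.0 / n
    return -sum(p * math.log2(p) for _ in range(n))

print(max_entropy(16))      # → 4.0
print(uniform_entropy(16))  # → 4.0
```

Both computations yield 4 bits, matching the closed-form result above.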