Question (difficulty: medium)

A source transmits symbols from an alphabet of size 16. The value of maximum achievable entropy (in bits) is____.

Hint

Maximum entropy occurs when all symbols in the alphabet are equally likely. Use \(\log_2 N\) to calculate maximum entropy for a source with \(N\) symbols.
Updated On: Feb 12, 2026

Correct Answer: 4

Solution and Explanation

The entropy of a source is a measure of the uncertainty or information content associated with a random variable. For a source transmitting symbols from an alphabet of size \( n \), entropy is maximized when all \( n \) symbols are equally likely, each with probability \( 1/n \). In that case the maximum entropy \( H_{\text{max}} \) is given by:

\( H_{\text{max}} = \log_2 n \)

In this problem, the alphabet size is 16. Substituting this value into the formula:

\( H_{\text{max}} = \log_2 16 \)

Knowing that \( 16 = 2^4 \), we can simplify the expression:

\( H_{\text{max}} = 4 \)

The maximum achievable entropy of the source is therefore 4 bits, which matches the correct answer.
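The result can be checked numerically with a short sketch (the function name `entropy_bits` is illustrative, not from the original problem): computing the Shannon entropy of a uniform distribution over 16 symbols gives exactly 4 bits, while any skewed distribution over the same alphabet gives less.

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability symbols."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over a 16-symbol alphabet maximizes entropy.
uniform = [1 / 16] * 16
print(entropy_bits(uniform))  # 4.0, i.e. log2(16)

# A non-uniform distribution over the same alphabet has strictly lower entropy.
skewed = [0.5] + [0.5 / 15] * 15
print(entropy_bits(skewed) < 4.0)  # True
```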