Question (Difficulty: Medium)

Theorems/laws and their corresponding specified qualities are given below. Match List-I with List-II.

List-I (Theorem/Law):
(A) Shannon source coding theorem
(B) Dimensionality theorem
(C) Wiener-Khinchine theorem
(D) Shannon-Hartley law

List-II (Specified Quality):
(I) Channel capacity
(II) Storage space of a signal
(III) Power spectral density
(IV) Optimum code length

Hint

Create strong associations:
  • Shannon source coding \(\leftrightarrow\) entropy / optimum code length
  • Shannon-Hartley \(\leftrightarrow\) channel capacity
  • Wiener-Khinchine \(\leftrightarrow\) power spectrum \& autocorrelation
  • Dimensionality \(\leftrightarrow\) time-bandwidth product (signal space)
Updated On: Feb 18, 2026
  • A - I, B - II, C - III, D - IV
  • A - IV, B - III, C - II, D - I
  • A - I, B - II, C - IV, D - III
  • A - IV, B - II, C - III, D - I

The correct option is (D).

Solution and Explanation

Step 1: Match the following theorems/laws with their corresponding concepts.

(A) Shannon source coding theorem: This theorem, central to data compression and also known as the noiseless coding theorem, states that the minimum average number of bits per symbol needed to represent a source equals its entropy. It therefore defines the (IV) Optimum code length.
(B) Dimensionality Theorem: This theorem links a signal's time-bandwidth product to its dimensionality or degrees of freedom, essentially describing the (II) Storage space of a signal. Specifically, a signal with duration T and bandwidth W can be represented by 2WT samples.
(C) Wiener-Khinchine theorem: This theorem establishes that the (III) Power spectral density of a wide-sense-stationary random process is the Fourier transform of its autocorrelation function.
(D) Shannon-Hartley Law: This law provides the upper limit on the data rate of a communication channel. It defines the (I) Channel capacity (C) based on bandwidth (B) and signal-to-noise ratio (S/N), expressed as: \(C = B \log_2(1 + S/N)\).
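The four matches above can be checked numerically. The sketch below (a minimal illustration using NumPy; the probabilities, bandwidths, and signal values are assumed, not taken from the question) evaluates each theorem's defining quantity, including a discrete verification that the FFT of the circular autocorrelation equals the power spectrum:

```python
import numpy as np

# (A) Source coding: entropy = optimum average code length (bits/symbol)
p = np.array([0.5, 0.25, 0.125, 0.125])   # assumed symbol probabilities
H = -np.sum(p * np.log2(p))               # entropy = 1.75 bits/symbol

# (D) Shannon-Hartley: channel capacity C = B log2(1 + S/N)
B, snr = 3000.0, 1000.0                   # assumed: 3 kHz bandwidth, 30 dB SNR
C = B * np.log2(1 + snr)                  # capacity in bits/s

# (B) Dimensionality theorem: a signal of duration T and bandwidth W
# is described by about 2WT samples (its "storage space")
W, T = 4000.0, 2.0                        # assumed bandwidth and duration
dims = 2 * W * T                          # 16000 samples

# (C) Wiener-Khinchine: the PSD is the Fourier transform of the
# autocorrelation; for a finite sequence, the FFT of the circular
# autocorrelation equals |FFT(x)|^2 exactly.
rng = np.random.default_rng(0)
x = rng.standard_normal(256)
r = np.array([np.dot(x, np.roll(x, -k)) for k in range(len(x))])
psd_from_r = np.fft.fft(r).real           # FFT of autocorrelation
psd_direct = np.abs(np.fft.fft(x)) ** 2   # periodogram of x
assert np.allclose(psd_from_r, psd_direct)
```

Each computed quantity corresponds to one List-II entry: H to optimum code length, C to channel capacity, dims to storage space, and the final assertion to the PSD-autocorrelation pair.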

Step 2: Conclusion. The correct matches are A - IV, B - II, C - III, D - I, corresponding to option (D).