List of Practice Questions

The random variable \( X \) takes values in \( \{-1, 0, 1\} \) with probabilities \[ P(X = -1) = P(X = 1) = \alpha \quad \text{and} \quad P(X = 0) = 1 - 2\alpha, \quad 0 < \alpha < \frac{1}{2}. \] Let \( g(\alpha) \) denote the entropy of \( X \) (in bits), parameterized by \( \alpha \). Which of the following statements is/are TRUE?
  • GATE EC - 2025
  • GATE EC
  • Communication Systems
  • Information Theory
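A minimal sketch of the entropy function in the question above, assuming the standard definition \( H(X) = -\sum_x p(x) \log_2 p(x) \) (the question's answer options are not reproduced on this page):

```python
import math

def g(alpha: float) -> float:
    """Entropy (in bits) of X with P(X=-1) = P(X=1) = alpha and
    P(X=0) = 1 - 2*alpha, valid for 0 < alpha < 1/2."""
    return (-2 * alpha * math.log2(alpha)
            - (1 - 2 * alpha) * math.log2(1 - 2 * alpha))

# Entropy is maximized when all three outcomes are equally likely,
# i.e. alpha = 1/3, giving log2(3) ≈ 1.585 bits.
print(g(1 / 3))  # ≈ 1.585
print(g(0.25))   # 1.5 bits: -0.5*log2(0.25) - 0.5*log2(0.5)
```

Evaluating `g` at a few points (and comparing against `log2(3)`) is a quick way to check candidate statements about where \( g(\alpha) \) attains its maximum.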
X and Y are Bernoulli random variables taking values in \( \{0,1\} \). The joint probability mass function of the random variables is given by:
\( P(X = 0, Y = 0) = 0.06, \quad P(X = 0, Y = 1) = 0.14, \quad P(X = 1, Y = 0) = 0.24, \quad P(X = 1, Y = 1) = 0.56. \)
The mutual information \( I(X; Y) \) is ______ (rounded off to two decimal places).
  • GATE EC - 2025
  • GATE EC
  • Communication Systems
  • Information Theory
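A sketch of the computation for the joint PMF above, assuming the standard definition \( I(X;Y) = \sum_{x,y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)} \):

```python
import math

# Joint PMF from the question, keyed by (x, y)
p = {(0, 0): 0.06, (0, 1): 0.14, (1, 0): 0.24, (1, 1): 0.56}

# Marginal distributions of X and Y
px = {x: sum(v for (a, _), v in p.items() if a == x) for x in (0, 1)}
py = {y: sum(v for (_, b), v in p.items() if b == y) for y in (0, 1)}

# I(X;Y) = sum over (x, y) of p(x,y) * log2( p(x,y) / (p(x) p(y)) )
I = sum(v * math.log2(v / (px[x] * py[y])) for (x, y), v in p.items())
print(round(I, 2))  # 0.0 -- every p(x,y) equals p(x)*p(y), so X and Y are independent
```

Note that each joint probability factors exactly (e.g. \( P(X=0)P(Y=0) = 0.2 \times 0.3 = 0.06 \)), so every log term vanishes and the mutual information is 0.00 bits.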
© 2026 Patronum Web Private Limited