EC > GATE 2013 SET-4 > Information Theory
Let U and V be two independent and identically distributed random variables such that P(U = +1) = P(U = -1) = 1/2. The entropy H(U + V) in bits is
Correct answer: (c). The sum U + V takes the value -2 with probability 1/4, the value 0 with probability 1/2 (two of the four equally likely outcomes give U + V = 0), and the value +2 with probability 1/4. Hence H(U + V) = (1/4)log2(4) + (1/2)log2(2) + (1/4)log2(4) = 1/2 + 1/2 + 1/2 = 1.5 bits.
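The computation can be sanity-checked by enumerating the four equally likely (U, V) outcomes and evaluating the entropy of the resulting distribution of U + V. This snippet is an illustrative check, not part of the original question:

```python
from fractions import Fraction
from itertools import product
from math import log2

# Enumerate the four equally likely (U, V) pairs and tally P(U + V = s).
pmf = {}
for u, v in product([+1, -1], repeat=2):
    pmf[u + v] = pmf.get(u + v, Fraction(0)) + Fraction(1, 4)

# pmf is now {-2: 1/4, 0: 1/2, +2: 1/4}
H = -sum(float(p) * log2(float(p)) for p in pmf.values())
print(H)  # 1.5
```

The key point the code makes concrete: although U and V are each uniform on {-1, +1}, their sum is not uniform on {-2, 0, +2}, which is why the entropy is 1.5 bits rather than log2(3).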