EC > GATE 2017 SET-1 > Information Theory
Let (X1,X2) be independent random variables. X1 has mean 0 and variance 1, while X2 has mean 1 and variance 4. The mutual information I(X1;X2) between X1 and X2 in bits is
Correct : 0. Since X1 and X2 are independent, I(X1;X2) = H(X1) - H(X1|X2) = H(X1) - H(X1) = 0; the stated means and variances are irrelevant, because mutual information of independent random variables is always zero.
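The zero result can be checked numerically. The sketch below (an illustration, not part of the original question) computes I(X;Y) in bits from a joint pmf of two *discrete* variables; the question's X1 and X2 are continuous, but the independence argument is identical. The helper `mutual_information_bits` and the example marginals are hypothetical choices for this demonstration. Marginals with power-of-two probabilities are used so the outer product is exact in floating point:

```python
import numpy as np

def mutual_information_bits(joint):
    """I(X;Y) in bits from a joint pmf matrix joint[i, j] = P(X=i, Y=j)."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal of X (column vector)
    py = joint.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
    mask = joint > 0                        # skip zero-probability cells
    return float(np.sum(joint[mask] * np.log2(joint[mask] / (px * py)[mask])))

# For independent variables the joint pmf is the outer product of the
# marginals, so joint / (px * py) = 1 everywhere and every log2 term vanishes.
px = np.array([0.5, 0.5])                   # example marginal for X
py = np.array([0.25, 0.25, 0.25, 0.25])     # example marginal for Y
joint_indep = np.outer(px, py)
print(mutual_information_bits(joint_indep))  # → 0.0
```

Any dependence between the variables would make the ratio inside the logarithm deviate from 1 in some cell and push I(X;Y) above zero.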
Similar Questions
The random variable X takes values in {-1,0,1} with probabilities P(X=-1)=P(X=1)=α and P(X=0)=1-2α, where 0
X and Y are Bernoulli random variables taking values in {0,1}. The joint probability mass function of the random variables is given by: P(X=1,Y=1)=0.56, P(X=0,Y=0)...
Let H(X) denote the entropy of a discrete random variable X taking K possible distinct real values. Which of the following statements is/are necessarily true?