EC > GATE 2016 SET-1 > Information Theory
Consider a discrete memoryless source with alphabet S = {s0, s1, s2, s3, s4, …} and respective probabilities of occurrence P = {1/2, 1/4, 1/8, 1/16, 1/32, …}. The entropy of the source (in bits) is ______.
Correct answer: 2 bits
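The answer follows from the series H = Σ pₙ log2(1/pₙ) = Σ n/2ⁿ = 2 bits, since pₙ = 2⁻ⁿ means each symbol contributes n/2ⁿ. A quick numerical check (a sketch: the helper name `entropy_bits` and the 50-symbol truncation are choices made here, not part of the question):

```python
# Numerically verify the entropy of the geometric source.
# p_n = 2^(-n), so each term contributes p_n * log2(1/p_n) = n / 2^n,
# and the infinite series sums to 2 bits.
import math

def entropy_bits(probs):
    """Shannon entropy, in bits, of a list of probabilities."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Truncate the infinite alphabet at 50 symbols; the tail contribution
# is on the order of 50/2^50 and does not affect the result.
probs = [2.0 ** -(n + 1) for n in range(50)]
print(round(entropy_bits(probs), 6))  # → 2.0
```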
Similar Questions
The random variable X takes values in {-1, 0, 1} with probabilities P(X=-1) = P(X=1) = α and P(X=0) = 1-2α, where 0 …
X and Y are Bernoulli random variables taking values in {0, 1}. The joint probability mass function of the random variables is given by: P(X=1, Y=1) = 0.56, P(X=0, Y=0) …
Let H(X) denote the entropy of a discrete random variable X taking K possible distinct real values. Which of the following statements is/are necessarily true?