
UGC-NET | UGC NET CS 2015 Dec – III | Question 49

Consider the conditional entropy and mutual information for the binary symmetric channel. The input source has alphabet X = {0, 1} with associated probabilities {1/2, 1/2}. The channel matrix is

P = | 1 – p     p   |
    |   p     1 – p |

where p is the transition probability. Then the conditional entropy is given by:

(A) 1
(B) – p log(p) – (1 – p) log(1 – p)
(C) 1 + p log(p) + (1 – p) log(1 – p)
(D) 0

Answer: (B)
Explanation:
In a binary symmetric channel with transition probability p, each transmitted bit is flipped with probability p and delivered unchanged with probability 1 – p. For either input symbol, the uncertainty about the output is therefore H(Y | X = x) = – p log(p) – (1 – p) log(1 – p). Averaging over the two equiprobable inputs leaves this value unchanged, so the conditional entropy is

H(Y | X) = – p log(p) – (1 – p) log(1 – p),

which is option (B). Option (C), 1 + p log(p) + (1 – p) log(1 – p), is the mutual information I(X; Y) of this channel, not the conditional entropy.
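As a sanity check, here is a minimal Python sketch (the function name conditional_entropy_bsc is introduced here for illustration; base-2 logarithms are assumed, giving entropy in bits) that evaluates option (B) for sample transition probabilities:

```python
import math

def conditional_entropy_bsc(p):
    # H(Y|X) for a binary symmetric channel with transition probability p.
    # For either input symbol the output is flipped with probability p, so
    # H(Y|X=x) = -p log(p) - (1-p) log(1-p); averaging over the two
    # equiprobable inputs leaves this value unchanged.
    if p in (0.0, 1.0):
        return 0.0  # degenerate channel: the output is deterministic given the input
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(conditional_entropy_bsc(0.1))  # ~0.469 bits
print(conditional_entropy_bsc(0.5))  # 1.0 bit: a completely noisy channel
```

Note that at p = 0.5 the conditional entropy reaches its maximum of 1 bit, matching the intuition that a completely random channel conveys no information about the input.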
