# UGC-NET | UGC NET CS 2015 Dec – III | Question 49

• Last Updated : 09 May, 2018

Consider the conditional entropy and mutual information for the binary symmetric channel. The input source has alphabet X = {0, 1} and associated probabilities {1/2, 1/2}. The channel matrix is

$$\begin{pmatrix} 1-p & p \\ p & 1-p \end{pmatrix}$$

where p is the transition probability. Then the conditional entropy H(Y|X) is given by:

(A) 1
(B) – plog(p) – (1 – p)log(1 – p)
(C) 1 + p log(p) + (1 – p)log(1 – p)
(D) 0
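
The options can be checked numerically. For a binary symmetric channel, H(Y|X) = Σₓ P(x) H(Y|X=x), and each row of the channel matrix is a permutation of (1−p, p), so every conditioning term equals the binary entropy H(p) = −p log₂ p − (1−p) log₂(1−p), matching option (B). Below is a minimal sketch of that computation; the function names are illustrative, not from the original question:

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with the convention 0*log 0 = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_conditional_entropy(p):
    """H(Y|X) for a BSC with uniform input probabilities {1/2, 1/2}.

    H(Y|X) = sum over x of P(x) * H(Y|X=x), where row x of the
    channel matrix gives the distribution of Y given X = x.
    """
    px = [0.5, 0.5]                       # input distribution
    rows = [[1 - p, p], [p, 1 - p]]       # channel matrix rows
    h = 0.0
    for P, row in zip(px, rows):
        for q in row:
            if q > 0:
                h -= P * q * math.log2(q)
    return h

# For any transition probability p, H(Y|X) equals the binary
# entropy of p, i.e. option (B): -p*log(p) - (1-p)*log(1-p).
for p in (0.0, 0.1, 0.25, 0.5):
    assert abs(bsc_conditional_entropy(p) - binary_entropy(p)) < 1e-12
```

Note that option (C), 1 + p log(p) + (1 − p) log(1 − p), is the channel capacity 1 − H(p) of the BSC, a common distractor here.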