UGC-NET | UGC NET CS 2015 Dec – III | Question 49
Last Updated: 09 May, 2018

Consider the conditional entropy and mutual information for the binary symmetric channel. The input source has alphabet X = {0, 1} with associated probabilities {1/2, 1/2}. The channel matrix is

    P = | 1 – p     p   |
        |   p     1 – p |

where p is the transition probability. Then the conditional entropy is given by:

(A) 1
(B) – p log(p) – (1 – p) log(1 – p)
(C) 1 + p log(p) + (1 – p) log(1 – p)
(D) 0


Answer: (B)

Explanation:
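For the binary symmetric channel, each row of the channel matrix has entries {1 – p, p}, so conditioned on either input symbol the output entropy is the binary entropy H(p) = – p log(p) – (1 – p) log(1 – p). Averaging over the equiprobable inputs, H(Y|X) = (1/2)·H(p) + (1/2)·H(p) = H(p), which matches option (B). A minimal sketch verifying this numerically (the function names here are illustrative, not from the original question):

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p)."""
    if p in (0.0, 1.0):
        return 0.0  # by convention, 0*log(0) = 0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def conditional_entropy_bsc(p):
    """H(Y|X) for a BSC with crossover probability p and
    equiprobable inputs: each row of the channel matrix
    contributes H(p), weighted by P(X=x) = 1/2."""
    return 0.5 * binary_entropy(p) + 0.5 * binary_entropy(p)

# H(Y|X) equals the binary entropy of p for any p
print(conditional_entropy_bsc(0.1))  # ≈ 0.4690
```

Note also that with equiprobable inputs the mutual information is I(X; Y) = H(Y) – H(Y|X) = 1 – H(p), which is option (C) with the sign pattern 1 + p log(p) + (1 – p) log(1 – p); the question asks for the conditional entropy, hence (B).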
