
UGC-NET | UGC NET CS 2015 Dec – III | Question 49


Consider the conditional entropy and mutual information for the binary symmetric channel. The input source has alphabet X = {0, 1} and associated probabilities {1/2, 1/2}. The channel matrix is

P = | 1 – p     p   |
    |   p     1 – p |

where p is the transition probability. Then the conditional entropy is given by:

(A) 1
(B) – p log(p) – (1 – p) log(1 – p)
(C) 1 + p log(p) + (1 – p) log(1 – p)
(D) 0


Answer: (B)

Explanation:

For a binary symmetric channel with transition probability p, each transmitted bit is received correctly with probability 1 – p and flipped with probability p. Every row of the channel matrix therefore has the same entropy, so the conditional entropy is

H(Y | X) = Σx P(x) · H(Y | X = x)
         = (1/2) · [– p log(p) – (1 – p) log(1 – p)] + (1/2) · [– p log(p) – (1 – p) log(1 – p)]
         = – p log(p) – (1 – p) log(1 – p),

which is the binary entropy function H(p). This matches option (B). For comparison, option (C), 1 + p log(p) + (1 – p) log(1 – p), equals 1 – H(p), which is the mutual information I(X; Y) of this channel with equiprobable inputs.
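As a quick numerical check, here is a minimal Python sketch (not part of the original question) that evaluates H(Y|X) and I(X;Y) for a few values of p. The helper names binary_entropy, bsc_conditional_entropy, and bsc_mutual_information are chosen here purely for illustration.

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H(p) in bits; H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_conditional_entropy(p: float) -> float:
    """H(Y|X) for a binary symmetric channel with transition probability p.
    Every row of the channel matrix has the same entropy H(p), so
    H(Y|X) = H(p) regardless of the input distribution."""
    return binary_entropy(p)

def bsc_mutual_information(p: float) -> float:
    """I(X;Y) = H(Y) - H(Y|X); with equiprobable inputs H(Y) = 1 bit."""
    return 1.0 - binary_entropy(p)

if __name__ == "__main__":
    for p in (0.0, 0.1, 0.25, 0.5):
        print(f"p = {p:4.2f}  H(Y|X) = {bsc_conditional_entropy(p):.4f}  "
              f"I(X;Y) = {bsc_mutual_information(p):.4f}")
```

At p = 0.5 the output confirms the intuition: H(Y|X) reaches its maximum of 1 bit and the mutual information drops to 0, since the channel output is then independent of the input.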

