UGC-NET | UGC NET CS 2015 Dec – III | Question 49


Consider the conditional entropy and mutual information for the binary symmetric channel. The input source has alphabet X = {0, 1} and associated probabilities {1/2, 1/2}. The channel matrix is

    P = | 1 - p     p   |
        |   p     1 - p |

where p is the transition probability. Then the conditional entropy is given by:

(A) 1
(B) -p log(p) - (1 - p) log(1 - p)
(C) 1 + p log(p) + (1 - p) log(1 - p)
(D) 0


Answer: (B)

Explanation:

For a binary symmetric channel, each row of the channel matrix is a permutation of (1 - p, p), so the uncertainty about the output given any particular input is the binary entropy H(p) = -p log(p) - (1 - p) log(1 - p). Averaging over the two equiprobable inputs:

H(Y | X) = (1/2) H(p) + (1/2) H(p) = -p log(p) - (1 - p) log(1 - p)

Hence option (B) is correct. (The mutual information mentioned in the question, though not asked for here, would then be I(X; Y) = H(Y) - H(Y | X) = 1 - H(p), i.e. option (C).)

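As a quick numerical check (a sketch added here, not part of the original solution), the short Python snippet below evaluates the conditional entropy of a BSC for a sample crossover probability p = 0.1; the helper names binary_entropy and bsc_conditional_entropy are illustrative:

import math

def binary_entropy(p):
    # H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0 by convention.
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_conditional_entropy(p, px=(0.5, 0.5)):
    # Each row of the BSC channel matrix is a permutation of (1-p, p),
    # so H(Y|X=x) = H(p) for both inputs; averaging over the input
    # distribution therefore gives H(p) regardless of px.
    return sum(prob * binary_entropy(p) for prob in px)

p = 0.1
print(bsc_conditional_entropy(p))  # 0.4689955935892812
print(binary_entropy(p))           # identical: H(Y|X) = H(p)

With p = 0.1 both calls print H(0.1) ≈ 0.469 bits, matching the closed form in option (B).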
Last Updated : 09 May, 2018