# Conditional entropy

This online calculator computes the entropy of random variable Y conditioned on random variable X, and of X conditioned on Y, given a joint distribution table (X, Y) ~ p.

The conditional entropy **H(Y|X)** is the amount of information needed to describe the outcome of a random variable *Y* given that the value of another random variable *X* is known.

In order to calculate the conditional entropy, we need to know the joint distribution of *X* and *Y*. Below you should enter the matrix where the cell value for the *i*-th row and *j*-th column represents the probability of the outcome $p(x_i, y_j)$. Rows represent the values of the *X* random variable, and columns represent the values of the *Y* random variable.

Note that you can click "Show details" to view the details of the calculation. The formula used in the calculation is explained below the calculator.
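As a minimal sketch of the matrix layout described above (the joint distribution values here are purely illustrative), the marginal $p(x)$ for each row is simply the row sum:

```python
# Hypothetical joint distribution p(x, y):
# rows are values of X, columns are values of Y.
joint = [
    [1/2, 1/4],   # p(x1, y1), p(x1, y2)
    [0.0, 1/4],   # p(x2, y1), p(x2, y2)
]

# Marginal p(x): sum each row over all values of Y.
p_x = [sum(row) for row in joint]
print(p_x)  # [0.75, 0.25]
```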

### Conditional entropy formula

The conditional entropy of *Y* given *X* is defined as

$$H(Y|X) = -\sum_{x \in X,\, y \in Y} p(x, y) \log_2 \frac{p(x, y)}{p(x)}$$

It is assumed that the expressions $0 \log_2 0$ and $0 \log_2 \frac{0}{0}$ should be treated as being equal to zero.

$p(x)$ for each row is calculated by summing the row values (that is, summing the cells for each value of the *X* random variable), and the $p(x, y)$ values are already given by the input matrix.
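The definition above translates almost directly into code. This is a minimal sketch (the function name and the example matrix are illustrative, not part of the calculator): zero cells are skipped, which implements the convention that $0 \log_2 0$ counts as zero.

```python
import math

def conditional_entropy(joint):
    """H(Y|X) = -sum over x, y of p(x,y) * log2(p(x,y) / p(x)),
    where p(x) is the row sum and 0 * log(0) is treated as zero."""
    h = 0.0
    for row in joint:
        p_x = sum(row)              # marginal p(x) = sum of the row
        for p_xy in row:
            if p_xy > 0:            # skip zero cells (0 * log 0 := 0)
                h -= p_xy * math.log2(p_xy / p_x)
    return h

joint = [[1/2, 1/4],
         [0.0, 1/4]]
print(conditional_entropy(joint))  # ≈ 0.6887
```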

What is the meaning of this formula?

In fact, it is the weighted average of the specific conditional entropies over all possible values of *X*.

The specific conditional entropy of *Y* for *X* taking the value *v* is the entropy of *Y* among only those outcomes in which *X* has the value *v*. That is,

$$H(Y|X=v) = -\sum_{y \in Y} P(Y=y|X=v) \log_2 P(Y=y|X=v)$$

So, the conditional entropy, expressed as the weighted sum of the specific conditional entropies over all possible values of *X* with $p(x)$ as the weights, is

$$H(Y|X) = \sum_{x \in X} p(x)\, H(Y|X=x)$$
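The weighted-sum form can be sketched in the same way (again with an illustrative example matrix; the helper name is not from the calculator): compute the entropy of each row's conditional distribution, then weight each by that row's marginal $p(x)$.

```python
import math

def specific_entropy(row):
    """H(Y|X=v): entropy of Y restricted to the row where X = v."""
    p_v = sum(row)  # marginal p(X = v)
    return -sum((p / p_v) * math.log2(p / p_v) for p in row if p > 0)

joint = [[1/2, 1/4],
         [0.0, 1/4]]

# H(Y|X) = sum over x of p(x) * H(Y|X=x)
h = sum(sum(row) * specific_entropy(row) for row in joint)
print(h)  # ≈ 0.6887
```

This should agree with the direct summation form of $H(Y|X)$ given earlier, since the two formulas are algebraically identical.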