joint entropy
The joint entropy of two variables \(X\) and \(Y\) is
\[
H(X, Y) = -\sum_{x, y} p(x, y) \log p(x, y).
\]
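As a quick check of the definition, here is a minimal Python sketch that computes the joint entropy of a small joint distribution. The table `p_xy` and the function name `joint_entropy` are hypothetical, chosen just for illustration:

```python
import math

# Hypothetical joint distribution p(x, y) over two binary variables,
# stored as a nested dict p_xy[x][y]; the entries sum to 1.
p_xy = {
    0: {0: 0.4, 1: 0.1},
    1: {0: 0.1, 1: 0.4},
}

def joint_entropy(p_xy, base=2):
    """H(X, Y) = -sum over (x, y) of p(x, y) * log p(x, y), skipping zero cells."""
    total = 0.0
    for row in p_xy.values():
        for p in row.values():
            if p > 0:
                total -= p * math.log(p, base)
    return total

print(joint_entropy(p_xy))  # roughly 1.72 bits for this table
```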
The lower bound of joint entropy is
\[
H(X, Y) \geq \max\{H(X), H(Y)\},
\]
which means that you cannot reduce entropy by adding another unknown variable: the joint entropy is always at least as large as the larger of the individual entropies.
An upper bound of joint entropy is
\[
H(X, Y) \leq H(X) + H(Y).
\]
Equality is achieved when they are independent, implying that the information revealed by doing the two experiments together is exactly equal to the sum of the information from doing each experiment individually. If they happen to be dependent, then the two experiments share some information, so the joint entropy is lower (we need fewer bits to express all the information perfectly).
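A standard two-coin example (not from the discussion above) illustrates both bounds. Let \(X\) and \(Y\) be independent fair coin flips, so \(H(X) = H(Y) = 1\) bit. Then
\[
H(X, Y) = H(X) + H(Y) = 2 \text{ bits},
\]
meeting the upper bound. If instead \(Y = X\) (fully dependent), the pair has only two equally likely outcomes, so
\[
H(X, Y) = 1 \text{ bit} = \max\{H(X), H(Y)\},
\]
hitting the lower bound instead.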
Finally, joint entropy is closely related to conditional entropy, in that
\[
H(X, Y) = H(X) + H(Y \mid X).
\]
The information from performing the two experiments together is exactly the information from \(Y\) given that we have seen \(X\), plus the information from doing \(X\), recalling the additivity of entropy. If they are actually independent, then the conditional entropy reduces to just \(H(Y)\) and we recover the same result as above.
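To make the chain rule concrete, here is a minimal Python sketch, again using a hypothetical joint table and helper names (`p_xy`, `entropy`), that computes \(H(X)\), \(H(Y \mid X)\), and \(H(X, Y)\) and checks that the first two sum to the third:

```python
import math

def entropy(probs, base=2):
    """Shannon entropy of an iterable of probabilities, ignoring zero entries."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y), stored as a nested dict p_xy[x][y].
p_xy = {
    0: {0: 0.4, 1: 0.1},
    1: {0: 0.1, 1: 0.4},
}

# Marginal p(x), the marginal entropy H(X), and the joint entropy H(X, Y).
p_x = {x: sum(row.values()) for x, row in p_xy.items()}
h_x = entropy(p_x.values())
h_xy = entropy(p for row in p_xy.values() for p in row.values())

# Conditional entropy H(Y | X) = sum over x of p(x) * H(Y | X = x).
h_y_given_x = sum(
    p_x[x] * entropy(p / p_x[x] for p in row.values())
    for x, row in p_xy.items()
)

# Chain rule: H(X, Y) = H(X) + H(Y | X).
print(h_xy, h_x + h_y_given_x)  # both are about 1.72 bits for this table
```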