Question

A channel's capacity is this quantity between its input and output, maximized over input distributions. For 10 points each:
[10h] Identify this quantity that measures how much observing one random variable tells you about another random variable. For two random variables X and Y, this quantity equals the sum of p(X,Y) ["p-of-x-y"] times the log of p(X,Y) over p(X) ["p-of-x"] times p(Y) ["p-of-y"].
ANSWER: mutual information [prompt on information]
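The formula read in the hard part can be sketched directly. The joint distribution below is a made-up example chosen only for illustration; the names `p_xy`, `p_x`, and `p_y` are my own.

```python
import math

# Hypothetical joint distribution p(X, Y) over two binary variables
# (illustrative values, not from the question).
p_xy = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

# Marginals p(X) and p(Y) obtained by summing out the other variable.
p_x = {x: sum(p for (xx, _), p in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yy), p in p_xy.items() if yy == y) for y in (0, 1)}

# I(X; Y) = sum over x, y of p(x, y) * log2( p(x, y) / (p(x) * p(y)) )
mi = sum(p * math.log2(p / (p_x[x] * p_y[y]))
         for (x, y), p in p_xy.items() if p > 0)
print(round(mi, 4))
```

With these values the variables are positively correlated, so the mutual information comes out strictly positive.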
[10m] Mutual information is shown to be non-negative using this statement, which states that for a convex function phi, phi of the expectation value of X is less than or equal to the expectation value of phi of X.
ANSWER: Jensen’s inequality [or Jensen inequality]
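The inequality in the medium part can be checked numerically: for a convex phi, phi(E[X]) is at most E[phi(X)]. This sketch uses phi(x) = x² on an arbitrary random sample of my own choosing.

```python
import random
random.seed(0)

# Jensen's inequality for a convex function: phi(E[X]) <= E[phi(X)].
# phi(x) = x**2 is convex; the sample is arbitrary.
xs = [random.uniform(-1, 3) for _ in range(10_000)]
phi = lambda x: x * x

e_x = sum(xs) / len(xs)                      # E[X]
e_phi = sum(phi(x) for x in xs) / len(xs)    # E[phi(X)]
print(phi(e_x) <= e_phi)  # True
```

For phi(x) = x² the gap E[phi(X)] - phi(E[X]) is exactly the sample variance, so the inequality is strict whenever the sample is non-constant.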
[10e] Two answers required. The most commonly used unit for mutual information is a bit, the entropy of a fair random variable that takes either of these two values. Binary representation uses these two digits to express any number.
ANSWER: 0 and 1
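The easy part's claim that a fair 0/1 variable carries one bit follows from the Shannon entropy formula; a minimal sketch (the helper `entropy_bits` is my own naming):

```python
import math

# Shannon entropy in bits: H = -sum_i p_i * log2(p_i).
def entropy_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin over {0, 1} has exactly one bit of entropy.
print(entropy_bits([0.5, 0.5]))  # 1.0
```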
<Leo Law, Other Science>


Summary

Tournament | Date | Exact Match? | Heard | PPB | Easy % | Medium % | Hard %
2023 Penn Bowl @ Waterloo | 10/28/2023 | Y | 1 | 20.00 | 100% | 100% | 0%
2023 Penn Bowl (Harvard) | 10/21/2023 | Y | 1 | 10.00 | 100% | 0% | 0%
2023 Penn Bowl (UK) | 10/28/2023 | Y | 5 | 20.00 | 100% | 80% | 20%
2023 Penn Bowl @ UNC | 10/28/2023 | Y | 1 | 30.00 | 100% | 100% | 100%

Data

Team | Opponent | Part 1 | Part 2 | Part 3 | Total
Toronto Weary | Library of Babel School of Continuing Studies | 0 | 10 | 10 | 20
MIT | Brown | 0 | 0 | 10 | 10
Four Neg Omelette | 3HK1MM | 0 | 0 | 10 | 10
Edinburgh | Broken Hearts | 0 | 10 | 10 | 20
Betrayed by Rita Izzatdust | Tabearnacle | 10 | 10 | 10 | 30
Old London Town | Yes, Moderator | 0 | 10 | 10 | 20
Scottish, Irish, Both or Neither | 10 Negs that Shook the Quiz | 0 | 10 | 10 | 20
UNC A | UNC B | 10 | 10 | 10 | 30