Question
The channel capacity is the maximum of this quantity between a channel's input and output, taken over all input distributions. For 10 points each:
[10h] Identify this quantity that measures how much observing one random variable informs you about another random variable. For two random variables X and Y, this quantity equals the sum of p(X,Y) ["p-of-x-y"] times the log of p(X,Y) over p(X) ["p-of-x"] times p(Y) ["p-of-y"].
ANSWER: mutual information [prompt on information]
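The formula in the hard part can be sketched numerically. A minimal example, using a hypothetical joint distribution over two binary variables (chosen for illustration, not taken from the question):

```python
import math

# Hypothetical joint distribution p(X, Y) over two binary variables.
p_xy = {(0, 0): 0.4, (0, 1): 0.1,
        (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(X) and p(Y).
p_x = {x: sum(p for (xi, _), p in p_xy.items() if xi == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yi), p in p_xy.items() if yi == y) for y in (0, 1)}

def mutual_information(p_xy, p_x, p_y):
    """I(X;Y) = sum over (x,y) of p(x,y) * log2( p(x,y) / (p(x) p(y)) ), in bits."""
    return sum(p * math.log2(p / (p_x[x] * p_y[y]))
               for (x, y), p in p_xy.items() if p > 0)

print(round(mutual_information(p_xy, p_x, p_y), 4))  # → 0.2781
```

Base-2 logarithms give the answer in bits, matching the unit named in the easy part.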
[10m] Mutual information is shown to be non-negative using this statement, which says that for a convex function phi, phi of the expectation value of X is less than or equal to the expectation value of phi of X.
ANSWER: Jensen’s inequality [or Jensen inequality]
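The statement in the middle part can be checked directly for a convex function such as phi(x) = x squared, over a small hand-picked distribution (the distribution below is illustrative, not from the question):

```python
# Verify Jensen's inequality phi(E[X]) <= E[phi(X)] for the convex
# function phi(x) = x**2 over a uniform distribution on {1, 2, 3, 4}.
phi = lambda x: x * x
xs = [1, 2, 3, 4]
probs = [0.25] * 4

ex = sum(p * x for p, x in zip(probs, xs))        # E[X] = 2.5
lhs = phi(ex)                                     # phi(E[X]) = 6.25
rhs = sum(p * phi(x) for p, x in zip(probs, xs))  # E[phi(X)] = 7.5
assert lhs <= rhs
```

Applying the inequality to the convex function -log (equivalently, the concave log) is what yields the non-negativity of mutual information mentioned in the clue.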
[10e] Two answers required. The most commonly used unit for mutual information is the bit, the entropy of a variable that is equally likely to be either of these two numbers. Binary representation uses these two digits to express any number.
ANSWER: 0 and 1
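The easy part's claim, that one bit is the entropy of a fair choice between 0 and 1, follows from the standard Shannon entropy formula; a minimal sketch:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum of p * log2(p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A variable equally likely to be 0 or 1 carries exactly one bit.
print(entropy([0.5, 0.5]))  # → 1.0
```

Any bias away from the fair 50/50 split lowers the entropy below one bit, e.g. entropy([0.9, 0.1]) is about 0.47.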
<Leo Law, Other Science>
Summary
Tournament | Date | Heard? | Times Heard | PPB | Easy % | Medium % | Hard %
2023 Penn Bowl @ Waterloo | 10/28/2023 | Y | 1 | 20.00 | 100% | 100% | 0% |
2023 Penn Bowl (Harvard) | 10/21/2023 | Y | 1 | 10.00 | 100% | 0% | 0% |
2023 Penn Bowl (UK) | 10/28/2023 | Y | 5 | 20.00 | 100% | 80% | 20% |
2023 Penn Bowl @ UNC | 10/28/2023 | Y | 1 | 30.00 | 100% | 100% | 100% |
Data
Team | Opponent | Part 1 | Part 2 | Part 3 | Total
MIT | Brown | 0 | 0 | 10 | 10 |