Question
Miyato et al. introduced the idea of virtual adversarial examples to reduce this phenomenon. Violations of the one-in-ten rule lead to higher prevalence of this phenomenon. Michael A. Babyak’s paper “What You See May Not Be What You Get” is a “nontechnical” introduction to this phenomenon, which unintuitively occurs only at intermediate model size in double descent. The 2k term in the Akaike information criterion adjusts for this phenomenon. Elastic net reduces this phenomenon by combining the (*) L1 and L2 norm penalties from LASSO and ridge regression. Early stopping is an example of a regularization technique used to mitigate this phenomenon, which is exhibited by models with very low bias and very high variance. For 10 points, name this phenomenon in which a model has too many parameters, causing test error to be much larger than training error. ■END■
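(Answer: overfitting.) The clues above can be illustrated with a minimal sketch: an overparameterized polynomial fit overfits noisy data, and an L2 (ridge) penalty of the kind the elastic net combines with L1 shrinks the coefficients. All names and data below are hypothetical toy choices, not from any cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: fit a degree-9 polynomial to 12 noisy samples
# of sin(2*pi*x) -- more parameters than the data can reliably support,
# the classic overfitting regime (low bias, high variance).
x = rng.uniform(0.0, 1.0, size=12)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(12)
X = np.vander(x, N=10)  # columns x^9, x^8, ..., x^0

def ridge_fit(X, y, lam):
    """Closed-form ridge regression: solve (X^T X + lam*I) w = X^T y.
    lam = 0 recovers ordinary least squares (no regularization)."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

w_ols = ridge_fit(X, y, 0.0)     # unregularized: wildly large coefficients
w_ridge = ridge_fit(X, y, 1e-3)  # L2 penalty shrinks the coefficient norm

# The penalty strictly shrinks the solution's norm.
print(np.linalg.norm(w_ols) > np.linalg.norm(w_ridge))
```

Adding an L1 term to the same objective would give the elastic net mentioned in the question; early stopping achieves a similar shrinkage effect implicitly by halting gradient descent before the coefficients grow large.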
Buzzes
Player | Team | Opponent | Buzz Position | Value
---|---|---|---|---
Natan Holtzman | throw away your cards, rally in the streets | Aw we're so sorry to hear that maman died today, she gets five big booms | 67 | 15
Steven Yuan | CLEVELAND, THIS IS FOR YOU! | I wish it were possible to freeze time so I would never have to watch you retire | 87 | 10
John Chen | UBC | Thompson et al. | 111 | -5
Joel Miles | Thompson et al. | UBC | 135 | 10