Question

Miyato et al. introduced the idea of virtual adversarial examples to reduce this phenomenon. Violations of the one-in-ten rule lead to a higher prevalence of this phenomenon. Michael A. Babyak’s paper “What You See May Not Be What You Get” is a “nontechnical” introduction to this phenomenon, which unintuitively occurs only at intermediate model sizes in double descent. The 2k term in the Akaike information criterion adjusts for this phenomenon. Elastic net reduces this phenomenon by combining the (*) L1 and L2 norm penalties from LASSO and ridge regression. Early stopping is one of several regularization techniques used to mitigate this phenomenon, which is exhibited by models with very low bias and very high variance. For 10 points, name this phenomenon where a model has too many parameters, causing test error to be much larger than training error. ■END■
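For reference, the elastic-net and AIC clues can be made concrete in a few lines of Python. This is a minimal illustrative sketch only, assuming numpy and scikit-learn are available; the synthetic data, the degree-15 polynomial, and the penalty strengths are arbitrary choices for demonstration, not anything drawn from the question itself.

```python
# Illustrative sketch of the clues: an overparameterized model overfits
# (training error far below test error), and an elastic-net penalty
# (L1 + L2, per the LASSO/ridge clue) narrows the gap.
import numpy as np
from sklearn.linear_model import ElasticNet, LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=(40, 1))
y = np.sin(3 * x).ravel() + rng.normal(scale=0.3, size=40)
x_train, x_test, y_train, y_test = x[:25], x[25:], y[:25], y[25:]

# Degree-15 polynomial: many parameters, very low bias, very high variance.
overfit = make_pipeline(PolynomialFeatures(degree=15), LinearRegression())
overfit.fit(x_train, y_train)

# Same features with an elastic-net penalty mixing the L1 and L2 norms.
regularized = make_pipeline(
    PolynomialFeatures(degree=15),
    ElasticNet(alpha=0.01, l1_ratio=0.5, max_iter=100_000),
)
regularized.fit(x_train, y_train)

for name, model in [("unregularized", overfit), ("elastic net", regularized)]:
    train_mse = mean_squared_error(y_train, model.predict(x_train))
    test_mse = mean_squared_error(y_test, model.predict(x_test))
    print(f"{name}: train MSE = {train_mse:.3f}, test MSE = {test_mse:.3f}")

# The AIC clue: AIC = 2k - 2*ln(L-hat). For least squares with Gaussian
# errors this reduces (up to an additive constant) to n*ln(RSS/n) + 2k,
# so the 2k term directly penalizes parameter count.
rss = np.sum((y_train - overfit.predict(x_train)) ** 2)
n, k = len(y_train), 16  # a degree-15 polynomial fit has 16 coefficients
aic = n * np.log(rss / n) + 2 * k
print(f"AIC (Gaussian errors, up to a constant) = {aic:.1f}")
```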

ANSWER: overfitting [prompt on generalization error; prompt on bias–variance tradeoff before “variance”; prompt on answers describing a high number of variables/parameters or high complexity with “which causes what phenomenon?”]
<Science - Other Science - Math>


Buzzes

Player | Team | Opponent | Buzz Position | Value
Eric Chen | Where are the ACF Nationals recordings? | Cry of the Common Loon | 29 | 15
Tim Morrison | Stanford+ | number of tang poems = 75 times number of lines in a shi = 100 times number of lines in a haiku | 33 | 15
Eve Fleisig | Berkeley | A is for Amy Robsart who fell down the stairs | 57 | -5
Ian Tullis | A is for Amy Robsart who fell down the stairs | Berkeley | 98 | 10

Summary

Tournament | Date | Exact Match? | Heard | Conversion % | Power % | Neg % | Average Buzz Position
2024 ARGOS @ Stanford | 02/22/2025 | Y | 3 | 100% | 67% | 33% | 53.33
2024 ARGOS @ Chicago | 11/23/2024 | Y | 6 | 100% | 17% | 50% | 103.33
2024 ARGOS @ Columbia | 11/23/2024 | Y | 3 | 100% | 67% | 0% | 71.67
2024 ARGOS @ Christ's College | 12/14/2024 | Y | 3 | 100% | 0% | 33% | 117.67