Question

Miyato et al. introduced the idea of virtual adversarial examples to reduce this phenomenon. Violations of the one-in-ten rule lead to a higher prevalence of this phenomenon. Michael A. Babyak’s paper “What You See May Not Be What You Get” is a “nontechnical” introduction to this phenomenon, which unintuitively occurs only at intermediate model sizes in double descent. The 2k term in the Akaike (15[1]) information criterion adjusts for this phenomenon. Elastic net reduces this phenomenon by combining the (*) L1 and L2 (10[1]) norm penalties from LASSO and ridge regression. Early stopping is one of the regularization (-5[1]) techniques used to mitigate this phenomenon, which is exhibited by models with very low bias and very high variance. For 10 points, name this phenomenon where a model has too many parameters, causing test error to be much larger than training error. ■END■ (10[1])
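The giveaway clues are concrete enough to illustrate in code. Below is a minimal sketch (not from the question or any source it cites) of the train/test gap described in the giveaway and the elastic net remedy named in the power clue; the dataset, the polynomial degree, and the alpha and l1_ratio values are illustrative assumptions. For the earlier clue, the criterion is AIC = 2k − 2 ln L̂, where the 2k term penalizes the parameter count k.

```python
# Minimal sketch (illustrative assumptions throughout): a degree-15 polynomial
# fit to 30 noisy points overfits, and an elastic net penalty shrinks the gap.
import numpy as np
from sklearn.linear_model import ElasticNet, LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(40, 1))
y = np.sin(3 * X[:, 0]) + rng.normal(scale=0.3, size=40)  # noisy 1-D target
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Unpenalized high-degree fit: many parameters, very low bias, very high
# variance, so training error lands far below test error.
overfit = make_pipeline(PolynomialFeatures(degree=15), LinearRegression())
overfit.fit(X_train, y_train)

# Elastic net mixes the L1 (LASSO) and L2 (ridge) penalties via l1_ratio;
# alpha=0.05 and l1_ratio=0.5 are arbitrary illustrative choices.
regularized = make_pipeline(
    PolynomialFeatures(degree=15),
    StandardScaler(),
    ElasticNet(alpha=0.05, l1_ratio=0.5, max_iter=50_000),
)
regularized.fit(X_train, y_train)

for name, model in [("unregularized", overfit), ("elastic net", regularized)]:
    train_mse = mean_squared_error(y_train, model.predict(X_train))
    test_mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"{name:>13}: train MSE = {train_mse:.3f}, test MSE = {test_mse:.3f}")
```

Running this typically prints a training MSE far below the test MSE for the unregularized fit, with the elastic net closing much of that gap.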

ANSWER: overfitting [prompt on generalization error; prompt on bias–variance tradeoff before “variance”; prompt on answers describing a high number of variables/parameters or high complexity with “which causes what phenomenon?”]
<Science - Other Science - Math>


Buzzes

Player              Team                    Opponent                                                      Buzz Position  Value
Jaimie Carlson      Banned from ARGOS       "Powers a question on Stancyzk" that's a clown question bro  61             15
Aditya Gangrade     Pahkin' the Ahgo        |madam|                                                      78             10
Walter Zhang        Import Pandas           Hu up Jinning they Tao                                        92             -5
Jonathan Schnipper  Hu up Jinning they Tao  Import Pandas                                                 135            10

Summary

Tournament             Date        Exact Match?  Heard  Conversion %  Power %  Neg %  Avg. correct buzz position
2024 ARGOS @ Brandeis  03/22/2025  Y             3      100%          33%      33%    91.33
2024 ARGOS Online      03/22/2025  Y             3      100%          33%      33%    96.33
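For reference, the final column is consistent with the mean of the correct buzz positions listed above: (61 + 78 + 135) / 3 ≈ 91.33, matching the Brandeis row.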