Question

The Lottery Ticket Hypothesis attempts to explain these constructs’ efficiency despite their low VC dimension. Proving that a sequence of these constructs converges to any arbitrary function is the goal of several Universal Approximation Theorems. One type of these constructs passes a 2D “mask” over values to compute an activation map. The LSTM was created to alleviate a problem with these constructs where a value “vanishes” due to repeatedly applying the chain rule. Images can be processed using the “convolutional” type of these constructs. The weights of these constructs are typically updated using backpropagation, which was popularized by 2024 Nobel Laureate Geoffrey Hinton. For 10 points, “deep learning” uses what biologically-inspired computational constructs?
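The “2D mask” clue refers to the convolution step of a convolutional neural network. As a minimal illustrative sketch (not part of the question itself, and using a made-up 3×3 input and 2×2 kernel), sliding the mask over a grid of values and summing elementwise products at each position yields the activation map:

```python
# Sketch of the "2D mask" clue: a valid (no-padding) cross-correlation,
# the operation at the core of a convolutional layer. Pure-Python lists
# are used here instead of a tensor library to keep it self-contained.

def conv2d(image, kernel):
    """Slide `kernel` over `image`; each output entry is the sum of
    elementwise products under the current mask position."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [
        [
            sum(image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw))
            for j in range(out_w)
        ]
        for i in range(out_h)
    ]

# Hypothetical example values, chosen only for illustration.
image = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 9]]
kernel = [[1, 0],
          [0, -1]]  # a simple difference-style mask
print(conv2d(image, kernel))  # → [[-4, -4], [-4, -4]]
```

A 3×3 input convolved with a 2×2 mask gives a 2×2 activation map, since the mask fits in only two positions along each axis.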

ANSWER: artificial neural networks [or deep neural networks or ANNs or DNNs; accept specific types of neural networks such as convolutional neural networks or recurrent neural networks or CNNs or RNNs; accept “The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks”; prompt on classifiers or learners or machine learning models or large language models or LLMs; prompt on artificial neurons; reject “biological neural networks”]
<Other Science>

Buzzes

Player             Team             Opponent          Buzz Position   Value
Geoffrey Wu        Columbia A       Penn B                       33      10
David Bass         Johns Hopkins A  Rutgers A                    65      10
Jasin Cekinmez     Princeton A      Rutgers C                    82      10
Robert Wang        Penn State B     Rowan A                      85      10
Chase Barrick      Lehigh A         Johns Hopkins B              98      -5
Teigue Kelly       Penn State A     Bard A                      107      10
Maximilian Niebur  Johns Hopkins B  Lehigh A                    111      10
Austin Guo         Princeton A      Johns Hopkins B             128      10