Question

In 2017, Lee et al. showed that these constructs are equivalent to Gaussian processes in the limit of infinite width. The theoretical foundation for the usefulness of these constructs was established by the universal approximation theorem. Kaiming initialization draws weights from a zero-centered Gaussian distribution with variance 2/n when constructing these constructs. The Adam optimizer can be used on these constructs. Max pooling can be used to reduce the dimensions of feature maps in the (*) convolutional type of these constructs. These constructs are trained by updating their weights via backpropagation using gradient descent. Perceptrons are an early version of these constructs that have no hidden layers. For 10 points, name these constructs used in deep learning and AI, modeled on the biological brain. ■END■
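As an aside for readers unfamiliar with the clues: the two middle clues (Kaiming initialization and max pooling) can be sketched in a few lines of NumPy. This is an illustrative sketch, not part of the question; the function names `kaiming_init` and `max_pool_2x2` are my own.

```python
import numpy as np

rng = np.random.default_rng(0)

# Kaiming (He) initialization: zero-centered Gaussian with variance 2/n,
# where n is the number of input units feeding the layer.
def kaiming_init(n_in, n_out):
    return rng.normal(0.0, np.sqrt(2.0 / n_in), size=(n_in, n_out))

# 2x2 max pooling with stride 2: halves each spatial dimension of a
# feature map by keeping the maximum of each 2x2 window.
def max_pool_2x2(x):
    h, w = x.shape
    x = x[:h - h % 2, :w - w % 2]  # trim odd edges so the reshape is exact
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

W = kaiming_init(512, 256)       # weights for a layer with 512 inputs
fmap = rng.normal(size=(8, 8))   # toy single-channel feature map
pooled = max_pool_2x2(fmap)      # shape (4, 4)
```

The empirical standard deviation of `W` comes out near sqrt(2/512) = 0.0625, matching the "2 over n variance" clue.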

ANSWER: artificial neural networks [or convolutional neural networks; prompt on perceptrons before mentioned]
<Leo Law, Other Science>
(Last column of the summary below: average correct buzz position.)


Buzzes

Player                  Team        Opponent    Buzz Position  Value
Cade Reinberger         RIT         Pitt        22             15
Geoffrey Wu             Columbia B  Cornell A   37             15
David Bass              JHU A       Rutgers     62             15
Jacob Hardin-Bernhardt  NYU         Cornell B   79             10
Isaac Mamel             UMD A       JHU B       85             10
Alex Shi                Swarthmore  John Jay A  87             10

Summary

Tournament                 Date        Exact?  Heard  Conv.  Power  Neg  Avg. Buzz
2023 Penn Bowl (Harvard)   10/21/2023  Y       3      100%   33%    0%   91.33
2023 Penn Bowl (Mainsite)  10/21/2023  Y       6      100%   50%    0%   62.00