Question

In 2017, Lee et al. showed that these constructs are equivalent to Gaussian processes in the limit of infinite width. The theoretical foundation for the usefulness of these constructs was established by the universal approximation theorem. Kaiming initialization uses a zero-centered Gaussian distribution with 2 over n variance when constructing these things. The Adam optimizer can be used on (15[1])these constructs. (-5[1])Max pooling can be used to reduce the dimensions of feature (15[1])maps for the (*) convolutional type of these (10[1])constructs. These constructs are trained by updating their weights via backpropagation using gradient descent. Early versions of these constructs that have no hidden layers are called perceptrons. For 10 points, name these constructs used in deep learning and AI, modeled on the biological brain. ■END■ (10[1])
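Editor's aside (not part of the tossup text): the clues above reference Kaiming initialization, max pooling over feature maps, and training by backpropagation with gradient descent. Below is a minimal NumPy sketch of those three ideas under assumed shapes and names; kaiming_init, max_pool_2x2, and every dimension here are illustrative choices, not details drawn from the question.

import numpy as np

rng = np.random.default_rng(0)

def kaiming_init(n_in, n_out):
    # Zero-centered Gaussian with variance 2 / n_in, as in He et al. (2015).
    return rng.normal(0.0, np.sqrt(2.0 / n_in), size=(n_in, n_out))

def max_pool_2x2(feature_map):
    # Shrink an (H, W) feature map by taking the max over each 2x2 block.
    h, w = feature_map.shape
    trimmed = feature_map[:h - h % 2, :w - w % 2]
    return trimmed.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

# A one-hidden-layer network with ReLU, squared-error loss, and one
# gradient-descent step computed by hand-written backpropagation.
x = rng.normal(size=(4, 8))            # batch of 4 examples, 8 features
y = rng.normal(size=(4, 1))            # regression targets
W1, W2 = kaiming_init(8, 16), kaiming_init(16, 1)

h_pre = x @ W1
h_act = np.maximum(h_pre, 0.0)         # ReLU activation
pred = h_act @ W2
loss = np.mean((pred - y) ** 2)

d_pred = 2.0 * (pred - y) / len(x)     # gradient of the loss w.r.t. predictions
dW2 = h_act.T @ d_pred
d_hidden = (d_pred @ W2.T) * (h_pre > 0.0)   # ReLU passes gradient only where active
dW1 = x.T @ d_hidden

lr = 0.1                               # learning rate for the gradient-descent update
W1 -= lr * dW1
W2 -= lr * dW2

The 2 over n variance in the initializer keeps activation magnitudes roughly constant across ReLU layers, which is what the Kaiming clue alludes to, and the pooling helper shows why max pooling reduces feature-map dimensions: each 2x2 block collapses to a single value.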

ANSWER: artificial neural networks [or convolutional neural networks; prompt on perceptrons before mentioned]
<Leo Law, Other Science>
= Average correct buzz position


Buzzes

Player | Team | Opponent | Buzz Position | Value
Kais Jessa | Library of Babel School of Continuing Studies | Mixed-Affiliated Contingency, Off the Team & Absent: Wong, Adrian | 58 | 15
Sky Li | Toronto Weary | Waterloo Miku | 60 | -5
Liam Kusalik | Waterloo Hatsune | La Clique du Château | 71 | 15
Jason Zhang | Toronto Rofl | Toronto Joy | 78 | 10
Caleb Ott | Waterloo Miku | Toronto Weary | 123 | 10

Summary

Tournament | Date | Exact Match? | Heard | Conversion % | Power % | Neg % | Average Buzz Position
2023 Penn Bowl @ Waterloo | 10/28/2023 | Y | 4 | 100% | 50% | 25% | 82.50
2023 Penn Bowl @ FSU | 10/28/2023 | Y | 2 | 100% | 0% | 0% | 83.00
2023 Penn Bowl (Norcal) | 10/28/2023 | Y | 2 | 100% | 100% | 0% | 52.00
2023 Penn Bowl (South Central) | 10/28/2023 | Y | 3 | 100% | 0% | 0% | 86.67
2023 Penn Bowl (UK) | 10/28/2023 | Y | 5 | 100% | 0% | 0% | 86.60