Question

In 2017, Lee et al. showed that these constructs are equivalent to Gaussian processes in the limit of infinite width. The theoretical foundation for the usefulness of these constructs was established by the universal approximation theorem. Kaiming initialization uses a zero-centered Gaussian distribution with variance 2/n when constructing these constructs. The Adam optimizer can be used on these constructs. Max pooling can be used to reduce the dimensions of feature maps in the (*) convolutional type of these constructs. These constructs are trained by updating their weights via backpropagation using gradient descent. Perceptrons are an early version of these constructs that have no hidden layers. For 10 points, name these constructs used in deep learning and AI, modeled on the biological brain.

ANSWER: artificial neural networks [or convolutional neural networks; prompt on perceptrons before mentioned]
<Leo Law, Other Science>
= Average correct buzz position
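As background for the Kaiming-initialization clue, the "variance 2/n" scheme can be sketched in a few lines of NumPy. This is a minimal illustration, not any library's implementation; the function name `kaiming_init` and the layer sizes are my own choices.

```python
import numpy as np

def kaiming_init(fan_in, fan_out, rng=None):
    """Draw a weight matrix from a zero-centered Gaussian with
    variance 2 / fan_in (He/Kaiming initialization, suited to ReLU nets)."""
    rng = np.random.default_rng(0) if rng is None else rng
    std = np.sqrt(2.0 / fan_in)  # standard deviation = sqrt(2/n), n = fan-in
    return rng.normal(0.0, std, size=(fan_in, fan_out))

W = kaiming_init(512, 256)
print(W.shape)  # (512, 256)
print(W.std())  # empirically close to sqrt(2/512) ≈ 0.0625
```

The fan-in scaling keeps the variance of pre-activations roughly constant from layer to layer, which is why the clue specifies 2/n rather than a fixed variance.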

Buzzes

Player | Team | Opponent | Buzz Position | Value
Cade Reinberger | RIT | Pitt | 22 | 15
Geoffrey Wu | Columbia B | Cornell A | 37 | 15
David Bass | JHU A | Rutgers | 62 | 15
Nathan Sheffield | MIT | Brandeis | 69 | 15
Jacob Hardin-Bernhardt | NYU | Cornell B | 79 | 10
Joy An | Harvard | Tufts | 83 | 10
Isaac Mamel | UMD A | JHU B | 85 | 10
Alex Shi | Swarthmore | John Jay A | 87 | 10
Mason Yu | Brown | Boston College | 122 | 10

Summary

Tournament | Date | Exact Match? | Heard | Conversion Rate | Power Rate | Neg Rate | Average Buzz Position
2023 Penn Bowl (Harvard) | 10/21/2023 | Y | 3 | 100% | 33% | 0% | 91.33
2023 Penn Bowl (Mainsite) | 10/21/2023 | Y | 6 | 100% | 50% | 0% | 62.00
2023 Penn Bowl @ Waterloo | 10/28/2023 | N | 4 | 100% | 50% | 25% | 82.50
2023 Penn Bowl @ FSU | 10/28/2023 | N | 2 | 100% | 0% | 0% | 83.00
2023 Penn Bowl (Norcal) | 10/28/2023 | N | 2 | 100% | 100% | 0% | 52.00
2023 Penn Bowl (South Central) | 10/28/2023 | N | 3 | 100% | 0% | 0% | 86.67
2023 Penn Bowl (UK) | 10/28/2023 | N | 5 | 100% | 0% | 0% | 86.60