Question

In 2017, Lee et al. showed that these constructs are equivalent to Gaussian processes in the limit of infinite width. The theoretical foundation for the usefulness of these constructs was established by the universal approximation theorem. Kaiming initialization uses a zero-centered Gaussian distribution with variance 2 over n when constructing these things. The Adam optimizer can be used on these constructs. Max pooling can be used to reduce the dimensions of feature maps for the (*) convolutional type of these constructs. These constructs are trained by updating their weights via backpropagation using gradient descent. Early versions of these constructs that have no hidden layers are called perceptrons. For 10 points, name these constructs used in deep learning and AI, modeled on the biological brain. ■END■

ANSWER: artificial neural networks [or convolutional neural networks; prompt on perceptrons before mentioned]
<Leo Law, Other Science>
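
The clues above reference standard deep-learning machinery. As a minimal illustrative sketch (not part of the question), the snippet below uses PyTorch, an assumed library choice, to show the named ideas together: Kaiming/He initialization drawing weights from a zero-mean Gaussian with variance 2/n (n = fan-in), max pooling shrinking a convolutional feature map, and one backpropagation step driven by the Adam optimizer. The TinyConvNet name, layer sizes, and 28x28 dummy inputs are hypothetical choices for illustration only.

import torch
import torch.nn as nn

class TinyConvNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 8, kernel_size=3, padding=1)  # convolutional layer
        self.pool = nn.MaxPool2d(2)          # max pooling halves each spatial dimension
        self.fc = nn.Linear(8 * 14 * 14, 10) # fully connected output layer
        for m in (self.conv, self.fc):
            # Kaiming/He init: zero-mean Gaussian with variance 2 / fan_in
            nn.init.kaiming_normal_(m.weight, mode="fan_in", nonlinearity="relu")
            nn.init.zeros_(m.bias)

    def forward(self, x):
        x = self.pool(torch.relu(self.conv(x)))
        return self.fc(x.flatten(1))

net = TinyConvNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)  # Adam optimizer
x = torch.randn(4, 1, 28, 28)                      # dummy batch (assumed 28x28 inputs)
y = torch.randint(0, 10, (4,))                     # dummy labels
loss = nn.functional.cross_entropy(net(x), y)
loss.backward()   # backpropagation computes gradients of the loss w.r.t. the weights
opt.step()        # gradient-based weight update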


Buzzes

Player          Team                          Opponent                      Buzz Position   Value
Omer Keskin     Betrayed by Rita Izzatdust    3HK1MM                        78              10
Michael Kohn    Foucault's Penndulum          Four Neg Omelette             79              10
Maxwell Ye      Tabearnacle                   Old London Town               79              10
Parth Jagtap    Edinburgh                     Yes, Moderator                92              10
Daoud Jackson   Broken Hearts                 10 Negs that Shook the Quiz   105             10

Summary

Tournament                       Date         Exact Match?   Heard   Conversion %   Power %   Neg %   Average Buzz
2023 Penn Bowl @ Waterloo        10/28/2023   Y              4       100%           50%       25%     82.50
2023 Penn Bowl @ FSU             10/28/2023   Y              2       100%           0%        0%      83.00
2023 Penn Bowl (Norcal)          10/28/2023   Y              2       100%           100%      0%      52.00
2023 Penn Bowl (South Central)   10/28/2023   Y              3       100%           0%        0%      86.67
2023 Penn Bowl (UK)              10/28/2023   Y              5       100%           0%        0%      86.60