In 2017, Lee et al. showed that these constructs are equivalent to Gaussian processes in the limit of infinite width. The theoretical foundation for the usefulness of these constructs was established by the universal approximation theorem. Kaiming initialization draws the weights of these constructs from a zero-centered Gaussian distribution with variance 2 over n. The Adam optimizer can be used to train these constructs. Max pooling can be used to reduce the dimensions of feature maps in the (*) convolutional type of these constructs. These constructs are trained by updating their weights via backpropagation with gradient descent. Early versions of these constructs that have no hidden layers are called perceptrons. For 10 points, name these constructs used in deep learning and AI, modeled on the biological brain. ■END■
ANSWER: artificial neural networks [or convolutional neural networks; prompt on perceptrons before mentioned]
<Leo Law, Other Science>
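The Kaiming initialization clue above can be sketched in a few lines of NumPy. This is a minimal illustration, assuming `n` refers to the number of inputs (fan-in) of a layer; the function name `kaiming_init` and the layer sizes are hypothetical:

```python
import numpy as np

def kaiming_init(n_in, n_out, rng=np.random.default_rng(0)):
    # Kaiming (He) initialization: sample weights from a zero-mean Gaussian
    # with variance 2 / n_in, i.e. standard deviation sqrt(2 / n_in).
    return rng.normal(loc=0.0, scale=np.sqrt(2.0 / n_in), size=(n_in, n_out))

# Hypothetical layer: 512 inputs, 256 outputs.
W = kaiming_init(512, 256)
print(W.shape)  # (512, 256)
# The empirical variance of W should be close to 2 / 512.
print(abs(W.var() - 2.0 / 512) < 5e-4)  # True
```

The variance 2/n keeps the scale of activations roughly constant across layers of ReLU networks, which is why the clue specifies "2 over n" rather than the 1/n used in earlier schemes.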