This theory provides a biologically plausible alternative to backpropagation since it's fully local and allows computation and learning to take place simultaneously. For 10 points each:
[10h] Name this theory formalized in a 1999 paper by Rao and Ballard. This theory holds that the brain refines a “mental model” by minimizing some difference measure, like free energy, error, or a “sensory residual.”
ANSWER: predictive coding [or predictive processing]
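A minimal sketch of the “sensory residual” minimization described in the clue, assuming a single linear generative layer; the variable names and learning rates are illustrative and not taken from Rao and Ballard's model.

import numpy as np

# Predictive coding sketch (illustrative): a latent estimate r generates a
# prediction W @ r of the sensory input x, and both r and W are refined
# locally from the prediction error.
rng = np.random.default_rng(0)
x = rng.normal(size=16)             # sensory input
W = rng.normal(size=(16, 4)) * 0.1  # generative weights
r = np.zeros(4)                     # latent "mental model" estimate

for _ in range(200):
    error = x - W @ r               # sensory residual (prediction error)
    r += 0.1 * (W.T @ error)        # inference: refine the latent estimate
    W += 0.01 * np.outer(error, r)  # learning: local, Hebbian-like weight update

print(float(np.mean(error**2)))     # squared residual shrinks as the model improves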
[10m] With R. S. Zemel, this scientist showed that neural networks can learn by using Helmholtz free energy to represent the prediction error. This scientist’s development of the Boltzmann machine earned him half of the 2024 Physics Nobel.
ANSWER: Geoffrey Hinton
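A simplified sketch of the Boltzmann machine's local learning rule, assuming the clamped (data) and free-running (model) unit states have already been sampled; the function name and learning rate are illustrative, not Hinton's implementation.

import numpy as np

# Boltzmann-machine-style update (sketch): weights move by the difference
# between pairwise correlations measured with data clamped and correlations
# sampled from the model, a purely local, Hebbian/anti-Hebbian rule.
def boltzmann_update(W, s_data, s_model, lr=0.01):
    # s_data, s_model: (num_samples, num_units) arrays of +/-1 unit states
    data_corr = s_data.T @ s_data / len(s_data)
    model_corr = s_model.T @ s_model / len(s_model)
    return W + lr * (data_corr - model_corr)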
[10e] Local training algorithms, like those in predictive coding and Boltzmann machines, use Hebbian learning to replicate this property. This property is the brain’s ability to “rewire” itself.
ANSWER: synaptic plasticity [or neuroplasticity; accept Hebbian plasticity]
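A minimal sketch of the kind of Hebbian update these local algorithms rely on (“cells that fire together wire together”); the function and variable names are illustrative.

import numpy as np

# Hebbian learning sketch: the weight between two units is strengthened in
# proportion to the product of pre- and post-synaptic activity, a local rule
# often used to model synaptic plasticity.
def hebbian_step(W, pre, post, lr=0.01):
    return W + lr * np.outer(post, pre)

pre = np.array([1.0, 0.0, 1.0])
post = np.array([1.0, 1.0])
W = hebbian_step(np.zeros((2, 3)), pre, post)  # correlated pairs gain weight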
<FW, Other Science>