Sackler Centre for Consciousness Science

Pre-prints

RELAXING THE CONSTRAINTS ON PREDICTIVE CODING MODELS

Predictive coding approximates backprop along arbitrary computation graphs

Millidge, B., Tschantz, A., Buckley, C.

Abstract

Backpropagation is the algorithm used to train the deep neural network architectures of modern machine learning. However, the brain is generally not thought to be able to implement the backpropagation algorithm directly. We show that predictive coding, an influential theory in cortical neuroscience, can approximate the backpropagation algorithm for arbitrary computation graphs using only local learning rules, which could in principle be implemented in neural circuits. We use this result to construct predictive coding variants of standard machine learning architectures.
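The equivalence claimed in the abstract can be illustrated with a toy sketch. The following is a minimal NumPy example (not the authors' code; layer sizes, the tanh nonlinearity, and the learning rate are illustrative assumptions) of predictive coding on a two-layer network under a fixed-prediction scheme: value nodes relax to minimise a sum of squared prediction errors using only locally available quantities, and the converged error on the hidden layer matches the gradient that backpropagation would compute.

```python
import numpy as np

rng = np.random.default_rng(0)

f = np.tanh                              # nonlinearity (illustrative choice)
df = lambda a: 1.0 - np.tanh(a) ** 2     # its derivative

# Random two-layer network and a single input/target pair.
W1 = rng.normal(0.0, 0.5, (4, 3))
W2 = rng.normal(0.0, 0.5, (2, 4))
x = rng.normal(size=3)                   # input, clamps the bottom value node
y = rng.normal(size=2)                   # target, clamps the top value node

# Feedforward sweep; predictions are then held fixed during inference.
mu1 = W1 @ f(x)
mu2 = W2 @ f(mu1)

v1 = mu1.copy()                          # hidden value node starts at its prediction
e2 = y - mu2                             # output error (fixed: both v2 and mu2 are clamped)

# Inference: relax v1 by gradient descent on the energy
# E = 0.5*||v1 - mu1||^2 + 0.5*||y - mu2||^2.
# The update uses only the local error e1 and the error arriving
# from the layer above -- no global backward pass.
eta = 0.1
for _ in range(200):
    e1 = v1 - mu1
    v1 -= eta * (e1 - df(mu1) * (W2.T @ e2))

e1 = v1 - mu1                            # converged hidden-layer prediction error

# Backprop gradients of L = 0.5*||y - mu2||^2, computed by hand for comparison.
delta2 = mu2 - y                         # dL/d(mu2)
delta1 = df(mu1) * (W2.T @ delta2)       # dL/d(mu1)

print(np.max(np.abs(e1 + delta1)))       # e1 converges to -delta1

# Local weight update: dE/dW1 = -e1 f(x)^T, i.e. an outer product of the
# layer's own error and its presynaptic activity.
dW1 = np.outer(e1, f(x))
```

Note that the weight update `dW1` depends only on quantities available at that layer (its error and its input activity), which is the sense in which the learning rule is "local".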