Do neural networks use efficient coding?

My question concerns the relationship between the efficient coding hypothesis, as outlined on the Wikipedia page on efficient coding, and neural network learning algorithms.

What is the relationship between the efficient coding hypothesis and neural networks?

Are there any neural network models explicitly inspired by the efficient coding hypothesis?

Or would it be fairer to say that all neural network learning algorithms are at least implicitly based on efficient coding?


I believe one can argue that a connection has been made. I apologize for not posting my source, as I couldn't find it, but this came from an old slide that Hinton presented. In it, he claimed that a fundamental way of thinking for those who do machine learning (the presentation predated the common use of the term deep learning) is that there exists an optimal transformation of the data such that the data can be easily learned. I believe that for neural nets, learning this 'optimal transformation' of the data through backprop IS the efficient coding hypothesis in action. In the same way that, given a proper kernel, many spaces can be easily classified with linear models, learning the proper way to transform and store the data IS analogous to deciding how the neurons should be arranged to represent the data.
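To make the "learned transformation as efficient coding" idea concrete, here is a minimal sketch (my own illustration, not from Hinton's slide): a tiny linear autoencoder trained by backprop to find a compact one-dimensional code for redundant two-dimensional data. All names, dimensions, and hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 2-D points that actually lie near a 1-D line, i.e. a redundant code.
x = rng.normal(size=(200, 1)) @ np.array([[1.0, 0.5]])   # shape (200, 2)
x += 0.05 * rng.normal(size=x.shape)                     # small noise

# Encoder/decoder weights: compress 2 dims down to a 1-dim code.
W_enc = rng.normal(scale=0.1, size=(2, 1))
W_dec = rng.normal(scale=0.1, size=(1, 2))

lr = 0.05
for step in range(2000):
    code = x @ W_enc        # encode: the learned transformation of the data
    x_hat = code @ W_dec    # decode back into the input space
    err = x_hat - x         # reconstruction error
    # Gradients of the mean squared reconstruction loss w.r.t. both matrices.
    g_dec = code.T @ err / len(x)
    g_enc = x.T @ (err @ W_dec.T) / len(x)
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc

loss = np.mean((x @ W_enc @ W_dec - x) ** 2)
print(loss)
```

The point of the sketch is that backprop, driven only by reconstruction error, discovers a code that strips the redundancy out of the input, which is the flavor of "efficient coding" the answer is gesturing at.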

Source: Link, Question Author: Mike NZ, Answer Author: Anonymous Emu
