# What are alternatives to VC-dimension for measuring the complexity of neural networks?

I have come across some basic ways to measure the complexity of neural networks, such as counting the number of neurons or the number of parameters.

Are there other alternatives?

Ideally, the metric would have the following properties:

• It can measure neural networks from different paradigms (backprop networks, dynamic neural nets, cascade correlation, etc.) on the same scale. For instance, VC-dimension can be applied to different types of networks (or even to things other than neural networks), while the number of neurons is only meaningful between very specific models that share the same activation function, signals (basic sums vs. spikes), and other network properties.
• It has nice correspondences to standard measures of the complexity of functions learnable by the network.
• It is easy to compute on specific networks (this last one is not a must, though).
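To make the first criterion concrete, here is a hypothetical minimal sketch of the kind of crude, architecture-bound metrics (neuron counts, parameter counts) that fail it. The functions and the `layer_sizes` representation are my own illustrative assumptions, not a standard API; they only apply to plain fully connected feedforward nets, which is exactly why such counts cannot compare models across paradigms the way VC-dimension can.

```python
# Two crude complexity measures for a plain feedforward net,
# described only by its layer sizes (inputs, hidden layers, outputs).

def n_neurons(layer_sizes):
    # Count hidden and output units; input units are not counted here.
    return sum(layer_sizes[1:])

def n_parameters(layer_sizes):
    # Weights plus biases for each fully connected layer.
    return sum(i * o + o for i, o in zip(layer_sizes, layer_sizes[1:]))

sizes = [4, 8, 3]           # 4 inputs, one hidden layer of 8, 3 outputs
print(n_neurons(sizes))     # 8 + 3 = 11
print(n_parameters(sizes))  # (4*8 + 8) + (8*3 + 3) = 67
```

Both numbers are trivial to compute (the third criterion) but say nothing about, say, a spiking network of the same size, which is the comparability problem the question is asking about.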

### Notes

This question is based on a more general question on CogSci.SE.