I was reading the deep learning book by Ian Goodfellow and Aaron Courville. In the description of DBNs, they say DBNs have fallen out of favor and are rarely used:
Deep belief networks demonstrated that deep architectures can be
successful, by outperforming kernelized support vector machines on the
MNIST dataset (Hinton et al., 2006). Today, deep belief networks
have mostly fallen out of favor and are rarely used, even compared to
other unsupervised or generative learning algorithms, but they are
still deservedly recognized for their important role in deep learning.
I don’t understand why.
Remember that backpropagation used to come with one big problem: the vanishing gradient. I think the main reason why deep belief networks are rarely used today is that backpropagation combined with ReLU (Rectified Linear Unit) activations solves the vanishing gradient problem, so it is no longer an issue and you don't need a DBN's greedy layer-wise pretraining to get gradients through a deep stack.
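To see why ReLU helps, here is a minimal sketch (my own illustration, not from the book) comparing how a backpropagated gradient shrinks through stacked sigmoid layers versus ReLU layers. It only multiplies the activation derivatives together, which is the dominant factor in the chain rule through a deep stack:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv(x):
    # Sigmoid's derivative is at most 0.25, so each layer shrinks the gradient.
    return sigmoid(x) * (1.0 - sigmoid(x))

def relu_deriv(x):
    # ReLU's derivative is exactly 1 for active units, so the gradient passes through.
    return 1.0 if x > 0 else 0.0

def grad_through_layers(deriv, n_layers, x=0.5):
    # Product of per-layer activation derivatives at a fixed pre-activation x.
    g = 1.0
    for _ in range(n_layers):
        g *= deriv(x)
    return g

print(grad_through_layers(sigmoid_deriv, 20))  # tiny: the gradient vanishes
print(grad_through_layers(relu_deriv, 20))     # 1.0: the gradient survives
```

After 20 sigmoid layers the gradient factor is vanishingly small, while through 20 active ReLU units it is unchanged, which is why deep nets with ReLU can be trained directly by backpropagation.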
The second reason is that, even though a DBN could solve the same problem, large deep architectures become much more complex to train that way: you have to greedily pretrain each layer as an RBM before fine-tuning the whole network. Using backpropagation with ReLU, you can train the entire network end to end in one shot.
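As a concrete illustration of "one shot" training (a toy sketch of my own, not from the book), here is a small 3-layer ReLU network fitted end to end with plain backpropagation on the trivial task y = x1 + x2 — no layer-wise pretraining involved:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: learn y = x1 + x2 from random inputs.
X = rng.normal(size=(256, 2))
y = X.sum(axis=1, keepdims=True)

# A small 3-layer ReLU network, initialized randomly.
W1 = rng.normal(scale=0.5, size=(2, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 16)); b2 = np.zeros(16)
W3 = rng.normal(scale=0.5, size=(16, 1)); b3 = np.zeros(1)

lr = 0.01
losses = []
for step in range(500):
    # Forward pass through the whole stack.
    h1 = np.maximum(0, X @ W1 + b1)
    h2 = np.maximum(0, h1 @ W2 + b2)
    out = h2 @ W3 + b3
    losses.append(np.mean((out - y) ** 2))

    # Backward pass: gradients flow through all layers in one sweep.
    d_out = 2 * (out - y) / len(X)
    dW3 = h2.T @ d_out; db3 = d_out.sum(axis=0)
    d_h2 = (d_out @ W3.T) * (h2 > 0)   # ReLU gate: 1 where active, 0 elsewhere
    dW2 = h1.T @ d_h2; db2 = d_h2.sum(axis=0)
    d_h1 = (d_h2 @ W2.T) * (h1 > 0)
    dW1 = X.T @ d_h1; db1 = d_h1.sum(axis=0)

    # Gradient descent update on every layer simultaneously.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
    W3 -= lr * dW3; b3 -= lr * db3

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

All layers are updated jointly at every step; with a DBN you would instead train each layer as an RBM, stack them, and only then fine-tune — considerably more machinery for the same result.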