Can someone explain the main idea behind Hamiltonian Monte Carlo, and in which cases it will yield better results than other Markov chain Monte Carlo (MCMC) methods?
I believe the most up-to-date source on Hamiltonian Monte Carlo, its practical applications, and its comparison to other MCMC methods is Betancourt's 2017 review paper, "A Conceptual Introduction to Hamiltonian Monte Carlo". From the abstract:
The ultimate challenge in estimating probabilistic expectations is quantifying the typical set of the target distribution, a set which concentrates near a complex surface in parameter space. Hamiltonian Monte Carlo generates coherent exploration of smooth target distributions by exploiting the geometry of the typical set. This effective exploration yields not only better computational efficiency than other Markov chain Monte Carlo algorithms, but also stronger guarantees on the validity of the resulting estimators. Moreover, careful analysis
of this geometry facilitates principled strategies for automatically constructing optimal implementations of the method, allowing users to focus their expertise into building better models instead of wrestling with the frustrations of statistical computation. As a result, Hamiltonian Monte Carlo is uniquely suited to tackling the most challenging problems at the frontiers of applied statistics, as demonstrated by the huge success of tools like Stan (Stan Development Team, 2017).
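To make the "coherent exploration" concrete: HMC augments the position variable with a momentum variable, simulates Hamiltonian dynamics with a leapfrog integrator, and accepts or rejects the endpoint with a Metropolis step. Below is a minimal sketch, not taken from the paper, using a standard 2-D Gaussian target as an illustrative assumption (so the gradient of the log-density is simply `-q`); real implementations like Stan add adaptive step sizes and trajectory lengths (NUTS) on top of this skeleton.

```python
import numpy as np

# Illustrative target (an assumption for this sketch): standard 2-D Gaussian,
# log p(q) = -0.5 * q.q, so grad log p(q) = -q.
def log_p(q):
    return -0.5 * np.dot(q, q)

def grad_log_p(q):
    return -q

def leapfrog(q, p, step_size, n_steps):
    """Simulate Hamiltonian dynamics with the leapfrog integrator."""
    q, p = q.copy(), p.copy()
    p += 0.5 * step_size * grad_log_p(q)       # half step for momentum
    for _ in range(n_steps - 1):
        q += step_size * p                     # full step for position
        p += step_size * grad_log_p(q)         # full step for momentum
    q += step_size * p                         # last position step
    p += 0.5 * step_size * grad_log_p(q)       # final half step for momentum
    return q, p

def hmc(n_samples=2000, step_size=0.2, n_steps=20, seed=0):
    rng = np.random.default_rng(seed)
    q = np.zeros(2)
    samples = []
    for _ in range(n_samples):
        p0 = rng.standard_normal(2)            # resample momentum each iteration
        q_new, p_new = leapfrog(q, p0, step_size, n_steps)
        # Metropolis correction on the total energy H = -log p(q) + |p|^2 / 2;
        # it removes the bias from the integrator's discretization error.
        current_H = -log_p(q) + 0.5 * np.dot(p0, p0)
        proposed_H = -log_p(q_new) + 0.5 * np.dot(p_new, p_new)
        if np.log(rng.uniform()) < current_H - proposed_H:
            q = q_new
        samples.append(q.copy())
    return np.array(samples)

samples = hmc()
print("mean:", samples.mean(axis=0), "var:", samples.var(axis=0))
```

The gradient-guided trajectories are what let HMC travel long distances along the typical set per iteration, whereas a random-walk Metropolis proposal of comparable acceptance rate diffuses slowly, which is why HMC tends to win in high dimensions or with strongly correlated parameters.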