In climate modelling, you're looking for models that can adequately portray the Earth's climate. This includes reproducing semi-cyclical patterns such as the El Niño Southern Oscillation. But model verification generally occurs over relatively short time periods for which there is decent observational data (the last ~150 years). This means that your model could be displaying the right patterns but be out of phase, so that linear comparisons, like correlation, will not pick up that the model is performing well.

Discrete Fourier transforms (DFTs) are commonly used to analyse climate data (here's an example) in order to pick up such cyclic patterns. Is there any standard measure of the similarity of two DFTs that could be used as a verification tool (i.e. a comparison between the DFT of the model output and the DFT of the observations)?

Would it make sense to take the integral of the minimum of the two area-normalised DFTs (using absolute real values)? I think this would result in a score $x\in[0,1]$, where $x=1$ implies exactly the same patterns, and $x=0$ implies totally different patterns. What might the drawbacks of such a method be?
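For concreteness, the proposed overlap score might look like the following sketch in Python (`spectrum_overlap` is a hypothetical helper, and the example series are made up; this is one reading of "integral of the minimum of the two area-normalised DFTs", not an established statistic):

```python
import numpy as np

def spectrum_overlap(x, y):
    """Overlap score in [0, 1] between the area-normalised
    amplitude spectra of two equal-length series (hypothetical helper)."""
    # Magnitude of the DFT (rfft suffices for real-valued series)
    sx = np.abs(np.fft.rfft(x))
    sy = np.abs(np.fft.rfft(y))
    # Normalise each spectrum so it sums (integrates) to 1
    sx /= sx.sum()
    sy /= sy.sum()
    # Integral of the pointwise minimum: 1 = identical, 0 = disjoint spectra
    return float(np.minimum(sx, sy).sum())

# A series compared with a phase-shifted copy of itself scores ~1,
# since the amplitude spectrum is insensitive to phase.
t = np.linspace(0, 100, 1000, endpoint=False)
a = np.sin(2 * np.pi * 0.2 * t)
b = np.sin(2 * np.pi * 0.2 * t + 1.5)   # same frequency, out of phase
print(round(spectrum_overlap(a, b), 2))  # → 1.0
```

Note that such a score deliberately ignores phase, which is exactly the property asked for above, but it also discards any information about whether the cycles are aligned at all.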

**Answer**

Spectral coherence, if used correctly, would do it. Coherence is computed at each frequency and hence is a vector, so a weighted sum of the coherences would be a good measure. You would typically want to weight the coherences at frequencies that have high energy in the power spectral density. That way, you measure similarity at the frequencies that dominate the time series, instead of giving a large weight to the coherence at a frequency whose content in the time series is negligible.

So, in simple words: the basic idea is to find the frequencies at which the amplitude (energy) of the signals is high (interpret these as the frequencies that dominantly constitute each signal), then compare the similarities at these frequencies with a higher weight, and compare the signals at the rest of the frequencies with a lower weight.
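A minimal sketch of this weighting scheme, in Python rather than R (using SciPy's `coherence` and `welch` estimators; weighting by the mean of the two power spectral densities is one reasonable choice, not a standard named statistic):

```python
import numpy as np
from scipy import signal

def weighted_coherence(x, y, fs=1.0, nperseg=256):
    """PSD-weighted mean coherence between two series, in [0, 1]
    (a sketch of the idea above, not a standard named measure)."""
    f, cxy = signal.coherence(x, y, fs=fs, nperseg=nperseg)
    _, pxx = signal.welch(x, fs=fs, nperseg=nperseg)
    _, pyy = signal.welch(y, fs=fs, nperseg=nperseg)
    # Weight each frequency by the mean power of the two series there,
    # so dominant frequencies contribute most and negligible ones little.
    w = (pxx + pyy) / 2
    w /= w.sum()
    return float(np.sum(w * cxy))

rng = np.random.default_rng(0)
t = np.arange(4096) / 100.0
x = np.sin(2 * np.pi * 3 * t) + 0.3 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 3 * t + 0.8) + 0.3 * rng.standard_normal(t.size)
# High score: shared dominant 3 Hz cycle, despite the phase offset
print(weighted_coherence(x, y, fs=100.0))
```

Two independent noise series would instead score close to zero, since no dominant frequency is shared.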

The area which deals with questions of this kind is called cross-spectral analysis.

http://www.atmos.washington.edu/~dennis/552_Notes_6c.pdf is an excellent introduction to cross-spectral analysis.

Optimal Lag:

Also look at my answer over here: How to correlate two time series, with possible time differences

This deals with finding the optimal lag using the spectral coherence. R has functions to compute power spectral densities, auto- and cross-correlations, Fourier transforms and coherence. You have to write code to find the optimal lag that gives the maximum weighted coherence. Likewise, code to weight the coherence vector by the spectral density must also be written. You can then sum up the weighted elements and average them to get the similarity observed at the optimal lag.
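The lag search itself can be sketched with plain cross-correlation, which is the simplest version of the idea (`optimal_lag` is a hypothetical helper, shown in Python rather than R; the answer's full method would repeat the weighted-coherence computation at each candidate lag):

```python
import numpy as np
from scipy import signal

def optimal_lag(x, y):
    """Lag (in samples) by which y trails x, found by maximising the
    cross-correlation of the standardised series (hypothetical helper)."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    xc = signal.correlate(y, x, mode="full")
    lags = signal.correlation_lags(y.size, x.size, mode="full")
    return int(lags[np.argmax(xc)])

t = np.arange(500)
x = np.sin(2 * np.pi * t / 50)
y = np.roll(x, 7)          # y is x delayed by 7 samples
print(optimal_lag(x, y))   # → 7
```

Once the lag is found, the two series can be aligned before computing the weighted coherence, so that the similarity score reflects the shared cycles rather than the phase offset.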

**Attribution**
*Source : Link , Question Author : naught101 , Answer Author : Community*