Test to distinguish periodic from almost periodic data

Suppose I have some unknown function f with domain ℝ, which I know to fulfill some reasonable conditions like continuity. I know the exact values of f (because the data comes from a simulation) at some equidistant sampling points t_i = t_0 + iΔt with i∈\{1,…,n\}, which I can assume to be sufficiently fine to capture all relevant aspects of f; e.g., I can assume that there is at most one local extremum of f between two adjacent sampling points. I am looking for a test that tells me whether my data complies with f being exactly periodic, i.e., ∃τ: f(t+τ)=f(t) \,∀\,t, with the period length being somewhat reasonable, for example Δt < τ < n·Δt (but it is conceivable that I could impose stronger constraints if needed).

From another point of view, I have data {x_1, …, x_n} and am looking for a test that answers the question of whether a periodic function f (fulfilling conditions as above) exists such that f(t_i)=x_i ∀ i.

The important point is that f is at least very close to periodic (it could be, for example, f(t) := \sin(g(t)·t) or f(t) := g(t)·\sin(t) with g'(t) ≪ g(t_0)/Δt), to the extent that changing one data point by a small amount may suffice to make the data comply with f being exactly periodic. Thus standard tools for frequency analysis, such as the Fourier transform or analysing zero crossings, will not help much.
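
For illustration, here is a minimal sketch of what such data can look like, with a hypothetical slowly varying g (the concrete numbers are arbitrary); at the sampling resolution, the almost periodic signal is barely distinguishable from an exactly periodic one:

```python
import numpy as np

# Hypothetical example values, chosen only for illustration.
t_0, dt, n = 0.0, 0.05, 2000
t = t_0 + dt * np.arange(1, n + 1)            # sampling points t_i = t_0 + i*dt

f_periodic = np.sin(t)                        # exactly periodic, tau = 2*pi

g = 1.0 + 1e-4 * t                            # slowly varying: g'(t) << g(t_0)/dt
f_almost = g * np.sin(t)                      # almost periodic: f(t) = g(t)*sin(t)

# The two samples differ only by about 1e-2 while the signal itself is of
# order 1, so Fourier analysis or zero crossings will hardly tell them apart.
print(np.max(np.abs(f_almost - f_periodic)))
```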

Note that the test I am looking for will likely not be probabilistic.

I have some ideas of how to design such a test myself, but I want to avoid reinventing the wheel, so I am looking for an existing test.

Answer

As I said, I had an idea of how to do this, which I realised, refined, and wrote a paper about, which is now published: Chaos 25, 113106 (2015); a preprint is available on arXiv.

The investigated criterion is almost the same as the one sketched in the question: Given data x_1, \ldots, x_n sampled at the time points t_0 + Δt, \ldots, t_0 + nΔt, the test decides whether there is a function f: [t_0 + Δt, t_0 + nΔt] → ℝ and a τ ∈ [2Δt,(n-1)Δt] such that the following hold (a much-simplified code sketch follows the list):

  • f(t_0 + iΔt)=x_i\quad \forall i∈\{1,…,n\}
  • f(t+τ)=f(t) \quad ∀ t∈[t_0 + Δt, t_0 + nΔt - τ]
  • f has no more local extrema than the sequence x_1, …, x_n, with the possible exception of at most one extremum close to the beginning and one close to the end of f's domain.
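
For concreteness, here is a much-simplified sketch of a related but far weaker check (not the actual test from the paper): it only considers periods τ that are integer multiples of Δt and ignores the condition on local extrema, whereas handling periods that fall between sampling points is exactly the hard part the test addresses:

```python
import numpy as np

def integer_period(x, dt, atol=1e-12):
    """Weak check: return tau = k*dt if x[i+k] == x[i] (within atol)
    for all valid i and some 2 <= k <= n-1, else None.  Unlike the
    actual test, this only covers periods that are multiples of the
    sampling interval and ignores the extrema condition."""
    x = np.asarray(x)
    n = len(x)
    for k in range(2, n):                      # tau = k*dt in [2*dt, (n-1)*dt]
        if np.all(np.abs(x[k:] - x[:-k]) <= atol):
            return k * dt
    return None

# Example: a signal sampled exactly ten points per period.
x = np.sin(2 * np.pi * np.arange(1, 101) / 10)
print(integer_period(x, dt=0.1))               # 1.0, i.e. tau = 10*dt
```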

The test can be modified to account for small errors, such as numerical errors of the simulation method.

I hope that my paper also answers the question of why I was interested in such a test.

Attribution
Source: Link, Question Author: Wrzlprmft, Answer Author: Wrzlprmft
