## Existence of moments -> existence of a distribution?

Hi all,

This might strike you as a bit silly, because normally we ask the question the other way around. But here is my situation: I have a nonlinear time-series model for which, by infinite backward iteration, I can derive the mean and a limiting bound on the variance. I would now like to conclude that a stationary distribution must therefore exist.
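For concreteness, here is a hypothetical example of the kind of setup I mean (my actual model is different): a nonlinear AR(1)-type recursion whose backward iteration yields a variance bound, checked against simulation.

```python
# Hypothetical illustration: a nonlinear recursion
#   X_{t+1} = a * tanh(X_t) + eps_t,  eps_t ~ N(0, 1),  |a| < 1.
# Since tanh(x)^2 <= x^2, iterating backwards gives
#   E[X_{t+1}^2] <= a^2 * E[X_t^2] + 1,
# so limsup_t E[X_t^2] <= 1 / (1 - a^2): a limiting variance bound.
import math
import random

random.seed(0)
a = 0.5
n_steps = 200_000

x = 0.0
samples = []
for _ in range(n_steps):
    x = a * math.tanh(x) + random.gauss(0.0, 1.0)
    samples.append(x)

burn = samples[10_000:]  # discard the transient before estimating moments
mean = sum(burn) / len(burn)
var = sum((s - mean) ** 2 for s in burn) / len(burn)
bound = 1.0 / (1.0 - a**2)  # bound from the backward iteration

print(f"empirical mean     : {mean:.3f}")
print(f"empirical variance : {var:.3f}")
print(f"variance bound     : {bound:.3f}")
```

In this toy case the simulated moments do settle down under the derived bound, which is exactly the situation whose general justification I am asking about.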

Is this true? Is there a deeper reason why?

[Of course, the moments are defined in terms of the distribution in the first place, but is that the whole argument?]

Tags: measure theory, stationarity, statistics, stochastics, time series