Stationary process


In mathematics and statistics, a stationary process (also called a strict/strictly stationary process or strong/strongly stationary process) is a stochastic process whose statistical properties, such as mean and variance, do not change over time. More formally, the joint probability distribution of the process remains the same when shifted in time. This implies that the process is statistically consistent across different time periods. Because many statistical procedures in time series analysis assume stationarity, non-stationary data are frequently transformed to achieve stationarity before analysis.

A common cause of non-stationarity is a trend in the mean, which can be due to either a unit root or a deterministic trend. In the case of a unit root, stochastic shocks have permanent effects, and the process is not mean-reverting. With a deterministic trend, the process is called trend-stationary, and shocks have only transitory effects, with the variable tending towards a deterministically evolving mean. A trend-stationary process is not strictly stationary but can be made stationary by removing the trend. Similarly, processes with unit roots can be made stationary through differencing.

Another type of non-stationary process, distinct from those with trends, is a cyclostationary process, which exhibits cyclical variations over time.

Strict stationarity, as defined above, can be too restrictive for many applications. Therefore, other forms of stationarity, such as wide-sense stationarity or N-th-order stationarity, are often used. The definitions for different kinds of stationarity are not consistent among different authors (see Other terminology).

Strict-sense stationarity


Definition


Formally, let $\{X_t\}$ be a stochastic process and let $F_X(x_{t_1+\tau},\ldots,x_{t_n+\tau})$ represent the cumulative distribution function of the unconditional (i.e., with no reference to any particular starting value) joint distribution of $\{X_t\}$ at times $t_1+\tau,\ldots,t_n+\tau$. Then, $\{X_t\}$ is said to be strictly stationary, strongly stationary or strict-sense stationary if[1]: p. 155

$$F_X(x_{t_1+\tau},\ldots,x_{t_n+\tau}) = F_X(x_{t_1},\ldots,x_{t_n}) \quad \text{for all } \tau, t_1, \ldots, t_n \in \mathbb{R} \text{ and for all } n \in \mathbb{N}_{>0} \qquad \text{(Eq. 1)}$$

Since $\tau$ does not affect $F_X(\cdot)$, $F_X$ is independent of time.

Examples

[Figure: Two simulated time series processes, one stationary and the other non-stationary. The augmented Dickey–Fuller (ADF) test statistic is reported for each process; non-stationarity cannot be rejected for the second process at a 5% significance level.]
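
The comparison in the figure can be reproduced with a short script. Here is a minimal sketch (assuming NumPy and the statsmodels package; all names are illustrative) that applies the ADF test, whose null hypothesis is the presence of a unit root, to a simulated stationary AR(1) process and to a random walk:

```python
# Sketch: ADF test on a stationary AR(1) process vs. a random walk.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
n = 500
eps = rng.standard_normal(n)

# Stationary AR(1): x_t = 0.5 * x_{t-1} + eps_t (coefficient inside the unit circle)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + eps[t]

# Non-stationary random walk: y_t = y_{t-1} + eps_t (unit root)
y = np.cumsum(eps)

for name, series in [("AR(1)", x), ("random walk", y)]:
    stat, pvalue = adfuller(series)[:2]
    # A small p-value rejects the unit-root null, i.e. supports stationarity.
    print(f"{name}: ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")
```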

White noise is the simplest example of a stationary process.

An example of a discrete-time stationary process where the sample space is also discrete (so that the random variable may take one of N possible values) is a Bernoulli scheme. Other examples of a discrete-time stationary process with continuous sample space include some autoregressive and moving average processes, which are both subsets of the autoregressive moving average model. Models with a non-trivial autoregressive component may be either stationary or non-stationary, depending on the parameter values; important non-stationary special cases arise where unit roots exist in the model.
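
To make the dependence on parameter values concrete, here is a minimal sketch (the helper name is illustrative) that decides stationarity of an AR(p) model by checking that every root of the characteristic polynomial $1 - \phi_1 z - \cdots - \phi_p z^p$ lies outside the unit circle:

```python
# Sketch: stationarity check for an AR(p) model via its characteristic roots.
import numpy as np

def ar_is_stationary(phi):
    """phi: AR coefficients [phi_1, ..., phi_p]."""
    # Coefficients of 1 - phi_1*z - ... - phi_p*z^p, highest degree first,
    # as expected by numpy.roots.
    poly = np.r_[-np.asarray(phi, dtype=float)[::-1], 1.0]
    return bool(np.all(np.abs(np.roots(poly)) > 1.0))

print(ar_is_stationary([0.5]))           # True: stationary AR(1)
print(ar_is_stationary([1.0]))           # False: unit root (random walk)
print(ar_is_stationary([0.75, -0.125]))  # True: stationary AR(2)
```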

Example 1


Let $Y$ be any scalar random variable, and define a time series $\{X_t\}$ by

$$X_t = Y \qquad \text{for all } t.$$

Then $\{X_t\}$ is a stationary time series, for which realisations consist of a series of constant values, with a different constant value for each realisation. A law of large numbers does not apply in this case, as the limiting value of an average from a single realisation takes the random value determined by $Y$, rather than taking the expected value of $Y$.

The time average of $X_t$ therefore does not converge to the ensemble mean, since the process is not ergodic.
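
A minimal simulation (assuming NumPy; the choice of a standard normal $Y$ is illustrative) makes this concrete: each realisation's time average equals its own draw of $Y$, not $\operatorname{E}[Y] = 0$.

```python
# Sketch: time averages of X_t = Y recover Y, not E[Y].
import numpy as np

rng = np.random.default_rng(1)
n_realisations, n_steps = 5, 1000

for i in range(n_realisations):
    y = rng.normal()                 # draw Y once per realisation
    x = np.full(n_steps, y)          # X_t = Y for all t
    print(f"realisation {i}: time average = {x.mean():+.3f}, Y = {y:+.3f}")
# Only an average across realisations would approach E[Y] = 0.
```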

Example 2


As a further example of a stationary process for which any single realisation has an apparently noise-free structure, let $Y$ have a uniform distribution on $[0, 2\pi]$ and define the time series $\{X_t\}$ by

$$X_t = \cos(t + Y) \quad \text{for } t \in \mathbb{R}.$$

Then $\{X_t\}$ is strictly stationary, since $(t + Y)$ modulo $2\pi$ follows the same uniform distribution as $Y$ for any $t$.
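
This invariance is easy to verify numerically. The following sketch (assuming NumPy and SciPy) compares samples of $X_0$ and $X_5$ with a two-sample Kolmogorov–Smirnov test, which should not reject equality of the marginal distributions:

```python
# Sketch: the marginal of X_t = cos(t + Y) does not depend on t.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(2)
y = rng.uniform(0.0, 2.0 * np.pi, size=100_000)

x0 = np.cos(0.0 + y)   # samples of X_0
x5 = np.cos(5.0 + y)   # samples of X_5

stat, pvalue = ks_2samp(x0, x5)
print(f"KS statistic = {stat:.4f}, p-value = {pvalue:.3f}")  # large p expected
```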

Example 3


Keep in mind that a weak white noise is not necessarily strictly stationary. Let $\omega$ be a random variable uniformly distributed in the interval $(0, 2\pi)$ and define the time series $\{z_t\}$ by

$$z_t = \cos(t\omega) \quad (t = 1, 2, \ldots)$$

Then

$$\begin{aligned} \mathbb{E}(z_t) &= \frac{1}{2\pi}\int_0^{2\pi} \cos(t\omega)\,d\omega = 0, \\ \operatorname{Var}(z_t) &= \frac{1}{2\pi}\int_0^{2\pi} \cos^2(t\omega)\,d\omega = \tfrac{1}{2}, \\ \operatorname{Cov}(z_t, z_j) &= \frac{1}{2\pi}\int_0^{2\pi} \cos(t\omega)\cos(j\omega)\,d\omega = 0 \quad \text{for all } t \neq j. \end{aligned}$$

So $\{z_t\}$ is a white noise in the weak sense (the mean and cross-covariances are zero, and the variances are all the same), but it is not strictly stationary.
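
A quick Monte Carlo check (assuming NumPy; constants are illustrative) confirms both halves of the claim: the first two moments match the integrals computed above, while a third-order moment such as $\operatorname{E}[z_{t+1} z_t^2]$ depends on $t$, which strict stationarity would forbid.

```python
# Sketch: z_t = cos(t*omega) is weakly white but not strictly stationary.
import numpy as np

rng = np.random.default_rng(3)
omega = rng.uniform(0.0, 2.0 * np.pi, size=200_000)

z1, z2, z3 = np.cos(omega), np.cos(2 * omega), np.cos(3 * omega)

# First and second moments agree with the integrals in the text:
print(f"means: {z1.mean():+.3f}, {z2.mean():+.3f}")          # both ~ 0
print(f"variances: {z1.var():.3f}, {z2.var():.3f}")          # both ~ 0.5
print(f"cov(z1, z2) = {np.cov(z1, z2)[0, 1]:+.3f}")          # ~ 0

# But z2 = 2*z1**2 - 1 exactly, so E[z_{t+1} * z_t**2] depends on t:
print(f"E[z2 * z1^2] = {(z2 * z1**2).mean():+.3f}")          # ~ 0.25
print(f"E[z3 * z2^2] = {(z3 * z2**2).mean():+.3f}")          # ~ 0.00
```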

N-th-order stationarity


In Eq. 1, the distribution of $n$ samples of the stochastic process must be equal to the distribution of the samples shifted in time for all $n$. N-th-order stationarity is a weaker form of stationarity where this is only required for all $n$ up to a certain order $N$. A random process $\{X_t\}$ is said to be N-th-order stationary if:[1]: p. 152

$$F_X(x_{t_1+\tau},\ldots,x_{t_n+\tau}) = F_X(x_{t_1},\ldots,x_{t_n}) \quad \text{for all } \tau, t_1, \ldots, t_n \in \mathbb{R} \text{ and for all } n \in \{1, \ldots, N\} \qquad \text{(Eq. 2)}$$

Weak or wide-sense stationarity


Definition


A weaker form of stationarity commonly employed in signal processing is known as weak-sense stationarity, wide-sense stationarity (WSS), or covariance stationarity. WSS random processes only require that the first moment (i.e. the mean) and autocovariance do not vary with respect to time and that the second moment is finite for all times. Any strictly stationary process which has a finite mean and covariance is also WSS.[2]: p. 299

So, a continuous-time random process $\{X_t\}$ which is WSS has the following restrictions on its mean function $m_X(t) \triangleq \operatorname{E}[X_t]$ and autocovariance function $K_{XX}(t_1, t_2) \triangleq \operatorname{E}[(X_{t_1} - m_X(t_1))(X_{t_2} - m_X(t_2))]$:

$$\begin{aligned} & m_X(t) = m_X(t+\tau) && \text{for all } \tau, t \in \mathbb{R} \\ & K_{XX}(t_1, t_2) = K_{XX}(t_1 - t_2, 0) && \text{for all } t_1, t_2 \in \mathbb{R} \\ & \operatorname{E}[|X_t|^2] < \infty && \text{for all } t \in \mathbb{R} \end{aligned} \qquad \text{(Eq. 3)}$$

The first property implies that the mean function $m_X(t)$ must be constant. The second property implies that the autocovariance function depends only on the difference between $t_1$ and $t_2$ and only needs to be indexed by one variable rather than two.[1]: p. 159 Thus, instead of writing

$$K_{XX}(t_1 - t_2, 0)$$

the notation is often abbreviated by the substitution $\tau = t_1 - t_2$:

$$K_{XX}(\tau) \triangleq K_{XX}(t_1 - t_2, 0)$$

This also implies that the autocorrelation depends only on $\tau = t_1 - t_2$, that is

$$R_X(t_1, t_2) = R_X(t_1 - t_2, 0) \triangleq R_X(\tau).$$

The third property says that the second moments must be finite for any time $t$.
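
As an empirical illustration (assuming NumPy; the AR(1) choice is illustrative), the sample autocovariance of a WSS process estimated from one long realisation depends only on the lag $\tau$, and for an AR(1) process it matches the closed form $K(\tau) = \phi^{|\tau|}/(1 - \phi^2)$ for unit-variance innovations:

```python
# Sketch: sample autocovariance of a stationary AR(1) depends only on the lag.
import numpy as np

rng = np.random.default_rng(4)
n, phi = 100_000, 0.8
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.standard_normal()

def autocov(x, tau):
    xc = x - x.mean()
    return np.mean(xc[: len(xc) - tau] * xc[tau:]) if tau else xc.var()

for tau in range(4):
    theory = phi**tau / (1 - phi**2)
    print(f"tau = {tau}: sample = {autocov(x, tau):.3f}, theory = {theory:.3f}")
```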

Motivation


The main advantage of wide-sense stationarity is that it places the time series in the context of Hilbert spaces. Let $H$ be the Hilbert space generated by $\{x(t)\}$ (that is, the closure of the set of all linear combinations of these random variables in the Hilbert space of all square-integrable random variables on the given probability space). By the positive definiteness of the autocovariance function, it follows from Bochner's theorem that there exists a positive measure $\mu$ on the real line such that $H$ is isomorphic to the Hilbert subspace of $L^2(\mu)$ generated by $\{e^{-2\pi i \xi t}\}$. This then gives the following Fourier-type decomposition for a continuous-time stationary stochastic process: there exists a stochastic process $\omega_\xi$ with orthogonal increments such that, for all $t$,

$$X_t = \int e^{-2\pi i \lambda t}\, d\omega_\lambda,$$

where the integral on the right-hand side is interpreted in a suitable (Riemann) sense. The same result holds for a discrete-time stationary process, with the spectral measure now defined on the unit circle.

When processing WSS random signals with linear, time-invariant (LTI) filters, it is helpful to think of the correlation function as a linear operator. Since it is a circulant operator (it depends only on the difference between the two arguments), its eigenfunctions are the Fourier complex exponentials. Additionally, since the eigenfunctions of LTI operators are also complex exponentials, LTI processing of WSS random signals is highly tractable: all computations can be performed in the frequency domain. Thus, the WSS assumption is widely employed in signal processing algorithms.
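
Here is a minimal sketch (assuming NumPy and SciPy; the Butterworth filter is an arbitrary illustrative LTI system) of the frequency-domain relation this enables: for a WSS input with power spectral density $S_x(f)$, the LTI output satisfies $S_y(f) = |H(f)|^2 S_x(f)$.

```python
# Sketch: output PSD of an LTI filter driven by WSS white noise is |H(f)|^2 * S_x(f).
import numpy as np
from scipy import signal

rng = np.random.default_rng(5)
x = rng.standard_normal(200_000)       # white WSS input, flat S_x(f)

b, a = signal.butter(4, 0.2)           # an arbitrary LTI filter
y = signal.lfilter(b, a, x)

f, Sy = signal.welch(y, nperseg=4096)            # estimated output PSD
_, H = signal.freqz(b, a, worN=2 * np.pi * f)    # frequency response at the same bins

mask = f < 0.08                        # compare well inside the passband
ratio = Sy[mask] / np.abs(H[mask]) ** 2
print(f"S_y / |H|^2 in the passband: mean = {ratio.mean():.2f}, "
      f"relative std = {ratio.std() / ratio.mean():.2f}")   # roughly constant
```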

Definition for complex stochastic process


In the case where $\{X_t\}$ is a complex stochastic process, the autocovariance function is defined as $K_{XX}(t_1, t_2) = \operatorname{E}[(X_{t_1} - m_X(t_1))\overline{(X_{t_2} - m_X(t_2))}]$ and, in addition to the requirements in Eq. 3, it is required that the pseudo-autocovariance function $J_{XX}(t_1, t_2) = \operatorname{E}[(X_{t_1} - m_X(t_1))(X_{t_2} - m_X(t_2))]$ depends only on the time lag. In formulas, $\{X_t\}$ is WSS if

$$\begin{aligned} & m_X(t) = m_X(t+\tau) && \text{for all } \tau, t \in \mathbb{R} \\ & K_{XX}(t_1, t_2) = K_{XX}(t_1 - t_2, 0) && \text{for all } t_1, t_2 \in \mathbb{R} \\ & J_{XX}(t_1, t_2) = J_{XX}(t_1 - t_2, 0) && \text{for all } t_1, t_2 \in \mathbb{R} \\ & \operatorname{E}[|X(t)|^2] < \infty && \text{for all } t \in \mathbb{R} \end{aligned} \qquad \text{(Eq. 4)}$$

Joint stationarity


The concept of stationarity may be extended to two stochastic processes.

Joint strict-sense stationarity


Two stochastic processes $\{X_t\}$ and $\{Y_t\}$ are called jointly strict-sense stationary if their joint cumulative distribution $F_{XY}(x_{t_1},\ldots,x_{t_m}, y_{t'_1},\ldots,y_{t'_n})$ remains unchanged under time shifts, i.e. if

$$F_{XY}(x_{t_1},\ldots,x_{t_m}, y_{t'_1},\ldots,y_{t'_n}) = F_{XY}(x_{t_1+\tau},\ldots,x_{t_m+\tau}, y_{t'_1+\tau},\ldots,y_{t'_n+\tau}) \quad \text{for all } \tau, t_1, \ldots, t_m, t'_1, \ldots, t'_n \in \mathbb{R} \text{ and for all } m, n \in \mathbb{N} \qquad \text{(Eq. 5)}$$

Joint (M + N)-th-order stationarity


Two random processes $\{X_t\}$ and $\{Y_t\}$ are said to be jointly (M + N)-th-order stationary if:[1]: p. 159

$$F_{XY}(x_{t_1},\ldots,x_{t_m}, y_{t'_1},\ldots,y_{t'_n}) = F_{XY}(x_{t_1+\tau},\ldots,x_{t_m+\tau}, y_{t'_1+\tau},\ldots,y_{t'_n+\tau}) \quad \text{for all } \tau, t_1, \ldots, t_m, t'_1, \ldots, t'_n \in \mathbb{R} \text{ and for all } m \in \{1, \ldots, M\}, n \in \{1, \ldots, N\} \qquad \text{(Eq. 6)}$$

Joint weak or wide-sense stationarity


Two stochastic processes $\{X_t\}$ and $\{Y_t\}$ are called jointly wide-sense stationary if they are both wide-sense stationary and their cross-covariance function $K_{XY}(t_1, t_2) = \operatorname{E}[(X_{t_1} - m_X(t_1))(Y_{t_2} - m_Y(t_2))]$ depends only on the time difference $\tau = t_1 - t_2$. This may be summarized as follows:

$$\begin{aligned} & m_X(t) = m_X(t+\tau) && \text{for all } \tau, t \in \mathbb{R} \\ & m_Y(t) = m_Y(t+\tau) && \text{for all } \tau, t \in \mathbb{R} \\ & K_{XX}(t_1, t_2) = K_{XX}(t_1 - t_2, 0) && \text{for all } t_1, t_2 \in \mathbb{R} \\ & K_{YY}(t_1, t_2) = K_{YY}(t_1 - t_2, 0) && \text{for all } t_1, t_2 \in \mathbb{R} \\ & K_{XY}(t_1, t_2) = K_{XY}(t_1 - t_2, 0) && \text{for all } t_1, t_2 \in \mathbb{R} \end{aligned} \qquad \text{(Eq. 7)}$$

Relation between types of stationarity

  • If a stochastic process is N-th-order stationary, then it is also M-th-order stationary for all $M \leq N$.
  • If a stochastic process is second-order stationary ($N = 2$) and has finite second moments, then it is also wide-sense stationary.[1]: p. 159
  • If a stochastic process is wide-sense stationary, it is not necessarily second-order stationary.[1]: p. 159
  • If a stochastic process is strict-sense stationary and has finite second moments, it is wide-sense stationary.[2]: p. 299
  • If two stochastic processes are jointly (M + N)-th-order stationary, this does not guarantee that the individual processes are M-th- and N-th-order stationary, respectively.[1]: p. 159

Other terminology


The terminology used for types of stationarity other than strict stationarity can be rather mixed. Some examples follow.

  • Priestley uses "stationary up to order m" if conditions similar to those given here for wide-sense stationarity apply relating to moments up to order m.[3][4] Thus wide-sense stationarity would be equivalent to "stationary to order 2", which is different from the definition of second-order stationarity given here.
  • Honarkhah and Caers also use the assumption of stationarity in the context of multiple-point geostatistics, where higher n-point statistics are assumed to be stationary in the spatial domain.[5]

Techniques to stationarize a non-stationary process


In time series analysis and stochastic processes, stationarizing a time series is a crucial preprocessing step aimed at transforming a non-stationary process into a stationary one. Several techniques exist for achieving this, depending on the type and order of non-stationarity present. For first-order non-stationarity, where the mean of the process varies over time, differencing is a common and effective method: it transforms the series by subtracting from each value its predecessor, thus stabilizing the mean. For non-stationarities up to the second order, time-frequency analysis (e.g., the wavelet transform, the Wigner distribution function, or the short-time Fourier transform) can be employed to isolate and suppress time-localized, non-stationary spectral components. Additionally, surrogate data methods can be used to construct strictly stationary versions of the original time series. One way to identify a non-stationary time series is the ACF plot, as sketched below. Sometimes, patterns will be more visible in the ACF plot than in the original time series; however, this is not always the case.[6]
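
The following sketch (assuming NumPy, Matplotlib and statsmodels; the MA(1) and random-walk choices are illustrative) shows the typical visual contrast: the sample ACF of a stationary series cuts off or decays quickly, while that of a non-stationary series decays very slowly.

```python
# Sketch: ACF plots as a visual diagnostic for non-stationarity.
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf

rng = np.random.default_rng(6)
eps = rng.standard_normal(1000)
stationary = np.convolve(eps, [1.0, 0.6], mode="valid")  # MA(1): stationary
random_walk = np.cumsum(eps)                             # unit root: non-stationary

fig, axes = plt.subplots(1, 2, figsize=(10, 3))
plot_acf(stationary, ax=axes[0], lags=40, title="MA(1): ACF dies out quickly")
plot_acf(random_walk, ax=axes[1], lags=40, title="Random walk: ACF decays slowly")
plt.tight_layout()
plt.show()
```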

The choice of method for time series stationarization depends on the nature of the non-stationarity and the goals of the analysis, especially when building models that rely on stationarity assumptions, such as ARMA models or spectral-based techniques. More details on some time series stationarization methods are presented below.

Stationarization by means of differencing


One way to make some time series first-order stationary is to compute the differences between consecutive observations. This is known as differencing. Differencing can help stabilize the mean of a time series by removing changes in its level, and so eliminating trends. It can also remove seasonality, if differences are taken appropriately (e.g. differencing observations one year apart to remove a yearly cycle). Transformations such as logarithms can help to stabilize the variance of a time series.
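
Here is a minimal sketch (assuming NumPy; the simulated monthly series is illustrative) of both operations: first differencing to remove a linear trend, and lag-12 seasonal differencing to remove a yearly cycle.

```python
# Sketch: first and seasonal differencing of a trended, seasonal series.
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(240)  # e.g. 20 years of monthly observations
x = 0.05 * t + 2.0 * np.sin(2 * np.pi * t / 12) + rng.standard_normal(t.size)

d1 = np.diff(x)          # x_t - x_{t-1}: removes the linear trend
d12 = x[12:] - x[:-12]   # x_t - x_{t-12}: removes the yearly cycle

print(f"slope of raw series:      {np.polyfit(t, x, 1)[0]:+.3f}")
print(f"mean of first difference: {d1.mean():+.3f}  (~ the removed slope)")
print(f"std raw vs seasonally differenced: {x.std():.2f} vs {d12.std():.2f}")
```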

Stationarization by means of the surrogate method


The surrogate method for stationarization[7] works by generating a new time series that preserves certain statistical properties of the original series while removing its non-stationary components.[8][9][10] A common approach is to apply the Fourier transform to the original time series to obtain its magnitude and phase spectra. The magnitude spectrum, which determines the power distribution across frequencies, is retained to preserve the global autocorrelation structure. The phase spectrum, which encodes the temporal alignment of frequency components and is often responsible for time-dependent dynamics in the time series (like non-stationarities), is then randomized, typically by replacing it with a set of random phases drawn uniformly from $[-\pi, \pi]$ while enforcing conjugate symmetry to ensure a real-valued inverse. Applying the inverse Fourier transform to the modified spectra yields a strictly stationary surrogate time series:[11] one with the same power spectrum as the original but lacking the temporal structures that caused non-stationarity. This technique is often used in hypothesis tests for probing the stationarity property.[8][10][12][13]
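
A minimal implementation sketch (assuming NumPy; the function name and the test signal are illustrative) of the phase-randomization procedure just described. Using the real FFT and its inverse enforces the conjugate symmetry automatically; the DC bin (and, for even lengths, the Nyquist bin) is kept real.

```python
# Sketch: Fourier phase-randomization surrogate of a time series.
import numpy as np

def phase_randomized_surrogate(x, rng=None):
    rng = rng or np.random.default_rng()
    n = len(x)
    spectrum = np.fft.rfft(x)                           # magnitudes and phases
    phases = rng.uniform(-np.pi, np.pi, size=spectrum.shape)
    phases[0] = 0.0                                     # keep the DC bin real
    if n % 2 == 0:
        phases[-1] = 0.0                                # keep the Nyquist bin real
    # Keep the magnitude spectrum, randomize the phases:
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=n)

# Example: a series with a time-localized burst (a non-stationary feature).
rng = np.random.default_rng(8)
x = rng.standard_normal(1024)
x[400:450] += 5.0 * np.sin(np.linspace(0.0, 10.0 * np.pi, 50))

s = phase_randomized_surrogate(x, rng)
# Same power spectrum, but the burst is smeared uniformly over time:
assert np.allclose(np.abs(np.fft.rfft(x)), np.abs(np.fft.rfft(s)))
print(f"local variance near the burst: original {x[400:450].var():.1f}, "
      f"surrogate {s[400:450].var():.1f}")
```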

References

  1. Park, Kun Il (2018). Fundamentals of Probability and Stochastic Processes with Applications to Communications. Springer. ISBN 978-3-319-68074-3.
  2. Florescu, Ionut (2014). Probability and Stochastic Processes. John Wiley & Sons. ISBN 978-1-118-59320-2.
  3. Priestley, M. B. (1981). Spectral Analysis and Time Series. Academic Press. ISBN 0-12-564922-3.
  4. Priestley, M. B. (1988). Non-linear and Non-stationary Time Series Analysis. Academic Press. ISBN 0-12-564911-8.
  5. Honarkhah, M.; Caers, J. (2010). "Stochastic Simulation of Patterns Using Distance-Based Pattern Modeling". Mathematical Geosciences. 42 (5): 487–517. doi:10.1007/s11004-010-9276-7.
  6. Hyndman, Rob J.; Athanasopoulos, George. "8.1 Stationarity and differencing". Forecasting: Principles and Practice (2nd ed.). OTexts. Retrieved 2016-05-18.
  7. Borgnat, Pierre; Flandrin, Patrick (2009). "Stationarization via surrogates". Journal of Statistical Mechanics: Theory and Experiment. 2009 (1). https://iopscience.iop.org/article/10.1088/1742-5468/2009/01/P01001
  8. Borgnat, Pierre; et al. (2010). "Testing Stationarity With Surrogates: A Time-Frequency Approach". IEEE Transactions on Signal Processing. 58 (7): 3459–3470. https://ieeexplore.ieee.org/document/5419113
  9. Borgnat, Pierre; et al. (2011). "Transitional Surrogates". 2011 IEEE International Conference on Acoustics, Speech and Signal Processing. pp. 3600–3603. https://ieeexplore.ieee.org/document/5946257
  10. Baptista de Souza, Douglas; et al. (2019). "An Improved Stationarity Test Based on Surrogates". IEEE Signal Processing Letters. 26 (10): 1431–1435. https://ieeexplore.ieee.org/abstract/document/8777090
  11. Richard, Cédric; et al. (2010). "Statistical hypothesis testing with time-frequency surrogates to check signal stationarity". 2010 IEEE International Conference on Acoustics, Speech and Signal Processing. pp. 3666–3669. https://ieeexplore.ieee.org/document/5495887
  12. Baptista de Souza, Douglas; et al. (2012). "A modified time-frequency method for testing wide-sense stationarity". 2012 IEEE International Conference on Acoustics, Speech and Signal Processing. pp. 3409–3412. https://ieeexplore.ieee.org/abstract/document/6288648
  13. Xiao, Jun; et al. (2007). "Testing Stationarity with Surrogates: A One-Class SVM Approach". 2007 IEEE/SP 14th Workshop on Statistical Signal Processing. pp. 720–724. https://ieeexplore.ieee.org/document/4301353
