Definition:Lag
Definition
Let $T$ be a time series.
A lag is a constant time interval between two timestamps of $T$.
Thus, for every observation $z_t$ of $T$, a pair of observations separated by a given lag $k$ can be formed:
- $\tuple {z_t, z_{t + k} }$
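A minimal sketch of this pairing, assuming the time series is stored as a Python list indexed by time; the helper `lag_pairs` is hypothetical and introduced here only for illustration:

```python
def lag_pairs(series, k):
    """Return all pairs (z_t, z_{t+k}) for a given lag k."""
    # Each observation z_t is paired with the observation k steps later,
    # so the last usable index is len(series) - k - 1.
    return [(series[t], series[t + k]) for t in range(len(series) - k)]

# Example: with lag k = 2, the series [1, 2, 3, 4, 5]
# yields the pairs (1, 3), (2, 4), (3, 5).
print(lag_pairs([1, 2, 3, 4, 5], 2))
```

Such lagged pairs are the raw material for the autocovariance and autocorrelation coefficients at lag $k$ treated in the sources below.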
Sources
- 1994: George E.P. Box, Gwylim M. Jenkins and Gregory C. Reinsel: Time Series Analysis: Forecasting and Control (3rd ed.):
  - Part $\text {I}$: Stochastic Models and their Forecasting:
    - $2$: Autocorrelation Function and Spectrum of Stationary Processes:
      - $2.1$ Autocorrelation Properties of Stationary Models:
        - $2.1.2$ Stationary Stochastic Processes: Autocovariance and autocorrelation coefficients
- 1998: David Nelson: The Penguin Dictionary of Mathematics (2nd ed.): lag
- 2008: David Nelson: The Penguin Dictionary of Mathematics (4th ed.): lag
- 2014: Christopher Clapham and James Nicholson: The Concise Oxford Dictionary of Mathematics (5th ed.): lag