# Principle of Parsimony

## Mathematical Principle

Let $P$ be a stochastic process which is being modelled by means of a stochastic model $M$.

$M$ will necessarily use a number of constants and parameters whose values are to be determined by estimation from the data.

The **principle of parsimony** dictates that $M$ employs the **smallest possible number** of parameters such that $M$ will adequately represent the behaviour of $P$.

## Examples

### Arbitrary Dynamic Model

Let $M$ be a dynamic model of the form:

- $(1): \quad Y_t = \paren {\omega_0 - \omega_1 B - \omega_2 B^2 - \dotsb - \omega_s B^s} X_t$

when dealing with a system that could be adequately represented by:

- $(2): \quad Y_t = \paren {1 - \delta B}^{-1} \omega_0 X_t = \omega_0 \paren {1 + \delta B + \delta^2 B^2 + \dotsb} X_t$

where $\size \delta < 1$.

Because of experimental error, it is quite possible to fail to notice the relationship between the coefficients in the fitted equation.

Hence we may end up fitting a relationship like $(1)$, which contains $s + 1$ parameters, where the simpler form $(2)$, which has only the $2$ parameters $\omega_0$ and $\delta$, would have been adequate.

The estimate of the output $Y_t$ may then be unnecessarily poor for the given values of $X_t, X_{t - 1}, \dotsc$.
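The situation above can be simulated numerically. The following is a minimal sketch (the particular values $\omega_0 = 1$, $\delta = 0.5$, the noise level, and the lag count $s$ are illustrative assumptions, not from the source): data are generated from the $2$-parameter form $(2)$, and then the unrestricted form $(1)$ is fitted by least squares, estimating $s + 1$ separate impulse-response weights whose true values obey $\omega_k = \omega_0 \delta^k$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative values for the true system, form (2):
# Y_t = omega_0 (1 + delta B + delta^2 B^2 + ...) X_t + noise
omega0, delta = 1.0, 0.5
n, s = 200, 10

x = rng.normal(size=n)

# Generate Y_t from form (2), truncating the geometric series once
# delta^k is negligible.
y = np.zeros(n)
for k in range(50):
    y[k:] += omega0 * delta**k * x[:n - k]
y += 0.1 * rng.normal(size=n)

# Unrestricted fit, form (1): s + 1 free weights on lagged X_t.
# Each design-matrix column is X_t lagged by k (zero-padded at the start).
X = np.column_stack(
    [np.concatenate([np.zeros(k), x[:n - k]]) for k in range(s + 1)]
)
weights, *_ = np.linalg.lstsq(X, y, rcond=None)

# The fitted weights roughly track omega0 * delta^k, but each of the
# s + 1 parameters carries its own sampling error, whereas form (2)
# would estimate only omega0 and delta.
print(weights[:4])
print([omega0 * delta**k for k in range(4)])
```

Each estimated weight sits near the corresponding $\omega_0 \delta^k$, but nothing in the unrestricted fit enforces that relationship, which is why the extra parameters cost precision.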

## Sources

- 1961: John W. Tukey: *Discussion, Emphasizing the Connection between Analysis of Variance and Spectrum Analysis* (*Technometrics* **Vol. 3**: pp. 191 – 219) www.jstor.org/stable/1266112

- 1994: George E.P. Box, Gwilym M. Jenkins and Gregory C. Reinsel: *Time Series Analysis: Forecasting and Control* (3rd ed.):
  - $1$: Introduction:
    - $1.3$ Basic Ideas in Model Building:
      - $1.3.1$ Parsimony
