Category:Information (Estimation Theory)
This category contains results about information in the context of estimation theory.
Definitions specific to this category can be found in Definitions/Information (Estimation Theory).
Let $L$ be the logarithm of the likelihood function of a parameter $\theta$.
The amount of information about $\theta$ is given by:
- $I := \map E {\paren {\dfrac {\partial L} {\partial \theta} }^2}$
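The expectation above can be checked numerically. As an illustration (not part of the category page itself), here is a minimal Monte Carlo sketch for a Bernoulli model with success probability $p$, where the score $\dfrac {\partial L} {\partial p}$ of a single observation $x$ is $\dfrac x p - \dfrac {1 - x} {1 - p}$ and the information works out analytically to $\dfrac 1 {p \paren {1 - p} }$. The function names and sample size are illustrative choices:

```python
import random

def score_bernoulli(x, p):
    # Derivative of the log-likelihood L(p) = x log p + (1 - x) log(1 - p)
    # with respect to p, for a single observation x in {0, 1}.
    return x / p - (1 - x) / (1 - p)

def fisher_information_mc(p, n_samples=200_000, seed=0):
    # Monte Carlo estimate of I(p) = E[(dL/dp)^2] under X ~ Bernoulli(p).
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        x = 1 if rng.random() < p else 0
        total += score_bernoulli(x, p) ** 2
    return total / n_samples

p = 0.3
estimate = fisher_information_mc(p)
exact = 1.0 / (p * (1.0 - p))  # analytic Fisher information for Bernoulli
```

The Monte Carlo average converges to the analytic value $\dfrac 1 {p \paren {1 - p} } \approx 4.762$ for $p = 0.3$, illustrating that the information is the expected squared score.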
Subcategories
This category has the following 2 subcategories, out of 2 total.
I
- Information Matrices (empty)
M
- Minimum Variance Unbiased Estimators (empty)
Pages in category "Information (Estimation Theory)"
The following 3 pages are in this category, out of 3 total.