Generalized Linear Model over Normal Distribution reduces to Linear Regression Model
Theorem
Let $Y$ be a random variable which obeys a normal distribution with variance $\sigma^2$.
Let a generalized linear model in one explanatory variable for $Y$ be given by:
- $\mu = \expect Y = \beta_0 + \beta_1 x$
Then the generalized linear model reduces to the linear regression model.
Proof
This theorem requires a proof.
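A sketch of the argument, assuming the standard definitions of a generalized linear model (exponential-family response, linear predictor, link function) and taking the link to be the identity, might run as follows. This is an informal outline, not a completed formal proof:

$$
\begin{aligned}
&\text{A GLM specifies a linear predictor } \eta = \beta_0 + \beta_1 x \text{ and a link } g \text{ with } g(\mu) = \eta. \\
&\text{The canonical link for the normal distribution is the identity: } g(\mu) = \mu. \\
&\text{Hence } \expect Y = \mu = \eta = \beta_0 + \beta_1 x, \qquad \var Y = \sigma^2. \\
&\text{Writing } \epsilon := Y - \mu \text{ gives } Y = \beta_0 + \beta_1 x + \epsilon, \quad \epsilon \sim \mathrm N(0, \sigma^2),
\end{aligned}
$$

which is precisely the simple linear regression model for $Y$ on $x$.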
Sources
- 1998: David Nelson: The Penguin Dictionary of Mathematics (2nd ed.) ... (previous) ... (next): generalized linear models
- 2008: David Nelson: The Penguin Dictionary of Mathematics (4th ed.) ... (previous) ... (next): generalized linear models