Definition talk:Expectation

From ProofWiki

It seems like this page is missing something. Let's say, for example, we have $X \sim \operatorname U \closedint 0 1$, and a function $f$ as follows:

$\map f x = \begin {cases} x & x < \dfrac 1 2 \\ \dfrac 3 4 & x \ge \dfrac 1 2 \end {cases}$

Now, the distribution of $\map f X$ is neither discrete nor continuous, but intuitively it does have an expectation of $\frac 1 2$. Plokmijnuhby (talk)
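For concreteness, a sketch of that computation (splitting at $\dfrac 1 2$ and treating the flat part as an atom of mass $\dfrac 1 2$ at $\dfrac 3 4$):

$\ds \expect {\map f X} = \int_0^{1/2} x \rd x + \frac 3 4 \map \Pr {X \ge \frac 1 2} = \frac 1 8 + \frac 3 8 = \frac 1 2$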

And so exactly why should that missing thing be on this page in particular? --prime mover (talk) 09:58, 9 March 2019 (EST)
This particular distribution was meant as an example of a general case, to show that there are many distributions that aren't continuous or discrete. The page gives no way to calculate the expectation for them, which I believe it should. --Plokmijnuhby (talk) 13:02, 9 March 2019 (EST)
I've had an idea how to do this.

"Let $X$ be a random variable.

Let $X_1, X_2, \ldots$ be a sequence of independent variables, with the same distribution as $X$.

Let:

$\ds S_n = \sum_{i \mathop = 1}^n X_i$

Now a real number $\mu$ is the expectation of $X$ if and only if:

$\dfrac {S_n} n \xrightarrow D \mu$ as $n \to \infty$

that is, $\dfrac {S_n} n$ converges in distribution to the constant random variable that always takes the value $\mu$."

This would also have the effect of joining the two existing definitions into one. It's likely that some pages which link to this one would need adjustment, though. Plokmijnuhby (talk) 15:58, 9 March 2019 (EST)
On second thought, maybe "converges in distribution" is too hand-wavy. Here's an improvement to the last section:

Now a real number $\mu$ is the expectation of $X$ if and only if:

$\ds \forall \epsilon \in \R : \epsilon > 0 : \lim_{n \mathop \to \infty} \map \Pr {\size {\frac {S_n} n - \mu} > \epsilon} = 0$
Plokmijnuhby (talk) 17:35, 9 March 2019 (EST)
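The convergence-in-probability condition can be sanity-checked numerically. A quick sketch for the mixed example at the top of this page, with $\mu = \frac 1 2$; the helper name tail_probability and the simulation parameters are illustrative assumptions, not part of any proposed page:

```python
import random

def f(x):
    # the mixed-type example from the top of the page
    return x if x < 0.5 else 0.75

def tail_probability(n, epsilon, mu=0.5, trials=2000, seed=1):
    """Estimate Pr(|S_n / n - mu| > epsilon) by repeated simulation."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(trials):
        # one realisation of S_n / n, with X_1, ..., X_n iid copies of f(X)
        sample_mean = sum(f(rng.random()) for _ in range(n)) / n
        if abs(sample_mean - mu) > epsilon:
            exceed += 1
    return exceed / trials

for n in (10, 100, 1000):
    print(n, tail_probability(n, epsilon=0.1))
```

The printed probabilities shrink towards $0$ as $n$ grows, which is exactly what the displayed condition requires of $\mu = \frac 1 2$.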
This particular line of thought could be very good. There has been some work in establishing the foundations of probability theory, but admittedly this page has not been included.
If such a rewrite/generalisation is undertaken, we should take care that the "simple" cases of discrete and continuous RV remain available.
To ensure the validity of the material, it would also be helpful to have one or more source works that use this definition of expectation. Bonus points for proofs that the "simple" cases arise by applying the definition. — Lord_Farin (talk) 04:52, 10 March 2019 (EDT)

Measure Theoretic Definition

I think I can fix this page. My idea is to limit the continuous random variable definition to the case where the relevant integrand is Riemann integrable, and then add a third definition for full generality using Folland's definition. Is that a reasonable approach? What should I name the measure theoretic definition?

If $X$ is a $\Sigma$-measurable function on a probability space $\struct {\Omega, \Sigma, \Pr}$, then:

$\ds \expect X := \int_\Omega X \rd \Pr$

--GFauxPas (talk) 21:40, 11 February 2020 (EST)
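As a sanity check on that general definition, here is a numerical sketch using the layer-cake identity $\ds \int_\Omega X \rd \Pr = \int_0^\infty \map \Pr {X \ge t} \rd t$ for nonnegative $X$ (equivalent to the simple-function construction), applied to the mixed example at the top of this page. The helper names and the hand-computed survival function are assumptions for illustration:

```python
def expectation_by_layer_cake(survival, t_max, steps=100_000):
    # For nonnegative X, integral of X dPr = integral over t of Pr(X >= t).
    # Approximate the right-hand side by a midpoint Riemann sum on
    # [0, t_max], assuming survival(t) = 0 for t > t_max.
    dt = t_max / steps
    return sum(survival((i + 0.5) * dt) for i in range(steps)) * dt

def survival_fX(t):
    # Pr(f(X) >= t) for X ~ U[0,1] and the f from the example above,
    # worked out by hand: density 1 on [0, 1/2) plus an atom of
    # mass 1/2 at 3/4.
    if t <= 0.5:
        return 1.0 - t
    if t <= 0.75:
        return 0.5
    return 0.0

print(expectation_by_layer_cake(survival_fX, t_max=1.0))  # ≈ 0.5
```

The answer agrees with the intuitive value $\frac 1 2$, even though the distribution is neither discrete nor continuous.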

You mean this page: Definition:Expectation of Continuous Random Variable? --prime mover (talk) 04:31, 12 February 2020 (EST)
Yes. That uses notation that I don't know how to fix, as called out by the {{explain}} template. (I deleted a comment about Lebesgue integration; ignore what I said about that, as it's not enough for all cases.) --GFauxPas (talk) 08:53, 12 February 2020 (EST)
I don't know enough about this area of maths to be able to make a definitive statement, but it appears that, as a probability distribution is just a measure with some further structure, we could do worse than use the language of measure theory. I have a few books on the subject but have never had the patience to concentrate on them for long enough to get to the meat of what it's all about; it takes too long to fumble through the preliminaries. One day, though. --prime mover (talk) 12:13, 12 February 2020 (EST)
Indeed a probability distribution is a particular type of measure. The "correct" definition is the measure theoretical one, but for pedagogical reasons there's an advantage to separately defining the case where the reader only needs to know Riemann integration. --GFauxPas (talk) 15:09, 12 February 2020 (EST)


Okay, so: how many pages, what's transcluded into what, and how should I name the pages? I have a definition that works only if $x f_X$ is absolutely integrable as an improper integral; that one is limited, but only requires Calculus II knowledge. Then I have the measure theoretic version, which works for all random variables: continuous, discrete, mixed, whatever. --GFauxPas (talk) 20:40, 13 February 2020 (EST)

Shame I'm late to the party (I only learned measure theory over the summer). I agree that $\int x \rd F$ does not make any sense when read as a Lebesgue integral, since $F$ is not a measure (it accepts scalars, not sets); it is, however, a correct definition when read as a Riemann-Stieltjes integral. My current plan for the page is to have a "General Definition", which is $\expect X = \int X \rd \Pr$ (given integrable $X$), with the continuous and discrete cases after. Then the "usual":
$\ds \expect X = \int x \rd \map {X_* \Pr} x$
follows from Integral with respect to Pushforward Measure. Will start to make some headway soon. Caliburn (talk) 22:53, 26 December 2021 (UTC)
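A numerical sketch of that pushforward identity, again for the mixed example at the top of this page. Assumptions: $\Omega = \closedint 0 1$ with Lebesgue measure as $\Pr$, the random variable is $\map f X$, and its pushforward is worked out by hand as Lebesgue density $1$ on $[0, 1/2)$ plus an atom of mass $\frac 1 2$ at $\frac 3 4$:

```python
import random

def f(x):
    # the mixed-type example from the top of the page
    return x if x < 0.5 else 0.75

# Sample-space side: integral over Omega of (f of X) dPr,
# estimated by Monte Carlo on Omega = [0, 1].
rng = random.Random(0)
n = 100_000
lhs = sum(f(rng.random()) for _ in range(n)) / n

# Distribution side: integral of x against the pushforward measure,
# which (by hand) is density 1 on [0, 1/2) plus an atom of
# mass 1/2 at 3/4.  Midpoint rule for the density part, plus the atom.
steps = 10_000
dx = 0.5 / steps
rhs = sum((i + 0.5) * dx for i in range(steps)) * dx + 0.75 * 0.5

print(lhs, rhs)  # both sides ≈ 1/2
```

The two sides agree, which is what Integral with respect to Pushforward Measure promises: one can compute on the sample space or on the distribution, whichever is more convenient.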
Would you be in a position (as far as you understand) to explain the various notations (and notational abuses) in a subpage? E.g. "/Notation" or "/Also denoted as" or whatever, according to whatever may be relevant? --prime mover (talk) 23:03, 26 December 2021 (UTC)
Something like this? Caliburn (talk) 13:51, 27 December 2021 (UTC)
Completely opaque to me, unfortunately, but I think I need to do a lot of studying of what's there so as to be able to make sense of it. I'll look at it again when I'm not so unbearably tired. Sorry, but nothing's making any sense to me at the moment. --prime mover (talk) 14:15, 27 December 2021 (UTC)
Any particular points of confusion/ambiguity? Caliburn (talk) 16:03, 27 December 2021 (UTC)
Yes. I would need to hunt down a link to a random variable, but the screen is swimming in front of my eyes. --prime mover (talk) 19:47, 27 December 2021 (UTC)