Talk:Expected value
From Wikipedia, the free encyclopedia
[edit] WAY TOO COMPLEX AN EXPLANATION -- NEEDS CRAPPY TAG
Deleted text:
Similarly, in computer science, the expected value of X is defined as E[X] = Σ_i p_i x_i, where X is an algorithm with different, weighted subroutines, and i is a particular algorithm path.
Populus 17:52, 16 Aug 2003 (UTC)
[edit] Removed
In general expectation is what is considered the most likely to happen. A less advantageous result gives rise to the emotion of disappointment. If something happens that is not at all expected it is a surprise. See also anticipation.--Jerryseinfeld 01:02, 1 Jan 2005 (UTC)
- Since this has been removed, why does Disappointment redirect here? Quandaryus 09:18, 24 Jan 2005 (UTC)
- Fixed the redirect to point to expectation. Ben Cairns 03:46, 2 Feb 2005 (UTC).
[edit] Roulette Table Example
The roulette table example has a flaw - it compares apples to oranges. Either you use the "amount pushed across the table" in each term, or you need to use "net change".
In the "amount pushed across the table" case, I agree that the second term is $36 X 1/38. But, in all cases, to play you have to put $1 down FIRST (it just so happens you get your own dollar back if you win). Using that logic, the formula should be (-$1 X 38/38) + (+$36 X 1/38), which computes out to be about -$0.0526.
In the "net change" scenario, I agree that the first term is -$1 X 37/38. But, since one dollar of the 36 you get was yours at the beginning of the spin, you only net $35 on a win. Thus the formula would be (-$1 X 37/38) + (+$35 X 1/38), which still yields about -$0.0526. So, one should expect to lose over five cents for every dollar bet.
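Both conventions can be checked in a couple of lines; this sketch just encodes the two sums above (straight-up bet on an American wheel with 38 pockets):

```python
# Convention 1: "amount pushed across the table" -- the $1 stake is paid up
# front in every case, and $36 comes back on the single winning pocket.
gross = (-1) * (38 / 38) + 36 * (1 / 38)

# Convention 2: "net change" -- lose the $1 stake on 37 pockets, and net
# +$35 on the winning one (one of the 36 returned dollars was your own).
net = (-1) * (37 / 38) + 35 * (1 / 38)

print(gross, net)  # both equal -2/38 = -1/19, about -0.0526
```

Either way the expectation is the same, as it must be: the two conventions only move the player's own dollar between the two terms.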
- You are absolutely right - will you fix it or should I? PAR 9 July 2005 04:17 (UTC)
It seems to me that there's one more point to clarify in the roulette example. The thing is that the random variable here is not the actual outcome of the roulette (which is uniformly distributed over the 38 pockets), but an r.v. that maps the event space (i.e., the possible outcomes) to either the net win +$35 (once) or -$1 (37 times). For this r.v., the expected value can be correctly interpreted as the amount of money one expects to win (or lose), on average. Am I correct? walter.
"In probability theory (and especially gambling) ..." LOL. (Yes, I know probability theory actually does have its roots in gambling.) -- 130.94.162.64 03:48, 9 December 2005 (UTC)
The expected value formula given seems to work only with a $1 bet. When I plug in, say, a $5 bet and a net win of $175, I get roughly -0.2632, not -0.0526. Shouldn't the formula be: ((probability*payout)/bet)-1
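For what it's worth, the dollar expectation scales linearly with the bet, so dividing by the bet just recovers the constant per-dollar figure; a sketch using the net-change convention:

```python
def roulette_ev(bet):
    """Expected net change for a straight-up bet in American roulette."""
    return -bet * (37 / 38) + 35 * bet * (1 / 38)

ev5 = roulette_ev(5)     # about -0.2632 in absolute dollars
per_dollar = ev5 / 5     # about -0.0526, independent of the bet size
```

So -0.2632 for a $5 bet is not a contradiction: it is exactly five times the -0.0526 expected per dollar.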
Isn't there an error in the example? Should it not be (-$1 X 37/38) + (+$36 X 1/38) and not (-$1 X 37/38) + (+$35 X 1/38), since you get your $1 back?
[edit] changed
I changed the section "Nonnegative variables", which consisted of a representation formula for the expected value of nonnegative random variables, to a subsection called "representation", in which I give a formula for the general moment of a random variable. Moreover, I removed (in this subsection) the distinction between continuous and discrete random variables, since the formula holds without distinction. gala.martin
[edit] Roman vs. blackboard bold
Is there a reason the article switches from using roman E to blackboard-bold 𝔼 halfway through, or shall I change them all to roman E's for consistency? TheObtuseAngleOfDoom 21:19, 11 December 2005 (UTC)
- No reason that I know of. PAR 22:30, 11 December 2005 (UTC)
No reason that I know. I would prefer to change them all to one form, as usual in math literature. gala.martin
- Be bold! It's better to have a single form in the article. --Mgreenbe 22:47, 11 December 2005 (UTC)
- I would like EX rather than a rendered image, as the former is more bearable inline, where it does not need to be a PNG picture but can be plain text. Wonder what others prefer. Oleg Alexandrov (talk) 00:32, 12 December 2005 (UTC)
I've gone ahead and been bold, as suggested, switching them all to roman. I also switched the remaining notation to roman as well. TheObtuseAngleOfDoom 14:53, 12 December 2005 (UTC)
- Thanks! Oleg Alexandrov (talk) 17:48, 12 December 2005 (UTC)
[edit] "Fair game" - Expected Value = 0?
I've always thought that a "fair game" is one in which the expected value is 0 - over many repetitions the player stands to neither gain nor lose anything. I don't quite understand the "half stake" that's in there right now (end of intro paragraph). I'm planning on changing it back to the definition that I had put down, but maybe it's just something that I don't know about expected values, so I wanted to make sure. -Tejastheory 17:58, 26 December 2005 (UTC)
- Yes, the "stake" additions are wrong. The previous wording was not wonderful either, though. In a simple 2-person game, both players pay a "stake" into a pool, then one of them wins the pool. If the game is fair, then the expected income is half the total stake (not half of one player's stake as it says now). That "half" is only for 2-player games. The expected profit (income minus expenditure) is 0, which is true for fair games with any number of players. We should describe it in terms of profit, without using gambling words like "stake", as that is more general and easier to understand. --Zero 22:41, 26 December 2005 (UTC)
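Zero's formulation can be put as a small check: a game is fair when every player's expected profit (expected share of the pool minus stake) is zero, whatever the number of players. A sketch under that definition, with made-up stakes:

```python
def expected_profits(stakes, win_probs):
    """Expected profit per player: p_i * pool - stake_i."""
    pool = sum(stakes)
    return [p * pool - s for p, s in zip(win_probs, stakes)]

# Two players, equal stakes, fair coin: each player's expected income is
# half the total stake ($1 of the $2 pool), so expected profit is 0.
two_player = expected_profits([1, 1], [0.5, 0.5])

# Three players: the game is fair exactly when each win probability is
# proportional to the corresponding stake (p_i = stake_i / pool).
three_player = expected_profits([1, 2, 3], [1 / 6, 2 / 6, 3 / 6])
```

The "half" in the two-player case is thus a special case of the general zero-profit condition, which is why stating fairness in terms of profit is cleaner.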
[edit] "Properties the Expected Value has not"
We cite some properties the expected value 'has not' (functional non-invariance and non-multiplicativity). It is not meaningful to list the properties a mathematical object does not have; otherwise, we should write too many... I think it would be better to remove these properties, or to move them to the bottom of the list of properties. This concerns in particular the "functional non-invariance".
I changed the order of the list of properties, as explained above. Gala.martin 18:36, 28 January 2006 (UTC)
[edit] Question over notion of "fair game"
The article strikes me as okay - except for end of the 2nd paragraph that goes: In gambling or betting, a game or situation in which the expected value for the player is zero (no net gain nor loss) is called a "fair game."
While this seems to be convention (I have several references stating similar) the notion is false.
To determine if a game is fair, the probability of events and the odds offered are insufficient. You also need to consider THE BETTING STRATEGY USED.
This can easily be seen in something I call the "fair bet paradox":
THE FAIR BET PARADOX: Imagine Alice and Bob start with $1000 each and both bet "heads" on an unbiased coin. A "fair bet", right? Well, let Alice bet just $1 per toss while Bob bets HALF HIS CURRENT FUNDS. Under this betting strategy, Alice's funds fluctuate around $1000 while Bob SWIFTLY GOES BROKE. True!
See the word doc "the fair bet paradox" downloadable from www.geocities.com/multigrals2000 for more info. The paradox is not a consequence of the gambler's fallacy or Bob's initial lack of adequate funds. You can offer Bob unlimited credit at 0% interest and he'd still go broke. Likewise if you raise the probability of "heads" to a bit above 0.6 (on which Alice would become rich). You can also solve for the betting strategy of betting a random fraction of your funds BUT THE GENERAL CASE SEEMS TO BE UNSOLVED (True?). Good Luck to anyone who solves it.
I'd like to edit the main page but don't feel confident to do so. If someone else does so, could you please leave the first two paragraphs as they are and perhaps add an explanatory bit below it (in brackets?) as I'd like to do something on the subject latter and would like to refer to the present material. Okay?
Yours, Daryl Williams (www.geocities.com/multigrals2000)
- It's not true that Bob will go broke in this game; there is a non-zero probability that Bob will break _the House_ and win whatever amount the House has (be it 1 million, 1 billion, or 1 googol). Unless Bob is betting against someone with infinite resources, like Cthulhu. Albmont 12:02, 9 March 2007 (UTC)
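Both claims can be illustrated numerically; the following sketch (assumed parameters: fair coin, 1000 tosses, 200 runs) simulates Alice's flat $1 bets against Bob's half-the-funds bets. Each individual bet has expected value zero, but Bob's wealth is a product of i.i.d. factors 1.5 or 0.5, and E[log factor] = log(0.75)/2 < 0, so his funds collapse toward zero in almost every run, even though they never hit exactly zero:

```python
import random
import statistics

def final_funds(bet_rule, tosses=1000, funds=1000.0, seed=0):
    """Bet on a fair coin for a fixed number of tosses; bet_rule maps funds -> bet."""
    rng = random.Random(seed)
    for _ in range(tosses):
        bet = bet_rule(funds)
        funds += bet if rng.random() < 0.5 else -bet
    return funds

alice = [final_funds(lambda f: 1.0, seed=s) for s in range(200)]   # flat $1 bets
bob = [final_funds(lambda f: f / 2, seed=s) for s in range(200)]   # half current funds

# Alice's median stays near $1000; Bob's median is vanishingly small,
# though every value in bob is still strictly positive.
print(statistics.median(alice), statistics.median(bob))
```

This is the standard expected-value vs. expected-log-growth distinction (the idea behind Kelly-style betting): E[funds] is constant for both players, but that average is carried by ever-rarer lucky paths for Bob.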
[edit] E[f(X)]
Is there a general table giving E[f(X)] for varying functions f and with conditions for X? For example, I know that there is a closed formula for E[exp(X)] whenever X is normal, I could find it under log-normal distribution, but, if I didn't know it, I would be completely lost trying to find it. Albmont 10:02, 19 December 2006 (UTC)
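For the normal case mentioned above, the closed form is E[exp(X)] = exp(μ + σ²/2) (the mean of the corresponding log-normal); a Monte Carlo sanity check with arbitrary parameters:

```python
import math
import random

mu, sigma = 0.3, 0.8
closed = math.exp(mu + sigma ** 2 / 2)   # E[e^X] for X ~ N(mu, sigma^2)

# Monte Carlo estimate of the same expectation.
rng = random.Random(0)
n = 200_000
mc = sum(math.exp(rng.gauss(mu, sigma)) for _ in range(n)) / n

print(closed, mc)  # the two agree to within sampling error
```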
[edit] Assumption missing?
In the "Iterated expectation for discrete random variables" section, isn't the assumption E[X] = E[E[X | Y]] true only if X and Y are defined over the same probability space?
It says so in the article about the law of total expectation.
Helder Ribeiro 20:11, 2 January 2007 (UTC)
- No. If you sum up the total probability of every event, you have to get 1. Something is going to happen. If the sum of all the events is only 0.9, then there is a 10% chance that Y takes no value at all? That doesn't make sense. Therefore, no matter what spaces things are defined on, the total probability summed over the whole space is always 1. I think that other article is in error. There has to be some joint probability distribution, but X and Y can take values in completely different spaces. - grubber 16:27, 9 March 2007 (UTC)
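grubber's point, that X and Y may take values in completely different spaces as long as a joint distribution exists, can be illustrated with a toy joint pmf (values assumed for the example), using exact fractions:

```python
from fractions import Fraction as F

# Joint pmf with Y in {'a', 'b'} (strings) and X in {0, 1, 2} (numbers).
joint = {('a', 0): F(1, 6), ('a', 1): F(1, 6),
         ('b', 1): F(1, 3), ('b', 2): F(1, 3)}
assert sum(joint.values()) == 1          # total probability over the whole space

# Direct expectation of X.
e_direct = sum(p * x for (_y, x), p in joint.items())

# Marginal of Y, then E[X | Y = y], then the iterated expectation E[E[X|Y]].
p_y = {}
for (y, _x), p in joint.items():
    p_y[y] = p_y.get(y, F(0)) + p

def e_x_given(y):
    return sum(p * x for (yy, x), p in joint.items() if yy == y) / p_y[y]

e_iterated = sum(p_y[y] * e_x_given(y) for y in p_y)
```

Both computations give 7/6: the law of total expectation only needs the joint distribution to exist, not for X and Y to share a value space.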
[edit] How is the expected value different from the arithmetic mean?
This page says that another term for "expected value" is "mean". I find that dubious - especially since the page mean says that the expected value is sometimes called the "population mean" - which I also find to be dubious. If the expected value is the same thing as a mean, then the pages should be merged. If not, this page should explain the difference. Fresheneesz 01:09, 15 February 2007 (UTC)
- Expected value and mean are not the same thing. Means are defined on sets, for example the "arithmetic mean of a set of numbers". Expected values are used in stochastic settings, where you take the expected value of a random variable; there is some underlying probability distribution involved in expected values. I'm not familiar with "population mean", but I have a hard time believing that it would be more than just a special case of expected value. You really do need an r.v. in order to take expected values. - grubber 16:20, 9 March 2007 (UTC)
[edit] For two stochastic variables X and Y.
Discrete: E[g(X, Y)] = Σ_x Σ_y g(x, y) P(X = x, Y = y)
Continuous: E[g(X, Y)] = ∫∫ g(x, y) f(x, y) dx dy
90.227.190.26 23:19, 5 April 2007 (UTC)
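In the discrete case the expectation of a function of two variables is just a sum over the joint pmf; a minimal sketch with an assumed joint distribution (two independent fair coins):

```python
# E[g(X, Y)] = sum over (x, y) of g(x, y) * P(X = x, Y = y)
pmf = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}

def expect(g):
    return sum(p * g(x, y) for (x, y), p in pmf.items())

e_sum = expect(lambda x, y: x + y)    # E[X + Y] = 1.0
e_prod = expect(lambda x, y: x * y)   # E[XY] = 0.25 = E[X]E[Y], by independence here
```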