Tom Kando
Here is a baffling mathematical problem which I have pondered for years:
The "Gambler’s Fallacy” and the “Law of Large Numbers" seem to contradict each other:
Consider two statements: (1) “Previous outcomes do not affect the probabilities of the next (similar) event.” Take coin tosses, for example: each toss has a fifty-fifty probability of coming up heads or tails, right?
(2) The larger the number of coin tosses, the more likely you are to approach a fifty-fifty distribution of heads and tails, right?
Statement #2 implies that if you have just tossed a coin twelve times, and ALL twelve of those tosses have come up heads (a probability of 1 in 4,096), then as you proceed to toss the coin for the thirteenth time, you expect it to come up tails, and you bet accordingly, as some gamblers sometimes do.
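Both the arithmetic and the independence claim are easy to check by simulation. Here is a minimal Python sketch (the number of trials is chosen arbitrarily) that computes the 1-in-4,096 figure and then, among simulated runs that happen to open with twelve heads, counts how often the thirteenth toss comes up heads:

```python
import random

# Probability of twelve heads in a row with a fair coin: (1/2)^12 = 1/4096
p_twelve_heads = 0.5 ** 12
print(f"P(12 heads in a row) = 1 in {round(1 / p_twelve_heads)}")

# Simulate many 13-toss runs, keep only those that open with twelve heads,
# and check how often the thirteenth toss is heads in those runs.
random.seed(1)
trials = 2_000_000            # arbitrary; enough to catch a few hundred streaks
streaks = 0
thirteenth_heads = 0
for _ in range(trials):
    if all(random.random() < 0.5 for _ in range(12)):   # twelve heads in a row
        streaks += 1
        if random.random() < 0.5:                        # the thirteenth toss
            thirteenth_heads += 1

print(f"Runs that opened with twelve heads: {streaks}")
print(f"Fraction of their thirteenth tosses that were heads: "
      f"{thirteenth_heads / streaks:.3f}")
```

Among the runs that begin with a twelve-head streak, roughly half of the thirteenth tosses still come up heads: the simulated coin shows no memory of the streak.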
But actually, the smart gambler might be better off betting on heads because, given the outcome of the first twelve tosses, there is a chance that the coin was tampered with and is loaded towards heads.
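That intuition can be made precise with Bayes’ rule. The sketch below is purely hypothetical: it assumes a 1% prior chance that the coin is loaded to land heads 90% of the time (both numbers are made up for illustration) and updates that prior after twelve observed heads:

```python
# Hypothetical Bayesian sketch: the prior (1%) and the loaded coin's bias (0.9)
# are made-up numbers chosen purely for illustration.
prior_loaded = 0.01          # assumed prior probability that the coin is loaded
p_heads_loaded = 0.9         # assumed heads probability for a loaded coin
p_heads_fair = 0.5

# Likelihood of observing twelve heads in a row under each hypothesis
like_loaded = p_heads_loaded ** 12
like_fair = p_heads_fair ** 12

# Posterior probability that the coin is loaded (Bayes' rule)
posterior_loaded = (like_loaded * prior_loaded) / (
    like_loaded * prior_loaded + like_fair * (1 - prior_loaded)
)

# Probability that the thirteenth toss is heads, averaging over both hypotheses
p_next_heads = posterior_loaded * p_heads_loaded + (1 - posterior_loaded) * p_heads_fair

print(f"Posterior probability that the coin is loaded: {posterior_loaded:.3f}")
print(f"Probability that the 13th toss is heads:       {p_next_heads:.3f}")
```

With those made-up numbers, twelve straight heads push the posterior probability of a loaded coin above 90%, and the probability that the thirteenth toss is heads well above one half, which is exactly why the “smart gambler” above might lean towards heads.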
Statisticians try to resolve the apparent contradiction between the two statements above by invoking the “law of large numbers.”
In probability theory, the law of large numbers (LLN) is a mathematical theorem which states that the average of the results obtained from a large number of independent and identical random samples converges to the true value, if it exists.
More formally, the LLN states that given a sample of independent and identically distributed values, the sample mean converges to the true mean (Wikipedia).
In other words, the “true” long-run proportion of coin tosses coming up heads is 50%, as it is for tails. The larger the number of coin tosses, the closer the distribution of outcomes will come to fifty-fifty.
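A short simulation illustrates that convergence. The sketch below (sample sizes chosen arbitrarily) prints the running proportion of heads as the number of fair tosses grows:

```python
import random

random.seed(42)

# Running proportion of heads after increasingly many fair tosses.
checkpoints = [10, 100, 1_000, 10_000, 100_000, 1_000_000]   # arbitrary sizes
heads = 0
tosses = 0
for n in checkpoints:
    while tosses < n:
        heads += random.random() < 0.5
        tosses += 1
    print(f"after {n:>9,} tosses: proportion of heads = {heads / n:.4f}")
```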
But this doesn’t help me very much in my desire to reconcile the LLN with the outcome of each individual coin toss.
An interesting but inconclusive exchange on this: Why don’t previous events affect the probability of (say) a coin showing tails?
I am talking here about two independent realities:
1. There is the reality of me tossing a coin ONE single time: every time I do this, the probabilities of the toss coming up heads or tails are equal, namely 50%. A coin has no memory. Its behavior cannot be influenced by previous events.
2. And then there is the law of large numbers: The more often I toss the coin, the more likely it is that the number of heads and the number of tails will each approach roughly half of the total number of tosses.
#1 and #2, above, are independent of each other. They are governed by different rules, and they apply to different events. One is a one-time event. The other is about a large number of events. Within that SERIES of events, each toss of the coin is independent, but the prediction is about the collective outcome of, say, a hundred, a million, or even an infinite number of tosses, in which case the ratio of heads to tails will indeed be 1 (50/50).
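The collective prediction can also be stated quantitatively: for n independent fair tosses, the proportion of heads has mean 1/2 and standard deviation 1/(2√n), so the spread around fifty-fifty shrinks as the series gets longer. The sketch below (series lengths and repetition counts chosen arbitrarily) estimates that spread by repeating whole series of tosses:

```python
import random
import statistics

random.seed(0)

def heads_fraction(n_tosses: int) -> float:
    """Proportion of heads in n_tosses independent fair tosses."""
    return sum(random.random() < 0.5 for _ in range(n_tosses)) / n_tosses

# Repeat whole series of tosses and see how much the heads fraction varies
# from series to series: longer series cluster more tightly around 0.5.
for n in (100, 10_000, 100_000):            # arbitrary series lengths
    fractions = [heads_fraction(n) for _ in range(50)]
    spread = statistics.pstdev(fractions)
    print(f"series length {n:>9,}: observed spread of heads fraction ≈ {spread:.4f}"
          f"  (theory 1/(2*sqrt(n)) = {0.5 / n ** 0.5:.4f})")
```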
Does the LLN govern the outcome of coin tosses?
Yes, but only when many tosses occur. The larger that number is, the more the outcome conforms to the law. But for any individual coin toss, the law has no applicability. That’s why it is called the law of large numbers. My desire to reconcile the independence of single tosses with the law of large numbers cannot be satisfied because the law does not apply to single individual events.
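A toy continuation of the twelve-heads example (continuation lengths chosen arbitrarily) illustrates how both points can hold at once: as more fair tosses are added, the overall heads fraction drifts towards 50%, not because tails become more likely on any individual toss, but because a surplus of twelve heads becomes negligible in a large total; the raw difference between heads and tails is never “corrected” and typically keeps wandering:

```python
import random

random.seed(7)

# Start from the streak discussed above: twelve heads, zero tails.
heads, tails = 12, 0

# Keep tossing a fair coin; no toss "remembers" the streak.
for extra in (1_000, 100_000, 1_000_000):    # arbitrary continuation lengths
    while heads + tails < 12 + extra:
        if random.random() < 0.5:
            heads += 1
        else:
            tails += 1
    total = heads + tails
    print(f"after {extra:>9,} further tosses: heads fraction = {heads / total:.4f}, "
          f"heads minus tails = {heads - tails:+,d}")
```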
I appeal to mathematicians and statisticians to correct, amplify or otherwise comment on my analysis.