This morning I saw a tweet mentioning an article with the exciting title:

Counterintuitive problem: Everyone in a room keeps giving dollars to random others. You’ll never guess what happens next

So naturally, I was quite excited and wanted to see what happens next! The problem under consideration is:

Imagine a room full of 100 people with 100 dollars each. With every tick of the clock, every person with money gives a dollar to one randomly chosen other person. As time progresses, how will the money be distributed?

Dan Goldstein, the writer of the article, argues that many people believe the money ends up more or less evenly distributed. Sometimes you gain some, sometimes you lose some, that’s the idea.

It turns out this is not really the case (or actually, really not). To show this, Dan Goldstein made a simulation in R and a movie of the results. What you see is the amount of money each person has in each round. The bottom figure is the interesting one: individuals (on the x-axis) are sorted by the amount of money they possess, so you can actually see the distribution of the money as time passes.

The simulation made by Dan is an exact simulation: in each round, each player with money picks at random the player to give their dollar to. Then all the money is counted and the next round can begin. I was wondering if I could come up with a simulation based on probability and statistics.
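To make the setup concrete, here is a minimal sketch of one such exact round (my own Python, not Dan's R code; the `play_round` helper and the seed are my choices):

import numpy as np

def play_round(bank, rng):
    # One exact round: every player holding money gives one dollar
    # to a uniformly chosen other player.
    m = len(bank)
    givers = np.flatnonzero(bank > 0)   # only players with money give
    gains = np.zeros(m, dtype=int)
    for giver in givers:
        receiver = rng.integers(m - 1)  # uniform over the m - 1 other players
        if receiver >= giver:           # shift the index to skip the giver themselves
            receiver += 1
        gains[receiver] += 1
    bank[givers] -= 1                   # every giver loses their dollar
    return bank + gains

rng = np.random.default_rng(seed=0)
bank = np.full(100, 100)                # 100 people with 100 dollars each
for _ in range(5000):
    bank = play_round(bank, rng)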

Here is my train of thought: consider a single player in a game with $m$ players. In each round, each of the other $m - 1$ players gives a dollar to one of their $m - 1$ fellow players, chosen uniformly at random. So our player receives each of those dollars with probability $p = \frac{1}{m - 1}$. The number of dollars $k$ a single player receives in a round therefore follows a binomial distribution with $n = m - 1$ trials:

\begin{align} \Pr(X = k) = {n \choose k}p^k(1-p)^{n-k} \end{align}

Now we can draw samples from this distribution, one for each player of the game in a single round. Each sample denotes the amount of money that player receives in that round. This amount should naturally equal the amount that is spent each round. However, because the amount that is received is now stochastic (sampled from a binomial distribution), there is no guarantee that this is the case.

Nonetheless, we know the mean of the binomial distribution equals $n \cdot p = (m - 1) \cdot \frac{1}{m - 1} = 1$. So each player is expected to gain a single dollar each round. Because this is also the amount each person gives away, the expected total amount of money in the game remains constant. For now, I am quite okay with this assumption.
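A quick numerical check of that expectation (the one million samples are an arbitrary choice of mine):

import numpy as np

m = 100
samples = np.random.binomial(m - 1, 1 / (m - 1), size=1_000_000)
print(samples.mean())   # close to n * p = 1 dollar received per round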

With the binomial distribution we can generate the following (incomplete) pay-off table. Each player loses the one dollar they give away and gains whatever the binomial sample assigns them:

Money lost   Money gained   Net money   Probability
1            0              -1          36.6%
1            1               0          37.0%
1            2               1          18.5%
1            3               2           6.1%
1            4               3           1.5%
1            5               4           0.3%
..           ..              ..          ..
1            99              98          0.0%
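These probabilities can be reproduced directly from the binomial pmf; a quick sketch, assuming scipy is available:

from scipy.stats import binom

n, p = 99, 1 / 99               # 99 potential givers, each dollar arrives with p = 1/99
for gained in range(6):
    # net money = dollars gained minus the one dollar given away
    print(f"gained {gained}, net {gained - 1}: {binom.pmf(gained, n, p):.1%}")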

The following video shows a single run of 50 people distributing money for around 5000 rounds; they start with 50 dollars each.

Now the thing is that, in hindsight admittedly, I don’t think the result is actually so counterintuitive. To see why, we will make a detour through coin flipping. Consider a fair coin, so that the chance of getting heads on a single flip is $\frac{1}{2}$. Now, if I flip the coin 10 times, I expect to see 5 ($= 10 \cdot \frac{1}{2}$) heads. Let’s do this a couple of times and note the number of heads we see after 10 flips:

4, 6, 5, 4, 4, 5, 5, 9 (!!)

And there it is: although we very often see values close to the expected value (5), if we repeat this experiment long enough we run into an example where we see 9 heads! That’s a lot! When this happens in real life, people often shout at each other: “what a coincidence!!!”. For example, when they run into somebody they know while on vacation in a distant country.
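If you want to see how often such freak runs occur, here is a quick sketch of my own (the 10,000 repetitions are an arbitrary choice):

import numpy as np

rng = np.random.default_rng()
heads = rng.binomial(10, 0.5, size=10_000)  # 10,000 repetitions of 10 fair flips
print(np.bincount(heads, minlength=11))     # how often we saw 0, 1, ..., 10 heads
print((heads >= 9).mean())                  # about 1% of runs show 9 or 10 heads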

If you think about it for a minute, though, it is not such a big coincidence at all. It’s just the result of repeating the same experiment over and over. Most of the time you will end up with the thing you expect, but sometimes you get a totally different result. Now, to return to the topic of this post: most of the players will stay around the amount of money they initially start with. In some rounds they will gain some money, in other rounds they will lose some. But there are also some lucky people who will be on a streak for 10 or even 20 rounds and gather a fair amount of money over this time. For the same reasons there will also be players with a losing streak. This will not happen to many people, but remember we are playing the game with 100 people. And the chance that this happens to at least one of the 100 players is actually quite large.

Stated in other words: the chance of a winning streak happening to one specific player is very small. But when we have 100 players, each with a small chance of going on a winning (or losing) streak, we definitely expect at least one of them to actually succeed.
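To put a number on this (the 1% is made up, purely for illustration): if a single player has a 1% chance of going on such a streak, then among 100 independent players the chance that at least one of them does is

\begin{align} 1 - (1 - 0.01)^{100} \approx 1 - 0.366 \approx 63\% \end{align}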

Even though individual chances are very small, simply because we repeat the experiment so often we expect some freak results.

Edit based on a question from Dan Goldstein

This model works fine as long as all players have at least 1 dollar. If this is not the case, then the people who don’t have money cannot participate in giving money (naturally, they can still receive money). So far, the model does not take this into account. Initially, I thought the model could easily be extended to cover this case. When we denote the number of people without money in round $i$ by $l_{(i)}$, then we set the parameters for the binomial distribution in round $i$ to:

\begin{align} n_{(i)} &= m - 1 - l_{(i)} \\
p_{(i)} &= \frac{1}{m - 1} \end{align}

The probability that a player receives payment from another one thus remains the same, but the number of people distributing money is reduced by the number of people without money. Seems easy, seems logical. However, the total amount to be spent in each round ($m - l$) no longer equals the expected total amount to be received ($m \cdot n \cdot p$), as can be seen from:

\begin{align} m \cdot n \cdot p &\stackrel{?}{=} m - l \\
m \cdot (m - 1 - l) \cdot \frac{1}{m-1} &\stackrel{?}{=} m - l \\
m^2 - m - ml &\stackrel{?}{=} (m - l)(m - 1) \\
m^2 - m - ml &\stackrel{?}{=} m^2 - m - ml + l \\
0 &\stackrel{?}{=} l \end{align}

This does not hold whenever $l > 0$. Thinking a little more about the problem, one can see that we actually have two different regimes, depending on whether people have money or not. For the people with money, the rules above for $n_{(i)}$ and $p_{(i)}$ are correct. But for people without money, the number of people who can give them money becomes $n_{(i)} = m - 1 - (l_{(i)} - 1) = m - l_{(i)}$: all the other people, minus the people who don’t have any money except me. Let’s do the math to see if this checks out:

\begin{align} m \cdot n \cdot p &\stackrel{?}{=} m - l \\
(m-l) \cdot \frac{m-1-l}{m-1} + l \cdot \frac{m-l}{m-1} &\stackrel{?}{=} m - l \\
\frac{m^2 - lm - m + l}{m-1} &\stackrel{?}{=} m - l \\
\frac{(m-1)(m-l)}{m-1} &\stackrel{?}{=} m - l \\
m - l &= m - l \end{align}

And it does! This makes the updating of the bank in each round a little more complicated, but not a lot! And now the simulation can also deal with people going broke! I like!
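As a quick sanity check of this bookkeeping (a sketch of my own; exact arithmetic via fractions avoids rounding noise):

from fractions import Fraction

# With p = 1/(m - 1), the m - l players with money receive from
# n = m - 1 - l givers and the l broke players receive from n = m - l
# givers. The expected total received should exactly equal the total
# spent, m - l.
for m in (10, 50, 100):
    for l in (0, 1, 7):
        p = Fraction(1, m - 1)
        expected_received = (m - l) * (m - 1 - l) * p + l * (m - l) * p
        assert expected_received == m - l
print("expected receipts equal total spending for all tested m, l")

With that settled, here is the full simulation, including the two regimes: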

import numpy as np
import matplotlib.pyplot as plt
from matplotlib import animation

amount_of_people = 100

n = amount_of_people - 1  # a player can receive from each of the m - 1 others
p = 1 / n                 # probability that any particular dollar comes their way
start_money = 100

bank = start_money*np.ones([amount_of_people])

x = np.arange(amount_of_people)  # bar positions; avoid shadowing the built-in range

fig = plt.figure()
ax = fig.add_subplot(1,1,1)

bar = plt.bar(x, bank)

def init():
    return bar

def animate(i):
    global bank
    people_without_money = np.sum(bank == 0)
    # Regime 1: players with money receive from the m - 1 - l other players with money.
    smpl_w_money = np.random.binomial(amount_of_people - 1 - people_without_money, p, amount_of_people)
    # Regime 2: broke players receive from all m - l players with money.
    smpl_wo_money = np.random.binomial(amount_of_people - people_without_money, p, amount_of_people)
    smpl_w_money[bank == 0] = 0   # keep only the sample matching each player's regime
    smpl_wo_money[bank > 0] = 0
    sample = smpl_w_money + smpl_wo_money
    # Everyone receives their sampled amount; only players with money give away a dollar.
    bank = bank + sample - 1*(bank > 0)
    bank = np.sort(bank)
    for rect, y in zip(bar, bank):
        rect.set_height(y)
    ax.set_ylim(0, bank.max())
    ax.set_title(i)
    # print(np.sum(bank) / amount_of_people)  # sanity check: average stays near start_money

anim = animation.FuncAnimation(fig, animate, init_func=init,
                               frames=1500, interval=1, blit=False)
plt.show()