wu :: forums (http://www.ocf.berkeley.edu/~wwu/cgi-bin/yabb/YaBB.cgi)
riddles >> easy >> Infinite Balls Urn
(Message started by: kiochi on Jan 4th, 2007, 1:15pm)

Title: Infinite Balls Urn
Post by kiochi on Jan 4th, 2007, 1:15pm
A friend told me this one. Maybe it's too easy or uninteresting to be a good riddle, but here goes anyway.

So suppose Mindy has an infinite number of (numbered) balls and an urn that can fit as many balls as necessary. We also assume that Mindy has the ability to do things "really fast."

At 1 minute until midnight, Mindy puts balls 1-10 into the urn, then immediately takes out ball 1 (i.e. at 1 minute till midnight, balls 2-10 are in the urn).

Next, at 1/2 minute till midnight, Mindy puts balls 11-20 into the urn and takes out ball 11. At 1/3 minute till midnight, Mindy puts in balls 21-30 and takes out ball 21.

The process continues, with the nth iteration occurring at 1/n minutes till midnight. How many balls are in the urn at midnight?

The next day, Mindy decides to mix it up. She starts out again with an empty urn and a bunch of balls. At 1 minute till midnight, Mindy puts in balls 1-10, and takes out ball 1. At 1/2 minute till midnight she puts in balls 11-20 and takes out ball 2. At 1/3 minute till midnight she puts in balls 21-30 and takes out ball 3, and so on.

How many balls left at midnight?

Hint: Mindy got a different answer this second time.

Mindy, confused by the new answer, wanted to try a third experiment.

She bought completely new balls with no numbers or identifying characteristics. At 1 minute till midnight she puts in ten of these balls and takes out 1 of them at random. At 1/2 minute till midnight, she puts in 10 more balls and takes out another one at random, and so on. How many did she end up with this last time, or can you say? Will she get the same number each time?
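Since the whole puzzle turns on which balls remain, here is a finite-horizon sketch of the first two procedures (my own illustration, not part of the original riddle), with n steps standing in for the limit at midnight:

```python
def scenario1(n):
    """Step k adds balls 10k-9 .. 10k and removes ball 10k-9 (1, 11, 21, ...)."""
    urn = set()
    for k in range(1, n + 1):
        urn.update(range(10 * k - 9, 10 * k + 1))
        urn.remove(10 * k - 9)
    return urn

def scenario2(n):
    """Step k adds balls 10k-9 .. 10k and removes ball k."""
    urn = set()
    for k in range(1, n + 1):
        urn.update(range(10 * k - 9, 10 * k + 1))
        urn.remove(k)
    return urn

# Both urns hold 9n balls after any finite number of steps...
print(len(scenario1(100)), len(scenario2(100)))    # 900 900
# ...but in scenario 2 every ball up to n is already gone, so each fixed
# ball eventually leaves; in scenario 1, ball 2 is never removed.
print(min(scenario2(100)))    # 101
print(2 in scenario1(100))    # True
```

The counts agree at every finite step; the difference only shows up in which balls are still present, which is exactly the point of the riddle.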


Title: Re: Infinite Balls Urn
Post by towr on Jan 4th, 2007, 1:52pm
A great riddle to teach someone the perils of infinity...
(Although, unfortunately, in other places many people never seem to get it, and just cop out by saying it isn't physically possible.)

Title: Re: Infinite Balls Urn
Post by THUDandBLUNDER on Jan 4th, 2007, 2:40pm
http://www.ocf.berkeley.edu/~wwu/cgi-bin/yabb/YaBB.cgi?board=riddles_hard;action=display;num=1117490596

Title: Re: Infinite Balls Urn
Post by honkyboy on Jan 4th, 2007, 3:18pm
I don't understand the difference between the three scenarios.   ???
Each time she effectively puts in nine balls.  Why wouldn't the number of balls in the bin always be 9n ?

**Edit -- For each of your questions, how many balls would you say she removed from the urn?

Title: Re: Infinite Balls Urn
Post by towr on Jan 4th, 2007, 3:29pm

on 01/04/07 at 15:18:14, honkyboy wrote:
I don't understand the difference between the three scenarios.   ???

A hint [hide]Identify which (numbered) balls are left in the bucket.[/hide]

Title: Re: Infinite Balls Urn
Post by honkyboy on Jan 4th, 2007, 4:08pm
Ok. Here's how I'm looking at it.
[hide]1st scenario: Removed from the urn are all numbers which end in 1. Therefore remaining in the urn are all numbers except those that end in 1.
 
2nd scenario: Removed from the urn is the bottom tenth of all numbers, i.e. all numbers < infinity/10. Remaining in the urn are all numbers > infinity/10.

3rd scenario: Well, she adds an infinite number of balls and randomly takes away 10% of them.

It seems to me that she will always have an infinite number of balls inside and outside the urn when she is done. What am I missing here?
One thing for sure is that Mindy has got a lot of balls!
[/hide]

Title: Re: Infinite Balls Urn
Post by THUDandBLUNDER on Jan 4th, 2007, 4:29pm
But infinity/10 = infinity
Infinite numbers do not obey the same algebraic laws as finite numbers.

Title: Re: Infinite Balls Urn
Post by honkyboy on Jan 4th, 2007, 5:06pm

on 01/04/07 at 16:29:13, THUDandBLUNDER wrote:
But infinity/10 = infinity
Infinite numbers do not obey the same algebraic laws as finite numbers.
I see.  I wrote down what number infinity/10 is on a bar napkin, I just can't find it right now.  ;)


A graph of the number of balls in the urn as a function of time is a steep curve rising along the asymptote t = midnight.  With no numbers on the balls, this curve meets the asymptote at n = infinity.

Where does Mindy's progress deviate from this curve using numbered balls?  


Title: Re: Infinite Balls Urn
Post by THUDandBLUNDER on Jan 4th, 2007, 5:27pm

on 01/04/07 at 17:06:26, honkyboy wrote:
Where does Mindy's progress deviate from this curve using numbered balls?  

At infinity.   :)

There is a discontinuity.

lim(x -> a) f(x) = b does NOT imply f(a) = b

Title: Re: Infinite Balls Urn
Post by honkyboy on Jan 4th, 2007, 5:42pm
Thank you.  

Title: Re: Infinite Balls Urn
Post by THUDandBLUNDER on Jan 4th, 2007, 5:56pm
You are welcome.

There is also a long discussion here (http://www.ocf.berkeley.edu/~wwu/cgi-bin/yabb/YaBB.cgi?board=riddles_hard;action=display;num=1049309591;start=0).

Title: Re: Infinite Balls Urn
Post by towr on Jan 5th, 2007, 12:33am
In scenario two, it's important to see that a ball with number n is removed 1/n minutes before midnight. And since all balls are numbered, there can't be any left.
Any identifiable ball is removed, and all balls were identifiable. (In contrast to the first scenario, where you can identify an infinite number of balls that must still be in the bucket; and the third scenario, where balls aren't identifiable at all.)

Title: Re: Infinite Balls Urn
Post by Grimbal on Jan 5th, 2007, 6:52am
In every case, the number of balls is
   infinity - infinity = undefined.

Title: Re: Infinite Balls Urn
Post by kiochi on Jan 5th, 2007, 11:09am

on 01/05/07 at 06:52:23, Grimbal wrote:
In every case, the number of balls is
   infinity - infinity = undefined.



I think that it's only "undefined" in the third case. As towr has pointed out, in the first two scenarios we can identify exactly which balls remain and which are removed.

In the third case I guess it's reasonable to say that it's "undefined," but, following the reasoning from the first two, I think it makes sense to say that there could be any number of balls from 0 to infinity left in the urn.

Title: Re: Infinite Balls Urn
Post by Icarus on Jan 5th, 2007, 6:50pm

on 01/05/07 at 06:52:23, Grimbal wrote:
In every case, the number of balls is
   infinity - infinity = undefined.


infinity - infinity is not undefined because you can't ever get a value. It is undefined because the value that you get differs from situation to situation. You have to determine the result by a logical examination of the situation, rather than relying on a simple subtraction to get your answer.

The logical examinations reveal that in the first case, the number of balls remaining is infinity. In the second case, it is 0. And in the third case, it really is indeterminate from the information given.

Title: Re: Infinite Balls Urn
Post by Grimbal on Jan 6th, 2007, 8:55am
Now that you say it, I think "indeterminate" is better than "undefined".

What I mean is that from the fact that you add an infinity of balls and you remove an infinity, you cannot tell anything.  That is why I say "infinity - infinity = indeterminate".  And that is what the 3 cases have in common.  But by numbering the balls, you can sometimes determine which balls (and therefore how many) remain in the urn.

This explains the paradox that by doing essentially the same thing, you get different results.


You also can do something worse.
You add balls #0-9, 10-19, 20-29, etc. and remove balls #0, 1, 2, etc.  Move 1 is "add #0-9, remove #0".  Move 2 is "add #10-19, remove #1", etc.  Let's call that the base sequence of moves.
Then you interleave multiple copies of this sequence.  Every 100 moves, at move n·100, you add in move n from the base sequence.  Every 10000 moves, at move n·10000, you add in move n·100 but also move n.  And in general, at every move n·100^k, you add in the moves n·100^i for i = k-1, ..., 0.
That means that every ball gets added and removed an infinite number of times.  In that case, even if all moves are precisely identified, it is impossible to tell whether a particular ball is in the urn after an infinite number of moves.

Title: Re: Infinite Balls Urn
Post by towr on Jan 7th, 2007, 7:11am
You can arrive at that problem a lot easier:
 At odd n add ball 1, at even n remove ball 1.
 How many are left after an infinite number of moves?

What I was wondering about case three, is if you did number the balls, but randomly selected one of the numbered balls for removal. Could you say anything about the probability that the bucket ends up empty or not?

Title: Re: Infinite Balls Urn
Post by Eigenray on Jan 7th, 2007, 2:31pm

on 01/07/07 at 07:11:34, towr wrote:
What I was wondering about case three, is if you did number the balls, but randomly selected one of the numbered balls for removal. Could you say anything about the probability that the bucket ends up empty or not?


It would [hide]be empty with probability 1[/hide].  But if you were to add something like n2 or 2n balls on the n-th step, it gets more complicated.

Title: Re: Infinite Balls Urn
Post by towr on Jan 7th, 2007, 2:49pm
I'm quite curious about the reasoning behind that.

And also, does the result transfer to case three as originally stated?
(Obviously to remove the first or last ball we have to be able to identify them; but to uniformly randomly remove a ball, we don't really need that, do we? Yet these problems are counterintuitive enough to make me doubt.)

Title: Re: Infinite Balls Urn
Post by THUDandBLUNDER on Jan 7th, 2007, 4:56pm

on 01/07/07 at 14:31:18, Eigenray wrote:
It would [hide]be empty with probability 1[/hide].

So it's not certain then?   ;)

Title: Re: Infinite Balls Urn
Post by Icarus on Jan 7th, 2007, 8:56pm

on 01/07/07 at 16:56:23, THUDandBLUNDER wrote:
So it's not certain then?   ;)


Nope. After all, if you pick a real number between 0 and 1 at random with uniform probability, the probability of getting an irrational number is also 1.

Title: Re: Infinite Balls Urn
Post by THUDandBLUNDER on Jan 8th, 2007, 7:16am

on 01/07/07 at 20:56:24, Icarus wrote:
Nope. After all, if you pick a real number between 0 and 1 at random with uniform probability, the probability of getting an irrational number is also 1.

Thanks; just as I expected.   :)

Title: Re: Infinite Balls Urn
Post by kiochi on Jan 8th, 2007, 11:58am

on 01/07/07 at 14:49:32, towr wrote:
I'm quite curious about the reasoning behind that.

And also, does the result transfer to case three as originally stated?
(Obviously to remove the first or last ball we have to be able to identify them; but to uniformly randomly remove a ball, we don't really need that, do we? Yet these problems are counterintuitive enough to make me doubt.)



I'm also very curious about the reasoning.

My intuition is that choosing numbered balls at random is equivalent to having blank balls. E.g. number the balls with magic ink that only an observer (we) can see, while Mindy thinks she has blank balls.


Title: Re: Infinite Balls Urn
Post by Icarus on Jan 8th, 2007, 5:16pm

on 01/08/07 at 07:16:00, THUDandBLUNDER wrote:
Thanks; just as I expected.   :)


I figured you were being facetious, but I wanted to make sure other readers who are less familiar with probability understood the difference between probability = 1 and certainty.

Title: Re: Infinite Balls Urn
Post by Eigenray on Jan 9th, 2007, 3:02pm

on 01/07/07 at 14:49:32, towr wrote:
I'm quite curious about the reasoning behind that.


Consider ball b, which gets added at stage k = ceiling(b/10).  What is the probability that this ball is never removed?

During stage k, there are 9k+1 balls to choose from, so the probability that ball b is not removed is 1 - 1/(9k+1).  And given that ball b is still there after stage s-1, the probability that it is not removed during stage s is 1 - 1/(9s+1).  So the probability that ball b is never removed is
p = [1 - 1/(9k+1)]*[1 - 1/(9k+10)]*[1 - 1/(9k+19)]*... = 0.

(It's a standard result that if {a_n} is a sequence of positive numbers such that [sum] a_n diverges, but [sum] a_n^2 converges, then [prod](1 - a_n) = 0, and this applies to a_n = 1/[9n + 9k - 8].)

Thus each ball gets removed with probability 1, and a countable intersection of probability-1 events also has probability 1.  So the urn is almost surely empty.
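To get a feel for this product, here is a quick numerical check (my own sketch, not from the original post): the partial products [1 - 1/(9k+1)]*...*[1 - 1/(9s+1)] do sink toward 0 as more stages are included, though very slowly.

```python
def survival(k, stages):
    """Partial product over stages s = k..stages of (1 - 1/(9s+1)):
    the probability that a ball added at stage k survives that many stages."""
    p = 1.0
    for s in range(k, stages + 1):
        p *= 1 - 1 / (9 * s + 1)
    return p

# The product decays roughly like stages**(-1/9): slowly, but to 0.
for stages in (10**2, 10**4, 10**6):
    print(stages, survival(1, stages))
```

The slow decay explains why the almost-sure emptiness feels counterintuitive: at any humanly simulable horizon, each ball still has a visible chance of being present.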


Of course, what I'm sweeping under the rug here is that this actually is a well-defined probability space.  This can be seen as follows.  Define probability measures p_n on N^n by p_n(b_1,...,b_n) = 1/[10*19*...*(9n+1)] if b_1,...,b_n are distinct and each b_k <= 10k; otherwise set p_n(b) = 0.  This is the probability that we pick ball b_k during stage k, for 1 <= k <= n.

These measures are consistent in that if E is a subset of N^n, then p_{n+1}(E x N) = p_n(E).  Thus by the Kolmogorov extension theorem, there is a unique probability measure P on N^N such that for any b_1,...,b_n in N,

P({(b_1,...,b_n)} x N^N) = p_n(b_1,...,b_n).

The domain of P is the sigma-algebra generated by the subsets of N^N with a given finite prefix, and these are all basic open sets for the product topology on N^N.  Fix a ball b, and let A be the event that b is never removed.  Let A_s be the event that ball b is not removed during the first s stages.  Then for s >= k = ceiling(b/10),

P(A_s) = [1 - 1/(9k+1)]*...*[1 - 1/(9s+1)],

since A_s is the disjoint union of [10*19*...*(9(k-1)+1)]*[(9k)*...*(9s)] basic open sets, each of the form {(b_1,...,b_s)} x N^N.

Now, A is the intersection of all the A_s, hence measurable, and since P(A_s) -> 0, we have P(A) = 0, and the conclusion follows as before.

Title: Re: Infinite Balls Urn
Post by WombatDeath on Jan 9th, 2007, 3:20pm

on 01/08/07 at 17:16:58, Icarus wrote:
I figured you were being facetious, but I wanted to make sure other readers who are less familiar with probability understood the difference between probability = 1 and certainty.

I hadn't come across that concept, so thank you for giving me something new to think about (and for making my brain bleed).  I'm going to take a stab at the meaning of your comment and hope for the best:

The probability of not picking ball 837 from an infinite number of balls is 1 (cf. the discussion about 1 = 0.999...).  Therefore the probability of picking ball 837 is zero.  However, you can't be certain that you won't pick it anyway through some bizarre quirk of fortune.  So not picking ball 837 isn't certain, even though it has probability 1.

And I suppose the corollary is that having a probability of zero doesn't necessarily make an event impossible.

Is that vaguely correct?

Title: Re: Infinite Balls Urn
Post by towr on Jan 9th, 2007, 3:34pm

on 01/09/07 at 15:20:13, WombatDeath wrote:
Is that vaguely correct?
Yep.

Obviously it's not just the case for 837; it holds (at the same time) for every number in an infinite set that picking it has probability 0 *).
But if you pick a number from that set, you have to end up with one. So you can't avoid getting a number that has zero probability.


*) Although it does depend on the probability distribution, of course.

Title: Re: Infinite Balls Urn
Post by Icarus on Jan 9th, 2007, 5:43pm
The example you gave doesn't quite work, because probability goes wonky when you try to make it uniform on unbounded sets, such as all the natural numbers. However, bounded infinite sets are better behaved:

Choose a real number at random from the interval [0,1]. This means that the probability that the selected number lies between a and b is |b-a|. For any given single number x in [0,1], the probability that x will be the number you selected is 0. But since this holds for all values, every time a number is selected, it happens despite that 0 probability.

Title: Re: Infinite Balls Urn
Post by Eigenray on Jan 9th, 2007, 5:55pm

on 01/09/07 at 15:20:13, WombatDeath wrote:
The probability of not picking ball 837 from an infinite number of balls is 1

That depends on how you propose to pick one ball from an infinite number of balls.  Once you go beyond the physically realizable situation of making finitely many choices from finitely many possibilities, you need to turn to the mathematically realizable situation of a [link=http://en.wikipedia.org/wiki/Probability_space]probability space[/link].

Suppose we have some set S of balls, and we wish to choose a ball X at random.  Mathematically, we model this as a probability space (S, F, P), where S is the set of balls, F is a sigma-algebra consisting of certain subsets of S, called events, and P : F-> [0,1] is a probability measure, where for a set E in F, P(E) is the probability that the ball X we chose lies in E.

If S is countable, say S = N = {1,2,3,...}, then we can specify P by picking any sequence of nonnegative numbers p_1, p_2, ... with [sum] p_k = 1.  We can set P({k}) = p_k, and by [link=http://en.wikipedia.org/wiki/Measure_%28mathematics%29#Definition]countable additivity[/link], this uniquely determines P(E) for any subset E of S.  So in this case, knowing the probability measure is equivalent to knowing the probability of picking any given ball.

But we can't pick a ball uniformly at random: we need [sum] p_k = 1, and this is impossible if the p_k are all equal, for then the sum would be either 0 or infinity.  In fact this condition forces p_k > 0 for at least one k, so ball 837 could very well have positive probability of being picked.
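As a small concrete instance of the countable case (my own example, not from the thread): the geometric weights p_k = 2^-k form a legitimate probability measure on N, and under it ball 837 has positive, if tiny, probability.

```python
from fractions import Fraction

def p(k):
    """Geometric weights p_k = 2**-k; over k = 1, 2, 3, ... they sum to 1."""
    return Fraction(1, 2**k)

# Partial sums approach 1 (the first 50 terms give 1 - 2**-50 exactly).
partial = sum(p(k) for k in range(1, 51))
print(float(partial))
print(p(837) > 0)    # True: ball 837 has positive probability here
```

This is exactly why "pick a ball from infinitely many" is underspecified: a non-uniform measure like this one is perfectly valid, and it does not give ball 837 probability zero.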

The situation is drastically different when S is uncountable.  For example, let S = [0,1], and suppose we want to pick a ball at random, i.e., we want to pick a random real number X between 0 and 1.  The standard thing to do is to "pick" X uniformly at random.  However, we are not actually picking X through any physical process, we are only defining a probability measure P : F -> [0,1].  That is, we know, for certain subsets E of S, the probability that X lies in the set E.

There are two ways in which this differs from the countable case: P is not uniquely determined by P({x}) for each element x of S, and P does not determine P(E) for every subset E of S.

First, if we pick X uniformly at random from the unit interval, then the probability that X is any given number is equal to the probability that X is any other given number.  But this probability cannot be positive, because then by countable additivity, the probability that X lies in any countably infinite subset of S would be infinite, a contradiction.  So the probability that X is equal to any given number is 0, but it makes perfectly good sense to say, for example, that the probability that X lies in (.1, .6) is 1/2.

More generally, if E is an interval (a,b) contained in [0,1], then P(E) = b-a.  Since F is a [link=http://en.wikipedia.org/wiki/Sigma-algebra]sigma-algebra[/link], we also know P(E) for any E which is a countable union of such intervals, and this includes every open set.  We also know the probability of the complements of these sets, which are the closed sets.  And we also know the probability of any countable union of closed sets, as well as their complements, which are countable intersections of open sets.  And we know countable unions of these, etc.  The collection of all sets we obtain in this way -- the smallest sigma-algebra containing the open sets -- is known as the Borel sigma-algebra.

So by specifying that X is chosen uniformly from the interval [0,1], we know the probability that X lies in any given Borel set.  However, it can be shown that there are only continuum-many Borel sets, while S has strictly more than continuum-many subsets.  Is it possible to extend the measure P beyond the Borel sets?  The answer is yes.  P has an extension, called the Lebesgue measure, defined on the sigma-algebra F known as the Lebesgue subsets (one can show there are more than continuum-many Lebesgue subsets, hence most are not Borel).

But, assuming the axiom of choice, there are subsets, such as the [link=http://en.wikipedia.org/wiki/Vitali_set]Vitali set[/link] V, which are not Lebesgue-measurable.  That is, if we pick a number X uniformly at random between 0 and 1, it does not make sense to ask what the probability is that X lies in V.

There's another way to think of the uniform distribution on [0,1] that may shed some light.  To determine X, flip a coin for each binary digit in the expansion of X.  Of course, you won't actually know X at any finite time.  But suppose some number A is given, via some infinite binary expansion.  Then it is possible to know, after a finite number of flips, whether X<A or X>A.  The only exception would be if the digits match A for every single flip, or possibly if from some point on you get all 0s or all 1s, but none of these events will ever "happen".  So this is a physical procedure that may actually be carried out (if you know arbitrarily many digits of A, and have an arbitrarily large amount of time on your hands), and so P(X<A) makes sense, and similarly it makes sense to ask what the probability is that X lies in any given open subset of [0,1].  Sort of.
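This coin-flip picture can be sketched in code (my own construction; the function names are hypothetical): generate the binary digits of X lazily, and stop as soon as the comparison with a given A is decided. With probability 1 this terminates after finitely many flips.

```python
import random

def compare_to(a_digits, rng=random):
    """Flip coins for the binary digits of a uniform X in [0,1] until
    X < A or X > A is decided.  a_digits(i) gives the i-th binary digit
    of A (i >= 1).  Returns the verdict and how many flips it took."""
    i = 1
    while True:
        x_bit = rng.randint(0, 1)
        if x_bit != a_digits(i):
            return ("X < A", i) if x_bit < a_digits(i) else ("X > A", i)
        i += 1  # digits agree so far; flip another coin

# A = 1/3 = 0.010101... in binary
third = lambda i: 0 if i % 2 == 1 else 1

random.seed(0)
print(compare_to(third))
```

Over many runs the verdict "X < A" comes up about a third of the time, matching P(X < 1/3) = 1/3, and the loop halts quickly because each extra flip has only a 1/2 chance of prolonging the tie.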

Now, clearly it's "possible" to pick X=0, but only if your coin never comes up tails after infinitely many flips.  One way to put it is that if something has probability 0 of occurring, it doesn't mean it can't happen, just that it won't.  The problem with this way of thinking, though, is that you're trying to attach a physical model to a mathematical model, when the whole point of introducing the mathematical model was to model something non-physical.  Or, that probability doesn't exist to tell you what can or will happen in any physical sense; it just assigns numbers to certain sets in a certain way.

Title: Re: Infinite Balls Urn
Post by Eigenray on Jan 9th, 2007, 6:41pm
Suppose at stage k we add c_k balls and then remove one at random, and let d_k = c_1 + ... + c_k.  Then the probability that ball n is never removed is

p_n = prod over {k : d_k >= n} of (1 - 1/[d_k - k + 1]).

Thus if c_k is constant or logarithmic, for example, then the urn will be empty with probability 1, and the expected number of balls left is 0.

But if c_k grows fast enough that some ball n has positive probability of remaining, then note that every ball m > n is at least as likely to be left over, so the expected number of balls remaining is infinite.  Moreover, since p_n > 0, the infinite product converges, and so the tails p_m -> 1 as m goes to infinity.

Now let A be the set of remaining balls.  For any fixed k, the event [A is contained in {1,...,k}] is contained in the event [m is not in A] for each m > k.  Since the latter event has probability 1 - p_m -> 0, it follows that P[A is contained in {1,...,k}] = 0.  Taking the union over all k, it follows that P[A is finite] = 0.  That is, with probability 1, there are infinitely many balls remaining.  (In particular, the urn is empty with probability 0.)
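A numerical companion to this argument (my own sketch): computing the partial products for ball 1 shows the two regimes. With constant c_k = 10 the product drifts toward 0, while with c_k = 2^k it stays bounded away from 0.

```python
def survival(n, c, stages):
    """Partial product, over stages k with d_k >= n, of (1 - 1/(d_k - k + 1)),
    where c(k) balls are added at stage k and d_k = c(1) + ... + c(k)."""
    p, d = 1.0, 0
    for k in range(1, stages + 1):
        d += c(k)
        if d >= n:                      # ball n is present at stage k's removal
            p *= 1 - 1 / (d - k + 1)
    return p

print(survival(1, lambda k: 10, 10**5))    # drifts toward 0 (like stages**(-1/9))
print(survival(1, lambda k: 2**k, 200))    # settles at a positive limit
```

In the exponential case the factors approach 1 so fast that the product converges after a handful of stages, which is the "positive probability of remaining" regime above.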

Title: Re: Infinite Balls Urn
Post by kiochi on Jan 9th, 2007, 6:50pm
This discussion reminds me of the problematic thinking behind "Bertrand's paradox."

Given a circle C of radius R, and a concentric circle c of radius r=1/2R, what is the chance that a chord through C, drawn at random, intersects c.

Argument 1: a chord is wholly determined by its midpoint (except when the midpoint is the center, but that has probability zero), and a chord will intersect c exactly when its midpoint is in c. c has 1/4 the area of C, so there is a 1/4 chance that a chord will hit c.

Argument 2: Let the chord go through some fixed point p on the circumference of C. The chord is then uniquely determined by the angle it makes with the diameter through p. Draw a right triangle with hypotenuse from p along the line tangent to c and its right angle at the point of tangency: the angle at p has sin = r/R = 1/2, so 30 degrees out of the 90 (on each side of the diameter) hit c, giving probability 1/3.

Argument 3: use rotational symmetry to assume the chord is vertical. Then it is uniquely determined by the point where it crosses the horizontal diameter. Half of the diameter lies inside c, so the probability must be 1/2.
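The three arguments can be checked by Monte Carlo simulation (my own sketch; each sampler below is a literal reading of the corresponding argument):

```python
import math
import random

R = 1.0
N = 200_000
rng = random.Random(1)

def uniform_point_in_disk():
    """Rejection sampling: a uniform point in the disk of radius R."""
    while True:
        x, y = rng.uniform(-R, R), rng.uniform(-R, R)
        if x * x + y * y <= R * R:
            return x, y

# Argument 1: uniform midpoint; the chord hits c iff its midpoint lies in c.
hits1 = sum(1 for _ in range(N)
            if math.hypot(*uniform_point_in_disk()) < R / 2)

# Argument 2: chord through a fixed boundary point at a uniform angle to the
# diameter; it hits c iff |angle| < 30 degrees (distance to center < R/2).
hits2 = sum(1 for _ in range(N)
            if abs(rng.uniform(-math.pi / 2, math.pi / 2)) < math.pi / 6)

# Argument 3: vertical chord at a uniform crossing point on the diameter.
hits3 = sum(1 for _ in range(N) if abs(rng.uniform(-R, R)) < R / 2)

print(hits1 / N, hits2 / N, hits3 / N)   # roughly 1/4, 1/3, 1/2
```

All three samplers are "random chords", yet they converge to different answers, which is exactly the point: "at random" underdetermines the probability measure.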

Title: Re: Infinite Balls Urn
Post by Icarus on Jan 9th, 2007, 7:10pm
Of course, the whole secret to the "paradox" is the fact that the condition "drawn at random" is meaningless without further explanation. The three arguments choose three different probability distributions as their definitions of "at random". The result is three different probabilities.

Title: Re: Infinite Balls Urn
Post by kiochi on Jan 9th, 2007, 7:17pm
Right. I think this sort of problem helped motivate the formalization of probability in terms of probability spaces, so that these ambiguities don't arise.



Powered by YaBB 1 Gold - SP 1.4!
Forum software copyright © 2000-2004 Yet another Bulletin Board