wu :: forums (http://www.ocf.berkeley.edu/~wwu/cgi-bin/yabb/YaBB.cgi)
riddles >> hard >> Littlewood's Number game
(Message started by: Icarus on Dec 16th, 2002, 7:29pm)

Title: Littlewood's Number game
Post by Icarus on Dec 16th, 2002, 7:29pm
You are one of two contestants in the following game: An umpire chooses two consecutive positive integers entirely at random and writes the two numbers on slips of paper, which he then hands out randomly to the two players. Each looks at their number and either agrees or disagrees to play. If both players agree, the person with the higher number must pay that many dollars to their opponent.
You only agree to play when the expected payout favors you. Obviously, you would agree if your number was 1. For what other values should you agree to play?

Assume infinite resources for payouts; i.e., no matter how high the numbers are, the payment can be made.

Note: This puzzle was posed by Martin Gardner, and is a variant of one published by J.E. Littlewood.

(Modified: I reversed the payoff in the original post: the player with the lower number paid out to the one with the higher. The first three posts following were in response to this erroneous puzzle.)

Title: Re: Littlewood's Number game
Post by jeremiahsmith on Dec 17th, 2002, 12:31am
If you think about it, you can get an answer similar to the Surprise Quiz paradox (the pop quiz can't be on Friday, so we're not having it at all):

If player Willy gets 1, his opponent Jerry must have 2. Since Jerry will win if he agrees to play, Willy decides not to play if he gets 1.

If Willy gets 2, Jerry must either have 1 or 3. If Jerry has 3, Willy loses. If Jerry has 1, he won't bother to play (see previous paragraph) and Willy gets nothing. Either way, it's pointless to play if you have a 2.

If Willy gets 3, Jerry must either have 2 or 4. If Jerry has 4, Willy loses. If Jerry has 2, he won't bother to play (see previous paragraph) and Willy gets nothing...

Thus, for whatever number n > 1 you get, it's pointless to play, since if your opponent has n + 1, you lose, and if your opponent has n - 1, he won't bother to play and you get nothing. :D

This is obviously not the intended answer, but I thought I'd bring it up :D

Title: Re: Littlewood's Number game
Post by towr on Dec 17th, 2002, 1:36am
I'd have to agree with that..

The expected payout never favors you. For any n>1 it's (n-1) if you win, minus n if you lose, so the expectancy is -1/2.
Chances of winning or losing are the same unless you have 1, in which case you can only lose. So the expectancy is actually slightly lower.

Title: Re: Littlewood's Number game
Post by James Fingas on Dec 17th, 2002, 11:23am
This is an interesting puzzle. I'm seeing little bits of the "x vs 2x" problem, and a bit of the pop quiz problem.

If we assume you have a number greater than 1, then you expect a 50% chance of winning (seems reasonable, right?). Then if you play, you could win $(n+1) or you could lose $n:

0.5*$(n+1) - 0.5*$n = $0.5

What's funny is that your opponent also expects to win half a dollar--but the game is a zero-sum game! Obviously your expectation should be zero, or it should go negative for sufficiently high n.
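
To see the paradox concretely, here is a minimal simulation sketch in Python (the geometric distribution for the lower number is my assumption; the puzzle does not specify how the pair is chosen):

import random

def play_round():
    # lower number of the pair ~ geometric: P(n) = 2^(-n)
    n = 1
    while random.random() < 0.5:
        n += 1
    a, b = random.sample([n, n + 1], 2)  # deal the two slips at random; a is yours
    # the higher number pays that many dollars to the other player
    return b if b > a else -a

trials = 10**6
print(sum(play_round() for _ in range(trials)) / trials)  # hovers near 0

Despite each player's naive conditional estimate of +$0.50, the simulated average payoff sits at zero, as the symmetry argument demands.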

I think one of the key points here is "the umpire chooses two consecutive positive integers at random". How can you choose completely randomly from an infinitely-sized set? You can't! (not uniformly anyhow) Therefore, he chooses according to some distribution (the pdf of which goes to zero as n goes to infinity), and this decrease over n of the pdf will tell you which numbers are good bets and which are bad bets. For instance, here is a distribution which gives an expectation of $2 when you get a 1, an expectation of zero for all numbers up to 9, and an expectation of -$10 if you get a 10:

p(1,2) = 0.2592
p(2,3) = 0.1728
p(3,4) = 0.1296
p(4,5) = 0.1037
p(5,6) = 0.0864
p(6,7) = 0.0741
p(7,8) = 0.0648
p(8,9) = 0.0576
p(9,10) = 0.0518
all other probabilities zero.

The probabilities I used here are just the inverses of the second number, and then I scaled them so that they summed to 1. You can see that for any probability distribution, there exist numbers that have an expected negative payout, and some which have an expected positive payout. Without knowing the distribution, however, you can't calculate which numbers have a positive payout and which have a negative payout.
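
A quick numerical check of the distribution above (a sketch; the weights 1/(n+1) and the normalization follow the construction just described):

weights = {n: 1.0 / (n + 1) for n in range(1, 10)}  # weight of pair (n, n+1)
Z = sum(weights.values())
p = {n: w / Z for n, w in weights.items()}          # p[n] = P(lower number is n)

for k in range(1, 11):          # k = the number on your slip
    lo = p.get(k, 0.0)          # chance your k is the lower of the pair
    hi = p.get(k - 1, 0.0)      # chance your k is the higher of the pair
    # holding k: collect k+1 if yours is lower, pay k if yours is higher
    ev = ((k + 1) * lo - k * hi) / (lo + hi)
    print(k, round(ev, 4))      # 2.0 for k=1, 0.0 for k=2..9, -10.0 for k=10

This reproduces the expectations quoted above: $2 for a 1, zero for 2 through 9, and -$10 for a 10.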

Title: Re: Littlewood's Number game
Post by Icarus on Dec 17th, 2002, 4:22pm

on 12/17/02 at 01:36:00, towr wrote:
I'd have to agree with that..

The expected payout never favors you. For any n>1 it's (n-1) if you win, minus n if you lose, so the expectancy is -1/2.
Chances of winning or losing are the same unless you have 1, in which case you can only lose. So the expectancy is actually slightly lower.


My apologies - I rephrased the puzzle some, and in the process inadvertently reversed the payoff. The last part should read:

The player with the higher number must pay that amount to the other player.

This also removes Jeremiah's objection (and my own statement that you do not want to play if you have 1). I have modified the original post.

Title: Re: Littlewood's Number game
Post by Jeremiah Smith on Dec 17th, 2002, 6:40pm
Darn. I was so happy to finally have a decent post in the Hard forum...

Ummm...

Okay, if you get infinity, you won't play.
If you get infinity - 1, if the other guy has infinity - 2, you lose, and if the other guy has infinity, he won't play...

:D

Title: Re: Littlewood's Number game
Post by James Fingas on Dec 18th, 2002, 7:17am
Jeremiah,

But of course you'd want to play if you got infinity! Any chance to get an infinite payout is worth an infinite amount of money--just think, if you won, you'd be set for life! Never mind that stupid Club Med--just buy the Mediterranean  ;D

And if you lost, you could just declare bankruptcy...

I feel for you though--your post was very good!

Title: Re: Littlewood's Number game
Post by That Don Guy on Dec 19th, 2002, 12:25pm
If you have N > 1, then your opponent has either N-1, in which case you pay N, or N+1, in which case you collect N+1.

The selection of numbers is, in effect, (a) select a positive integer N and (b) select which player gets N and which one gets N+1.  The probability of you having N and your opponent N-1 is X * 1/2, where X is the probability of drawing N in (a) (and it is assumed X is a constant for all N); the probability of you having N and your opponent N+1 is also X * 1/2.  Therefore, each possibility is equally likely for all N > 1, and since in each case the expected gain is > 0, you should always play.

(I almost tried to solve a different problem, where the numbers were not consecutive and you paid/collected the difference in the two numbers; as it turns out, the answer there seems to be "always play" as well...)

Title: Re: Littlewood's Number game
Post by James Fingas on Dec 19th, 2002, 1:58pm
Don Guy,

That's a reasonable analysis, but think about it this way: obviously the game is a zero-sum game, because there are only two of you playing, and no money coming in from anywhere else. Since neither of you has an advantage (by symmetry), the actual expectation for the game must be zero. If you calculate a positive expectation, you are deceiving yourself.

Since the calculation you did gives a positive expectation, there is a problem with your calculation. My question is: what is wrong with your calculation? At first glance, it seems fine--the problem is quite subtle.

Title: Re: Littlewood's Number game
Post by towr on Dec 20th, 2002, 12:53am
Hmmm...
I guess you should only look at pairs:
for the pair (n, n+1), you have a 1/2 chance of holding n and a 1/2 chance of holding n+1. So you either collect n+1 or pay n+1; expectancy = 0.
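
towr's grouping can be checked in code against any distribution on the pairs, reusing the example distribution from James Fingas's post above (a sketch; conditioning on the pair instead of on your own number is the whole trick):

weights = {n: 1.0 / (n + 1) for n in range(1, 10)}
Z = sum(weights.values())
p = {n: w / Z for n, w in weights.items()}  # p[n] = P(pair is (n, n+1))

# given the pair (n, n+1): hold n and collect n+1, or hold n+1 and pay n+1
ev = sum(q * (0.5 * (n + 1) - 0.5 * (n + 1)) for n, q in p.items())
print(ev)  # exactly 0, whatever p is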

Title: Re: Littlewood's Number game
Post by fenomas on Dec 20th, 2002, 1:51am
I think I'm close to this one, or at any rate I've found the problem with Don Guy's reasoning.

If you draw a 1, of course you play. If you draw a 2, it seems like a fair bet: lose $2 or win $3. But that assumes your opponent is equally likely to play with a 1 or a 3. The crucial point is that either player can cancel any round: if your opponent never plays threes, then playing with a two is obviously a losing proposition, since you'll either pay $2 or the round will cancel. Since your opponent will always play if he draws a 1, no matter what strategy he uses he will play a 3 no more often than he plays a 1. So playing on a two doesn't seem so good after all.

It's tempting to think that your return in a round where you draw N is (N+1)/2 - N/2, but a better way to think of it might be O_{N+1}*(N+1)/2 - O_{N-1}*N/2, where O_N is the probability that your opponent will play if he draws N. In fact, I guess the return for drawing N should properly be what I just said, times your own probability of playing, Y_N, since you can cancel the game as well.

What I can't get around is that the game is entirely parallel, so any strategy you use can be used by your opponent as well. Suppose there is a God strategy G that maximizes your return, where, drawing N, you will choose to play with a probability of G_N. Assume your opponent is so smart that he knows the strategy too. Then, in the case of drawing N, your return is:

G_N * ( G_{N+1}*(N+1)/2 - G_{N-1}*N/2 )     (since O_N is also G_N)
= ( (N+1)*G_N*G_{N+1} - N*G_N*G_{N-1} )/2

Then for this return to be greater than 0, you need (N+1)*G_N*G_{N+1} > N*G_N*G_{N-1}, and thus G_{N+1}/G_{N-1} > N/(N+1), which implies that... um, if G_1 = 1, then G_3 > 2/3? I don't think I'm making progress here. I expected to find that there were losing strategies, and at least one that was guaranteed to break even, but I'm getting lost... Can someone steer me on the right track, or finish this off, or something?
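
fenomas's expression can be put in code to play with candidate strategies (a sketch; the dictionary G and the function name are mine, and the formula inherits the 50/50 assumption about which neighbour you face, which is exactly the assumption questioned earlier in the thread):

def return_for_draw(N, G):
    # G[n] = probability a player agrees to play when holding n;
    # both players use G, so the opponent's O_n equals G_n
    g = lambda n: G.get(n, 0.0)
    return g(N) * (g(N + 1) * (N + 1) / 2 - g(N - 1) * N / 2)

always_play = {n: 1.0 for n in range(1, 100)}
print(return_for_draw(5, always_play))  # 0.5, the paradoxical half dollar

Setting G[3] = 0, for instance, makes return_for_draw(2, G) come out at -1 (you can only pay), matching the two-versus-three intuition above.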

Title: Re: Littlewood's Number game
Post by James Fingas on Dec 20th, 2002, 9:16am
fenomas,

Don't get fooled by the "opting-in". The error in calculation is entirely independent of the opting-in or opting-out of the contestants.

To show this, just consider the game in which you can't opt out. The calculation still applies to that variant.

Maybe I was on the right track in my first post to this thread...

Title: Re: Littlewood's Number game
Post by fenomas on Dec 20th, 2002, 11:47pm
James: I see what you mean about something else being wrong with Don Guy's "you should always play" reasoning. After towr's post (I didn't see it before; he entered it while I was writing my novel of a post), and the Envelope Gamble thread, I think I understand what was wrong with it... please see my post in that thread.

But for this puzzle, I still think the opponent's strategy must come into play. If it doesn't, you reduce him to a dealer who always plays, and he need not be in the puzzle.

Any way you look at it, though, you have two identical players (playing simultaneously, so this can't be a nim-like problem). The problem says nothing about your opponent's strategy, so you have to assume he uses the best possible strategy, which is naturally what you'll be doing as well. So I don't see how any possible strategy could hope to do better than ensure breaking even. And if the best possible strategy ensures breaking even, then you can get the same expected return by not playing at all!

(first pp edited after reading the envelope gamble thread)

Title: Re: Littlewood's Number game
Post by SWF on Sep 5th, 2003, 6:44pm
As already mentioned by others, this is a zero-sum game. Prior to the selection of any numbers, both players have the same chance of winning. Therefore, before any numbers are assigned, expectation is zero.

Next step is to evaluate the best strategy for a player after he finds out which number he has. One might expect this strategy to depend on the probability of each number being chosen, as well as whether other players agree to play.

First, calculate the expected value for each player assuming everyone always agrees to play after receiving their numbers. It was given that the numbers were chosen randomly, but not the details, so this result will be given in terms of an arbitrary probability distribution. Let p_N be the probability that the lower of the two numbers is N, with p_0 = 0. The probability of a given player receiving the number N is p_N + p_{N-1}, which is the sum of the probability of him having the lower number (p_N) and the probability of him having the higher number (p_{N-1}). Given that he has the number N, the probability of him winning $(N+1) is p_N/(p_N + p_{N-1}), and the probability of losing $N is p_{N-1}/(p_N + p_{N-1}). Therefore, given that a person has the number N, his expectation is ((N+1)*p_N - N*p_{N-1})/(p_N + p_{N-1}).

Assuming everyone always agrees to play after receiving their numbers, is it possible for everyone with a number N greater than some value M to have a positive or break-even expectation? For this to be, p_N >= p_{N-1}*N/(N+1) for all N > M. This would mean that for any k > 0, p_{N+k} >= p_N*(N+1)/(N+1+k). Adding up the p_{N+k} for k = 1 to infinity gives a divergent series (a tail of the harmonic series), so if p_N > 0 the sum of these probabilities is greater than 1. This is not possible.
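
The divergence is easy to watch numerically (a sketch of the bound just derived; N = 10 is an arbitrary choice): up to the factor p_N, the chain of lower bounds on p_{N+k} is a tail of the harmonic series, so its partial sums grow without limit and must eventually push the total probability past 1 for any fixed p_N > 0.

N = 10
partial = 0.0
for k in range(1, 10**6 + 1):
    partial += (N + 1) / (N + 1 + k)  # lower bound on p_{N+k} / p_N
    if k in (10**2, 10**4, 10**6):
        print(k, round(partial, 1))   # roughly 25, 74, 125: unbounded growth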

Now take into account that your opponent may not agree to play. If there is no way you can ever win, there is nothing to be gained by agreeing to play after seeing your number. You can never win if you have N-1 and you know somebody with N would never play. If somebody with N wouldn't play, a reasonable player with N-1 would not play, so somebody with N-2 should not play. And so on down through all values below N (except for N=1 who cannot lose, but cannot ever expect to win either).

Therefore, if somebody is given the number N (i.e., p_N > 0), he should only consider playing if everyone with a number greater than N agrees to play. But assuming everyone uses the best strategy, somebody with some number greater than N would not agree to play: if everyone did agree, then as shown above, either the probability distribution would be impossible or somebody with a negative expectation would be agreeing to play (contradicting the assumption that everyone uses the best strategy). Therefore somebody given the number N has nothing to gain by agreeing to play.
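
SWF's unwinding can be run directly on the example distribution from earlier in the thread (a sketch; the rule "drop out whenever your expectation against the remaining players is negative" is the assumption):

weights = {n: 1.0 / (n + 1) for n in range(1, 10)}  # pairs (n, n+1), n = 1..9
Z = sum(weights.values())
p = {n: w / Z for n, w in weights.items()}

agrees = set(range(1, 11))       # start with everyone willing to play
changed = True
while changed:
    changed = False
    for N in sorted(agrees, reverse=True):
        win = (N + 1) * p.get(N, 0.0) * (N + 1 in agrees)   # opponent holds N+1 and plays
        lose = N * p.get(N - 1, 0.0) * (N - 1 in agrees)    # opponent holds N-1 and plays
        if win - lose < 0:
            agrees.discard(N)
            changed = True
print(agrees)  # {1}: only the 1 stays in, and he can never actually collect

Starting from the top, 10 drops out, which strands 9, which strands 8, and so on; the cascade leaves only the 1, whose only willing opponent has already quit.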

Executive Summary: If everyone uses the best strategy, it is never to anyone's advantage to play the game after seeing his number. This is independent of the probability distribution used to select the numbers.

Title: Re: Littlewood's Number game
Post by Icarus on Sep 5th, 2003, 9:15pm
That's the way to bypass the point of a puzzle! ;)

It sounds like an interesting analysis, but it's been too long for me, and you've neatly sidestepped the key issue behind this puzzle (which I'm sure you know, since you sidestepped it intentionally). I need to think on it for a while.

Title: Re: Littlewood's Number game
Post by SWF on Sep 12th, 2003, 7:55pm
I don't know what issue you think has been sidestepped. I suppose you mean that the manner of selecting the values at random is not given. The neat thing about this problem is that it turns out not to matter how the numbers were chosen, rather than this being an underspecified question. I assumed that some workable probability distribution was used to select the numbers; assuming otherwise would contradict the question.

Sort of like the question, "A real number, x, is chosen at random. What is the value of sin^2(x) + cos^2(x)?". Usually these 'hidden-method-of-picking-a-random-number' type questions, like Random Chord (http://www.ocf.berkeley.edu/~wwu/cgi-bin/yabb/YaBB.cgi?board=riddles_easy;action=display;num=1061612622), are underspecified or make no sense.

Title: Re: Littlewood's Number game
Post by Icarus on Sep 14th, 2003, 1:42pm
Sorry to take so long to get back. I've been busy lately, and finding time to go over the whole thing in my mind again and apply your analysis to it has not been easy. (Not to mention that other problem that has bedeviled me these last few days!)

My intended point for this puzzle was once again the impossibility of picking a natural number with uniform probability. The naive approach to this puzzle is to say: "It's equally likely that I have the low number as the high number, but I collect one more dollar when my number is low than I pay when it is high, so the odds are in my favor!" Of course this is ridiculous, because the situation is symmetric, so it can't favor either player.

My idea for the solution was simply to point out that the flaw in this reasoning was the inability to pick numbers entirely at random; therefore the original assumption, that having the lower number is exactly as likely as having the higher number, was wrong.

SWF's analysis goes far beyond this, showing directly that no probability distribution can ever favor the player (which we knew indirectly from the symmetry). And also showing that it is never to the player's advantage to agree. (In the case of the player having "1", he has nothing to lose by agreeing, but since the other player will definitely refuse, he has nothing to gain either.)

So, far from sidestepping the "no uniform distribution" issue, SWF has demonstrated that no distribution will work. Well done!  8)

(Now I've got to stop being lazy and update my list! :()


