Author 
Topic: Littlewood's Number game (Read 4806 times) 

Icarus
wu::riddles Moderator Uberpuzzler
Boldly going where even angels fear to tread.
Posts: 4863


Littlewood's Number game
« on: Dec 16^{th}, 2002, 7:29pm » 

You are one of two contestants in the following game: An umpire chooses two consecutive positive integers entirely at random and writes the two numbers on slips of paper, which he then hands out randomly to the two players. Each player looks at his number and either agrees or disagrees to play. If both players agree, the person with the higher number must pay that many dollars to his opponent. You only agree to play when the expected payout favors you. Obviously, you would agree if your number was 1. For what other values should you agree to play? Assume infinite resources for payouts; that is, it does not matter how high the numbers are, the payment can be made.

Note: This puzzle was posed by Martin Gardner, and is a variant of one published by J.E. Littlewood.

(Modified: I reversed the payoff in the original post: the player with the lower number paid out to the one with the higher. The first three posts following were in response to this erroneous puzzle.)

« Last Edit: Dec 17^{th}, 2002, 4:34pm by Icarus » 
"Pi goes on and on and on ... And e is just as cursed. I wonder: Which is larger When their digits are reversed?" - Anonymous



Jeremiah Smith
Full Member
Beep!
Posts: 172


Re: Littlewood's Number game
« Reply #1 on: Dec 17^{th}, 2002, 12:31am » 

You can get an answer similar to the Surprise Quiz paradox (the pop quiz can't be on Friday, so we're not having it), if you think about it:

If player Willy gets 1, his opponent Jerry must have 2. Since Jerry will win if he agrees to play, Willy decides not to play if he gets 1.

If Willy gets 2, Jerry must have either 1 or 3. If Jerry has 3, Willy loses. If Jerry has 1, he won't bother to play (see the previous paragraph) and Willy gets nothing. Either way, it's pointless to play if you have a 2.

If Willy gets 3, Jerry must have either 2 or 4. If Jerry has 4, Willy loses. If Jerry has 2, he won't bother to play (see the previous paragraph) and Willy gets nothing...

Thus, for whatever number n > 1 you get, it's pointless to play: if your opponent has n + 1, you lose, and if your opponent has n - 1, he won't bother to play and you get nothing.

This is obviously not the intended answer, but I thought I'd bring it up.





towr
wu::riddles Moderator Uberpuzzler
Some people are average, some are just mean.
Posts: 13596


Re: Littlewood's Number game
« Reply #2 on: Dec 17^{th}, 2002, 1:36am » 

I'd have to agree with that.. The expected payout never favors you. For any n>1 it's (n-1) if you win, minus n if you lose, so the expectancy is -1/2.. The chances of winning or losing are the same, unless you have 1, in which case you can only lose.. So the expectancy is actually slightly lower..


Wikipedia, Google, Mathworld, Integer sequence DB



James Fingas
Uberpuzzler
Posts: 949


Re: Littlewood's Number game
« Reply #3 on: Dec 17^{th}, 2002, 11:23am » 

This is an interesting puzzle. I'm seeing little bits of the "x vs 2x" problem, and a bit of the pop quiz problem.

If we assume you have a number greater than 1, then your expectation is that you have a 50% chance of winning (seems reasonable, right?). Then if you play, you could win $(n+1) or you could lose $n:

0.5*$(n+1) - 0.5*$n = $0.50

What's funny is that your opponent also expects to win half a dollar, but the game is a zero-sum game! Obviously your expectation should be zero, or it should go negative for sufficiently high n.

I think one of the key points here is "the umpire chooses two consecutive positive integers at random". How can you choose completely randomly from an infinitely sized set? You can't! (Not uniformly, anyhow.) Therefore, he chooses according to some distribution (the pdf of which goes to zero as n goes to infinity), and this decrease of the pdf over n will tell you which numbers are good bets and which are bad bets.

For instance, here is a distribution which gives an expectation of $2 when you get a 1, an expectation of zero for all numbers from 2 up to 9, and an expectation of -$10 if you get a 10:

p(1,2) = 0.2592
p(2,3) = 0.1728
p(3,4) = 0.1296
p(4,5) = 0.1037
p(5,6) = 0.0864
p(6,7) = 0.0741
p(7,8) = 0.0648
p(8,9) = 0.0576
p(9,10) = 0.0518

All other probabilities are zero. The probabilities I used here are just the inverses of the second number in each pair, scaled so that they sum to 1.

You can see that for any probability distribution, there exist numbers that have an expected negative payout, and some which have an expected positive payout. Without knowing the distribution, however, you can't calculate which numbers have a positive payout and which have a negative payout.
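James's example distribution is easy to check numerically. Here is a quick sketch (the `expectation` helper just spells out the conditional-expectation formula implied by his description, under the corrected rule that the higher number pays that amount):

```python
# Rebuild James Fingas's example distribution: the probability of the
# pair (n, n+1) is proportional to 1/(n+1) for n = 1..9, scaled to sum to 1.
weights = {n: 1.0 / (n + 1) for n in range(1, 10)}
scale = sum(weights.values())
p = {n: w / scale for n, w in weights.items()}  # p[n] = P(pair is (n, n+1))

def expectation(k):
    """Expected payout for a player holding k, if both players always play.

    The higher number pays that amount: holding k, you win k+1 when the
    pair is (k, k+1), and pay k when the pair is (k-1, k).
    """
    lo = p.get(k, 0.0)      # pair (k, k+1): you hold the lower number
    hi = p.get(k - 1, 0.0)  # pair (k-1, k): you hold the higher number
    return ((k + 1) * lo - k * hi) / (lo + hi)

print(round(p[1], 4))             # 0.2592
print(round(expectation(1), 2))   # 2.0: holding 1, you always win $2
print(round(expectation(10), 2))  # -10.0: holding 10, you always pay $10
# expectation(k) comes out as 0 (up to rounding) for k = 2..9
```

The same helper works for any distribution over the pairs, which makes it easy to experiment with James's claim that every distribution has some losing numbers.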

« Last Edit: Dec 18^{th}, 2002, 7:29am by James Fingas » 
Doc, I'm addicted to advice! What should I do?



Icarus
wu::riddles Moderator Uberpuzzler
Boldly going where even angels fear to tread.
Posts: 4863


Re: Littlewood's Number game
« Reply #4 on: Dec 17^{th}, 2002, 4:22pm » 

on Dec 17^{th}, 2002, 1:36am, towr wrote: I'd have to agree with that.. The expected payout never favors you. For any n>1 it's (n-1) if you win, minus n if you lose, so the expectancy is -1/2.. The chances of winning or losing are the same, unless you have 1, in which case you can only lose.. So the expectancy is actually slightly lower..
My apologies - I rephrased the puzzle some, and in the process inadvertently reversed the payoff. The last part should read: the player with the higher number must pay that amount to the other player. This also removes Jeremiah's objection (and my own statement that you do not want to play if you have 1). I have modified the original post.

« Last Edit: Dec 17^{th}, 2002, 4:29pm by Icarus » 



Jeremiah Smith
Full Member
Beep!
Posts: 172


Re: Littlewood's Number game
« Reply #5 on: Dec 17^{th}, 2002, 6:40pm » 

Darn. I was so happy to finally have a decent post in the Hard forum... Ummm... Okay, if you get infinity, you won't play. If you get infinity - 1, then if the other guy has infinity - 2, you lose, and if the other guy has infinity, he won't play...





James Fingas
Uberpuzzler
Posts: 949


Re: Littlewood's Number game
« Reply #6 on: Dec 18^{th}, 2002, 7:17am » 

Jeremiah, but of course you'd want to play if you got infinity! Any chance at an infinite payout is worth an infinite amount of money. Just think: if you won, you'd be set for life! Never mind that stupid Club Med, just buy the Mediterranean! And if you lost, you could just declare bankruptcy... I feel for you, though; your post was very good!





That Don Guy
Guest


Re: Littlewood's Number game
« Reply #7 on: Dec 19^{th}, 2002, 12:25pm » 

If you have N > 1, then your opponent has either N-1, in which case you pay N, or N+1, in which case you collect N+1. The selection of numbers is, in effect: (a) select a positive integer N, and (b) select which player gets N and which one gets N+1. The probability of you having N and your opponent N-1 is X * 1/2, where X is the probability of drawing N in step (a) (and it is assumed X is a constant for all N); the probability of you having N and your opponent N+1 is also X * 1/2. Therefore, each possibility is equally likely for all N > 1, and since in each case the expected gain is > 0, you should always play.

(I almost tried to solve a different problem, where the numbers were not consecutive and you paid/collected the difference between the two numbers; as it turns out, the answer there seems to be "always play" as well...)





James Fingas
Uberpuzzler
Posts: 949


Re: Littlewood's Number game
« Reply #8 on: Dec 19^{th}, 2002, 1:58pm » 

Don Guy, that's a reasonable analysis, but think about it this way: obviously the game is a zero-sum game, because there are only two of you playing, and no money coming in from anywhere else. Since neither of you has an advantage (by symmetry), the actual expectation for the game must be zero. If you calculate a positive expectation, you are deceiving yourself. Since the calculation you did gives a positive expectation, there is a problem with it. My question is: what is wrong with your calculation? At first glance it seems fine; the problem is quite subtle.





towr
wu::riddles Moderator Uberpuzzler
Some people are average, some are just mean.
Posts: 13596


Re: Littlewood's Number game
« Reply #9 on: Dec 20^{th}, 2002, 12:53am » 

Hmmm.. I guess you should only look at pairs: for the pair (n, n+1), you have a 1/2 chance of having n, and a 1/2 chance of having n+1. So you either get $(n+1) or pay $(n+1), and the expectancy is 0.
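towr's pairing argument can be sanity-checked by simulation. The sketch below assumes a concrete geometric distribution over the pairs purely for illustration (any proper distribution works) and hands the slips out at random, with both players always playing:

```python
import random

# For each pair (n, n+1), each player is equally likely to hold either slip,
# so the payout (+(n+1) if you hold n, -(n+1) if you hold n+1, since the
# higher number pays that amount) averages to zero pair by pair.
random.seed(0)

def draw_lower():
    """Draw the lower number of the pair, with P(n) = (1/2)^n."""
    n = 1
    while random.random() < 0.5:
        n += 1
    return n

trials = 200_000
total = 0
for _ in range(trials):
    n = draw_lower()
    if random.random() < 0.5:   # you were handed the lower slip
        total += n + 1
    else:                       # you were handed the higher slip
        total -= n + 1

print(total / trials)  # hovers near 0
```

The per-trial payouts are large but symmetric, so the running average settles near zero, exactly as the pairing argument predicts.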

« Last Edit: Dec 20^{th}, 2002, 12:55am by towr » 



fenomas
Newbie
emoticonoclast
Posts: 20


Re: Littlewood's Number game
« Reply #10 on: Dec 20^{th}, 2002, 1:51am » 

I think I'm close on this one, or at any rate I've found the problem with Don Guy's reasoning.

If you draw a 1, of course you play. If you draw a 2, it seems like a fair bet: lose $2 or win $3. But that assumes that your opponent is equally likely to play with a 1 or a 3. The crucial point is that either player can cancel any round: if your opponent never plays threes, then obviously playing with a two is a losing proposition; you'll either pay $2 or the round will cancel. Since your opponent will always play if he draws a 1, no matter what strategy he uses he will play on a 3 no more often than he plays on a 1. So playing on a two doesn't seem so good after all.

It's tempting to think that your return in a round where you draw N is (N+1)/2 - N/2, but a better way to think of it might be O_{N+1}(N+1)/2 - O_{N-1}(N)/2, where O_{N} is the probability that your opponent will play if he draws an N. In fact, I guess the return for drawing N should properly be that quantity times your own probability of playing, Y_{N}, since you can cancel the game as well.

What I can't get around is that the game is entirely symmetric, so any strategy you use can be used by your opponent as well. Suppose there is a God strategy G that maximizes your return, where, drawing N, you choose to play with probability G_{N}. Assume your opponent is so smart that he knows the strategy too. Then, in the case of drawing N, your return is:

G_{N} * ( G_{N+1}(N+1)/2 - G_{N-1}(N)/2 )   (since O_{N} is also G_{N})
= ( (N+1)*G_{N}*G_{N+1} - N*G_{N}*G_{N-1} )/2

For this return to be greater than 0, you need (N+1)*G_{N}*G_{N+1} > N*G_{N}*G_{N-1}, and thus G_{N+1}/G_{N-1} > N/(N+1), which implies that... um, if G_{1}=1, then G_{3} > 2/3?

I don't think I'm making progress here. I expected to find that there were losing strategies, and at least one that was guaranteed to break even, but I'm getting lost... Can someone steer me onto the right track, or finish this off, or something?





James Fingas
Uberpuzzler
Posts: 949


Re: Littlewood's Number game
« Reply #11 on: Dec 20^{th}, 2002, 9:16am » 

fenomas, don't get fooled by the opting in. The error in the calculation is entirely independent of the contestants' opting in or opting out. To show this, just consider the variant of the game in which you can't opt out; the calculation still applies there. Maybe I was on the right track in my first post in this thread...

« Last Edit: Dec 20^{th}, 2002, 9:23am by James Fingas » 



fenomas
Newbie
emoticonoclast
Posts: 20


Re: Littlewood's Number game
« Reply #12 on: Dec 20^{th}, 2002, 11:47pm » 

James, I see what you mean about something else being wrong with Don Guy's "you should always play" reasoning. After towr's post (I didn't see it before; he entered it while I was writing my novel of a post), and the Envelope Gamble thread, I think I understand what was wrong with it... please see my post in that thread.

But for this puzzle, I still think the opponent's strategy must come into play. If it doesn't, you reduce him to a dealer who always plays, and he need not be in the puzzle. Any way you look at it, though, you have two identical players (playing simultaneously, so this can't be a nim-like problem). The problem says nothing about your opponent's strategy, so you have to assume he uses the best possible strategy, which is naturally what you'll be doing as well. So I don't see how any possible strategy could hope to do better than ensure breaking even. And if the best possible strategy only ensures breaking even, then you can get the same expected return by not playing at all!

(First paragraph edited after reading the Envelope Gamble thread.)

« Last Edit: Dec 21^{st}, 2002, 12:01am by fenomas » 



SWF
Uberpuzzler
Posts: 879


Re: Littlewood's Number game
« Reply #13 on: Sep 5^{th}, 2003, 6:44pm » 

As already mentioned by others, this is a zero-sum game. Prior to the selection of any numbers, both players have the same chance of winning; therefore, before any numbers are assigned, the expectation is zero. The next step is to evaluate the best strategy for a player after he finds out which number he has. One might expect this strategy to depend on the probability of each number being chosen, as well as on whether the other player agrees to play.

First, calculate the expected value for each player assuming everyone always agrees to play after receiving their numbers. It was given that the numbers were chosen randomly, but not how, so this result will be given in terms of an arbitrary probability distribution. Let p_{N} be the probability that the lower of the two numbers is N, with p_{0}=0. The probability of a given player receiving number N is p_{N}+p_{N-1}, which is the sum of the probability of his having the lower number (p_{N}) and the probability of his having the higher number (p_{N-1}). Given that he has number N, the probability of winning $(N+1) is p_{N}/(p_{N}+p_{N-1}), and the probability of losing $N is p_{N-1}/(p_{N}+p_{N-1}). Therefore, given that a person has number N, his expectation is ((N+1)*p_{N} - N*p_{N-1})/(p_{N}+p_{N-1}).

Still assuming everyone always agrees to play, is it possible for everyone with a number N greater than some value M to have a positive or break-even expectation? For this to hold, we need p_{N} ≥ p_{N-1}*N/(N+1) for all N>M. This would mean that for any k>0, p_{N+k} ≥ p_{N}*(N+1)/(N+1+k). Adding up the p_{N+k} for k=1 to infinity gives a divergent series (the tail end of the harmonic series), so if p_{N}>0 the sum of these probabilities is greater than 1. This is not possible.

Now take into account that your opponent may not agree to play. If there is no way you can ever win, there is nothing to be gained by agreeing to play after seeing your number. You can never win if you have N-1 and you know somebody with N would never play. So if somebody with N wouldn't play, a reasonable player with N-1 would not play, and then somebody with N-2 should not play, and so on down through all values below N (except for N=1, who cannot lose, but cannot ever expect to win either). Therefore, if somebody is given the number N (i.e. p_{N}>0), he should only consider playing if everyone with a number >N agrees to play. Assuming everyone uses the best strategy, for some number >N somebody would not agree to play: if everyone did agree to play, then as shown above this would imply either an impossible probability distribution or somebody with negative expectation agreeing to play (contradicting the assumption that everyone uses the best strategy). Therefore somebody given number N has nothing to gain by agreeing to play.

Executive summary: If everyone uses the best strategy, it is never to anyone's advantage to play the game after seeing his number. This is independent of the probability distribution used to select the numbers.
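SWF's conditional-expectation formula can be made concrete with a sample distribution. Taking p_{N} = (1/2)^{N} (an illustrative assumption, not part of the puzzle), the expectation ((N+1)*p_{N} - N*p_{N-1})/(p_{N}+p_{N-1}) is positive only for N=1 and strictly negative for every larger N, in line with his conclusion:

```python
def p(n):
    """Illustrative distribution: P(lower number is n) = (1/2)^n, p(0) = 0."""
    return 0.5 ** n if n >= 1 else 0.0

def cond_expectation(n):
    """SWF's expectation for a player who sees n, if everyone always plays."""
    lo, hi = p(n), p(n - 1)  # lower number of the pair is n, resp. n-1
    return ((n + 1) * lo - n * hi) / (lo + hi)

for n in range(1, 6):
    print(n, round(cond_expectation(n), 4))
# Since p(n) halves at each step, the expectation works out to (1 - n)/3
# for n > 1, so under this distribution every number above 1 is a losing bet.
```

Swapping in any other proper distribution for `p` shows the same pattern SWF proves in general: the expectations cannot stay non-negative for all large N.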





Icarus
wu::riddles Moderator Uberpuzzler
Boldly going where even angels fear to tread.
Posts: 4863


Re: Littlewood's Number game
« Reply #14 on: Sep 5^{th}, 2003, 9:15pm » 

That's the way to bypass the point of a puzzle! It sounds like an interesting analysis, but it's been too long for me, and you've neatly sidestepped the key issue behind this puzzle (which I'm sure you know, since you sidestepped it intentionally). I need to think on it for a while.





SWF
Uberpuzzler
Posts: 879


Re: Littlewood's Number game
« Reply #15 on: Sep 12^{th}, 2003, 7:55pm » 

I don't know what issue you think has been sidestepped. I suppose you mean that the manner of selecting the values at random is not given. The neat thing about this problem is that it turns out not to matter how the numbers were chosen, rather than the question being underspecified. I assumed that some workable probability distribution was used to select the numbers; assuming otherwise would contradict the question. It's sort of like the question: "A real number, x, is chosen at random. What is the value of sin^{2}(x)+cos^{2}(x)?" Usually these 'hidden method of picking a random number' type questions, like Random Chord, are underspecified or make no sense.





Icarus
wu::riddles Moderator Uberpuzzler
Boldly going where even angels fear to tread.
Posts: 4863


Re: Littlewood's Number game
« Reply #16 on: Sep 14^{th}, 2003, 1:42pm » 

Sorry to take so long to get back. I've been busy lately, and finding time to go over the whole thing in my mind again and apply your analysis to it has not been easy. (Not to mention that other problem that has bedeviled me these last few days!)

The point of this puzzle, as I intended it, was once again the impossibility of picking a natural number with uniform probability. The naive approach to this puzzle is to say: "It's equally likely that I have the low number as the high number, but I get one more dollar if my number is low than I pay if it is high. So the odds are in my favor!" Of course this is ridiculous, because the situation is symmetric, so it can't favor either player. My idea of the solution was simply to point out that the flaw in this reasoning is the inability to pick numbers entirely at random, so the original assumption, that the probability of having the lower number equals the probability of having the higher, is wrong.

SWF's analysis goes far beyond this, showing directly that no probability distribution can ever favor the player (which we knew indirectly from the symmetry), and also showing that it is never to the player's advantage to agree. (In the case of the player having 1, he has nothing to lose by agreeing, but since the other player will definitely refuse, he has nothing to gain either.) So, far from sidestepping the "no uniform distribution" issue, SWF has demonstrated that no distribution will work. Well done!

(Now I've got to stop being lazy and update my list!)
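The flawed 50/50 step can be seen directly: under any proper distribution, the two cases are not equally likely once you condition on the number in your hand. A small sketch, again assuming the illustrative choice P(lower number is n) = (1/2)^{n}:

```python
def p(n):
    """Illustrative distribution: P(lower number is n) = (1/2)^n, p(0) = 0."""
    return 0.5 ** n if n >= 1 else 0.0

def prob_holding_lower(n):
    """P(your number is the lower of the pair | you were handed n)."""
    return p(n) / (p(n) + p(n - 1))

print(prob_holding_lower(1))  # 1.0: a 1 is always the lower number
print(prob_holding_lower(5))  # 1/3 here, not the naive 1/2
```

With this distribution the chance of holding the lower number is only 1/3 for every n > 1, which is exactly what kills the "I gain a dollar on average" argument.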





