wu :: forums (http://www.ocf.berkeley.edu/~wwu/cgi-bin/yabb/YaBB.cgi)
riddles >> putnam exam (pure math) >> Statistics problem
(Message started by: BenVitale on Feb 1st, 2008, 12:40pm)

Title: Statistics problem
Post by BenVitale on Feb 1st, 2008, 12:40pm
Let X1, X2, ..., Xn denote a random sample from the uniform distribution on the interval (K, K+1).

Let r1 = Xbar - 1/2, and r2 = Xn - n/(n+1).

Show that r1 and r2 are unbiased estimators of K.

Title: Re: Statistics problem
Post by Eigenray on Feb 1st, 2008, 8:37pm
Do you mean r2 = max{Xi} - n/(n+1) ?

Title: Re: Statistics problem
Post by BenVitale on Feb 2nd, 2008, 1:53am
I assume that Xbar = (1/n) * SUM X_j
and X_n = max{X_j}.

In both cases, all we need to show is that the expectations of r1 and r2 equal K.

Title: Re: Statistics problem
Post by Icarus on Feb 2nd, 2008, 10:06am

on 02/02/08 at 01:53:33, BenVitale wrote:
I assume that ...
and X_n = max{X_j}.


Since "X_n" also represents one of the X_j, this is not a good choice of notation, unless you mean that X_1 <= X_2 <= ... <= X_n.

Title: Re: Statistics problem
Post by pex on Feb 2nd, 2008, 10:30am
What's this doing in Putnam?

r1 is very easy. E[Xi] = K + 1/2 for all i, so
E[Xbar] = (1/n) * n * (K + 1/2) = K + 1/2,
and hence E[r1] = E[Xbar] - 1/2 = K.

r2 is standard too, but it requires a bit more work. The cumulative distribution function of each of the Xi is
F(x) = Pr(Xi < x) = {0 if x < K, x - K if K < x < K+1, 1 if K+1 < x}.

The cumulative distribution function of the maximum is easy to find: since the Xi are independent, we have
Pr(Xmax < x) = Pr(all Xi < x) = F(x)^n.

This means that the density function is n * F(x)^(n-1) * F'(x). We'll only need it on the interval (K, K+1), where it equals
n * (x - K)^(n-1) * 1 = n(x - K)^(n-1).
Then, we can find the expectation of Xmax as
integral(x from K to K+1) x * n(x - K)^(n-1) dx
= n * integral(t from 0 to 1) (t + K) * t^(n-1) dt   [substituting t = x - K]
= n * integral(t from 0 to 1) t^n dt + n*K * integral(t from 0 to 1) t^(n-1) dt
= n * 1/(n+1) + n*K * (1/n)
= n/(n+1) + K.
Thus, E[r2] = E[Xmax] - n/(n+1) = K as well.
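
Not a substitute for the proof, but here is a quick Monte Carlo sanity check in Python; the value K = 3.7 and the sample sizes are arbitrary choices for illustration:

import numpy as np

rng = np.random.default_rng(0)
K = 3.7  # plays the role of the unknown parameter
for n in (5, 20, 100):
    samples = rng.uniform(K, K + 1, size=(100_000, n))  # 100,000 samples of size n
    r1 = samples.mean(axis=1) - 0.5             # r1 = Xbar - 1/2
    r2 = samples.max(axis=1) - n / (n + 1)      # r2 = Xmax - n/(n+1)
    print(n, r1.mean(), r2.mean())              # both averages should be close to K

Both columns hover around 3.7 for every n, as expected.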

Title: Re: Statistics problem
Post by BenVitale on Feb 3rd, 2008, 9:34pm
Do you know which estimator (r1 or r2) has the smaller variance? I calculated Var(r1) = 1/12, but I'm pretty sure that's incorrect, since the number is independent of n.

Title: Re: Statistics problem
Post by Eigenray on Feb 3rd, 2008, 9:59pm
For independent variables, Var(X + Y) = Var(X) + Var(Y), and Var(cX) = c^2 Var(X).  So Var(r1) = SUM Var(Xi/n) = n * Var(Xi) / n^2 = 1/(12n), since each Var(Xi) = 1/12, the variance of a uniform distribution on an interval of length 1.

On the other hand,

Var(r2) = Var(Xmax) = integral(x from 0 to 1) (x - n/(n+1))^2 * n x^(n-1) dx = n/[(n+1)^2 (n+2)]

(taking K = 0 without loss of generality, since variance is shift-invariant), which is less than 1/(12n) for n >= 8.
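
A small Python sketch tabulating the two closed-form variances makes the crossover visible (nothing here beyond the formulas above):

for n in range(1, 13):
    v1 = 1 / (12 * n)                  # Var(r1) = 1/(12n)
    v2 = n / ((n + 1) ** 2 * (n + 2))  # Var(r2) = n/((n+1)^2 (n+2))
    print(n, round(v1, 5), round(v2, 5), "r2 smaller" if v2 < v1 else "r1 smaller")

The last column flips from "r1 smaller" to "r2 smaller" exactly at n = 8.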

Title: Re: Statistics problem
Post by BenVitale on Feb 5th, 2008, 10:40am
It's interesting that r2 has a smaller variance than r1 for large n. Usually things involving Xbar are better estimators.

Let E(x|y) denote the expected value of x given y.

How can we prove that E(xy) = E( y E(x|y) )?

Title: Re: Statistics problem
Post by pex on Feb 5th, 2008, 10:56am

on 02/05/08 at 10:40:50, BenVitale wrote:
Let E(x|y) denote the expected value of x given y.

How can we prove that E(xy) = E( y E(x|y) )?

By conditioning, also known as the "Law of Iterated Expectations": generally, E(A) = E( E(A|B) ).

E(XY) = E( E(XY|Y) ) = E( Y E(X|Y) ), where the first equality is the LIE and the second follows because, conditional on Y, Y acts as a constant, so E(XY|Y) = Y * E(X|Y).
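
As a sanity check (a Monte Carlo sketch, not a proof; the particular distributions are arbitrary choices): take Y uniform on {1, 2, 3} and, given Y, let X be uniform on (0, Y), so E(X|Y) = Y/2. Then both sides should approach E(Y^2)/2 = 7/3:

import numpy as np

rng = np.random.default_rng(0)
y = rng.integers(1, 4, size=1_000_000)  # Y uniform on {1, 2, 3}
x = rng.uniform(0, y)                   # given Y = y, X ~ Uniform(0, y)
print(np.mean(x * y))        # direct estimate of E(XY)
print(np.mean(y * (y / 2)))  # estimate of E(Y * E(X|Y))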


