Topic: Char poly of binomial matrix

Eigenray
wu::riddles Moderator Uberpuzzler
Posts: 1948


Re: Char poly of binomial matrix
« Reply #25 on: Mar 22^{nd}, 2007, 11:34am » 

on Mar 22^{nd}, 2007, 6:47am, Barukh wrote: That doesn't help to find the multiplicities in our case, does it? 
I'm not sure what you mean: we know the minimal polynomial of A divides t^{3} − 1, so it has distinct roots. This implies all Jordan blocks of A are 1×1, i.e., A is diagonalizable. The multiplicity of an eigenvalue in the characteristic polynomial is the sum of the sizes of the Jordan blocks corresponding to it, but this is just the number of times it appears in the diagonalization of A.

Quote: Another equation would be b = b’ (because ω, ω^{2} are kind of conjugate) 
This reasoning would be enough in the case p = 2 mod 3, because then ω ∉ F_{p}, so the Frobenius automorphism fixes F_{p} (hence the characteristic polynomial) but interchanges ω and ω^{2}. But in the case p = 1 mod 3, a different argument is needed. For example, there are matrices with entries in F_{7}, with eigenvalues 1, 2, 4, of arbitrary multiplicities.

Quote: ..., and the last is: a − b ≡ Tr(A) mod p. The latter turns out to be equal to −(−1)^{p mod 3}. This gives the answer. It remains then to prove the last identity. 
Exactly. We have of course Tr(A) = Σ_{k=0}^{p−1} C(2k, k). As a sort of hint, what is Σ_{k=0}^{∞} C(2k, k) x^{k}? Why?
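[Both claims above are easy to sanity-check numerically. A minimal sketch, assuming, as elsewhere in this thread, that A(p) is the p×p matrix with entries C(i+j, i) reduced mod p, so that its diagonal entries are the central binomial coefficients C(2k, k):]

```python
from math import comb

def binom_matrix(p):
    # p x p matrix with entries C(i+j, i) reduced mod p
    return [[comb(i + j, i) % p for j in range(p)] for i in range(p)]

def matmul(X, Y, p):
    # Matrix product mod p
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) % p for j in range(n)]
            for i in range(n)]

for p in [2, 3, 5, 7, 11, 13]:
    A = binom_matrix(p)
    I = [[int(i == j) for j in range(p)] for i in range(p)]
    # A^3 = I, so the minimal polynomial of A divides t^3 - 1
    assert matmul(matmul(A, A, p), A, p) == I
    if p > 3:
        # Tr(A) = sum of C(2k, k) = -(-1)^(p mod 3)  (mod p)
        trace = sum(A[k][k] for k in range(p)) % p
        assert trace == (1 if p % 3 == 1 else p - 1)
print("A(p)^3 = I and the trace identity hold for the primes tested")
```

[The helper names are illustrative; the checks are only for the small primes listed.]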





Barukh
Uberpuzzler
Posts: 2276


Re: Char poly of binomial matrix
« Reply #26 on: Mar 23^{rd}, 2007, 8:25am » 

on Mar 22^{nd}, 2007, 11:34am, Eigenray wrote: I'm not sure what you mean: we know the minimal polynomial of A divides t^{3} − 1, so it has distinct roots. 
I meant that using all this information didn't show (at least to me) the way to calculate the multiplicities.

Quote: This implies all Jordan blocks of A are 1×1, i.e., A is diagonalizable. 
This takes me once again to the point we were discussing earlier, which is still not completely clear to me (maybe I simply misunderstand the concept of diagonalization?). Take p = 5. The eigenvalues of A(5) will be, as we know, 1, ω, ω, ω^{2}, ω^{2}, and these same values will be on the diagonal of the diagonalization of A. Now, 4 out of 5 values are not in Z/5. This clearly is not in line with your statement earlier in this thread:

Quote: that a matrix is diagonalizable (over some field F) iff its minimal polynomial splits into distinct linear factors over F. 
But later you mentioned that “over F” should be replaced by “over some extension of F”. Which extension of Z/5 can we take in this case?

Quote: But in the case p=1 mod 3, a different argument is needed. For example, there are matrices with entries in F_{7}, with eigenvalues 1, 2, 4, of arbitrary multiplicities. 
Yes… I felt something is missing here.

Quote: As a sort of hint, what is Σ_{k=0}^{∞} C(2k, k) x^{k}? Why? 
It equals (1−4x)^{−1/2} (because of a nice identity involving convolution of central binomial coefficients).
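[The convolution identity alluded to here is presumably Σ_{i=0}^{k} C(2i, i) C(2(k−i), k−i) = 4^{k}, which says the square of the series is 1/(1−4x), so the series itself is (1−4x)^{−1/2}. A quick sketch checking the identity numerically:]

```python
from math import comb

def central(k):
    # Central binomial coefficient C(2k, k)
    return comb(2 * k, k)

# Convolving the central binomial coefficients with themselves gives 4^k,
# i.e. (sum C(2k,k) x^k)^2 = 1/(1 - 4x).
for k in range(20):
    conv = sum(central(i) * central(k - i) for i in range(k + 1))
    assert conv == 4 ** k
print("convolution identity holds for k < 20")
```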

« Last Edit: Mar 23^{rd}, 2007, 8:27am by Barukh » 



Eigenray


Re: Char poly of binomial matrix
« Reply #27 on: Mar 23^{rd}, 2007, 1:36pm » 

on Mar 23^{rd}, 2007, 8:25am, Barukh wrote: I meant that using all this information didn't show (at least to me) the way to calculate the multiplicities. 
It wasn't really intended to; it was just to show how the minimal polynomial can be read off from the Jordan normal form. In this case, knowing the minimal polynomial gives you information about the Jordan normal form.

Quote: Take p = 5. The eigenvalues of A(5) will be, as we know, 1, ω, ω, ω^{2}, ω^{2}, and these same values will be on the diagonal of the diagonalization of A. Now, 4 out of 5 values are not in Z/5. 
So A(5) is not diagonalizable over F_{5} = Z/5. But it is diagonalizable over F_{25} = F_{5}[ω], a degree 2 extension of F_{5}, obtained by adjoining ω, a solution of x^{2}+x+1 = 0. That is, F_{25} is isomorphic to F_{5}[x]/(x^{2}+x+1), where ω corresponds to the equivalence class of x. Explicitly, over this field, a basis of eigenvectors is (1, 3, 3, 3, 1), (1, 1, 1+2ω, 0, ω^{2}), (0, 1, 2ω, 1+ω, 0), (1, 1, 1+2ω^{2}, 0, ω), (0, 1, 2ω^{2}, 1+ω^{2}, 0), with eigenvalues 1, ω, ω, ω^{2}, ω^{2}, respectively.

Quote: This clearly is not in line with your statement earlier in this thread: a matrix is diagonalizable (over some field F) iff its minimal polynomial splits into distinct linear factors over F 
That statement is true. Over F_{5}, the minimal polynomial factors into irreducibles as (x−1)(x^{2}+x+1). It doesn't split into linear factors, but it's still separable, because over the field F_{5}[ω] it factors into distinct linear factors (x−1)(x−ω)(x−ω^{2}). So the matrix is diagonalizable over F_{5}[ω], but not over F_{5}.

Quote: But later you mentioned that “over F” should be replaced by “over some extension of F”. 
When I said that, I was referring to

Quote: x^{3}−1, which has distinct roots (of degree no more than 2) over F_{p} iff p ≠ 3 
What I meant was: if you consider x^{3} − 1 as a polynomial, not over Q, but with coefficients in F_{p}, then it "has distinct roots" iff p ≠ 3. And when I say "has distinct roots," I mean it's separable: it splits into distinct linear factors over some extension of the field containing its coefficients. In this case, the splitting field has degree 1 or 2 over F_{p}. I guess I should have used the term "separable," because it depends only on the polynomial (compared to the question "does f have a root?", which depends on the field you're working in). That is, if f has coefficients in some field F ⊆ K, then f is separable viewed as a polynomial over F iff it's separable as a polynomial over K. (Although the term separable is also used to mean only that each irreducible factor has distinct roots in some extension; compare PlanetMath to Wikipedia. With this usage, it does depend on the field.)

Quote: Yes… I felt something is missing here. 
What do you know about A and A^{−1}?

Quote: It equals (1−4x)^{−1/2} (because of a nice identity involving convolution of central binomial coefficients). 
The other way to think of it is because (1−4x)^{−1/2} = Σ_{k} C(−1/2, k) (−4x)^{k} = Σ_{k} C(2k, k) x^{k}
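[This last identity can be checked term by term with exact rational arithmetic. A sketch; `binom_general` is an illustrative helper for the generalized binomial coefficient:]

```python
from fractions import Fraction
from math import comb, factorial

def binom_general(a, k):
    # Generalized binomial coefficient C(a, k) = a(a-1)...(a-k+1) / k!
    prod = Fraction(1)
    for j in range(k):
        prod *= a - j
    return prod / factorial(k)

# Coefficient of x^k in (1 - 4x)^(-1/2) is C(-1/2, k) * (-4)^k;
# it should equal the central binomial coefficient C(2k, k).
for k in range(15):
    assert binom_general(Fraction(-1, 2), k) * (-4) ** k == comb(2 * k, k)
print("C(-1/2, k)(-4)^k = C(2k, k) for k < 15")
```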

« Last Edit: Mar 23^{rd}, 2007, 2:03pm by Eigenray » 



Barukh


Re: Char poly of binomial matrix
« Reply #28 on: Mar 25^{th}, 2007, 6:24am » 

Quote: The other way to think of it is because (1−4x)^{−1/2} = Σ_{k} C(−1/2, k) (−4x)^{k} = Σ_{k} C(2k, k) x^{k} 
Oh, I see. That’s a very simple demonstration. Here’s how I did it (actually, using the approach taken in the aforementioned paper).

I started from the following identity:

1·3·5·…·(2k−1) = 2^{−k}(k+1)(k+2)…(2k),

which is simply demonstrated by induction. Now, replace every term i on the left with (p−i), and take it modulo p:

(p−1)(p−3)…(p−2k+1) ≡ (−2)^{−k}(k+1)(k+2)…(2k) mod p.

Divide both sides by k!·2^{k} to get C((p−1)/2, k) ≡ C(2k, k)(−4)^{−k} mod p. Therefore (the summation is from 0 to (p−1)/2):

Tr(A) = Σ C(2k, k) ≡ Σ C((p−1)/2, k)(−4)^{k} = (binomial theorem!) = (−3)^{(p−1)/2} mod p.

Using the results from quadratic residues theory, we obtain:

(−3)^{(p−1)/2} mod p = (−3 | p) = (p | 3) = −(−1)^{p mod 3},

where (x | y) is the Legendre symbol.

Eigenray, I apologize for this lengthy demonstration, yours is much more concise, but I thought that this way more people will follow the discussion. I must say that I really enjoyed working on this problem!
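[A short numerical check of the chain of congruences above (a sketch; `trace_mod` is an illustrative helper, and Tr(A(p)) is taken to be Σ_{k<p} C(2k, k) as earlier in the thread):]

```python
from math import comb

def trace_mod(p):
    # Tr(A(p)) = sum_{k < p} C(2k, k) mod p (only k <= (p-1)/2 contribute)
    return sum(comb(2 * k, k) for k in range(p)) % p

for p in [5, 7, 11, 13, 17]:
    m = (p - 1) // 2
    # The congruence C((p-1)/2, k) * (-4)^k = C(2k, k)  (mod p) ...
    for k in range(m + 1):
        assert comb(m, k) * pow(-4, k, p) % p == comb(2 * k, k) % p
    # ... sums, via the binomial theorem, to (1 - 4)^((p-1)/2) = (-3)^((p-1)/2)
    assert trace_mod(p) == pow(-3, m, p)
print("the congruence and the trace evaluation check out")
```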

« Last Edit: Mar 25^{th}, 2007, 6:39am by Barukh » 



Eigenray


Re: Char poly of binomial matrix
« Reply #29 on: Mar 25^{th}, 2007, 2:42pm » 

on Mar 25^{th}, 2007, 6:24am, Barukh wrote:Using the results from quadratic residues theory, we obtain 
We don't actually need quadratic reciprocity here (or rather, the special case we need has a short proof). For p > 3 an odd prime, −3 is a square mod p iff the equation x^{2} + x + 1 = 0 has a solution in F_{p} (since (2x+1)^{2} = −3). But any solution will be a primitive 3rd root of unity; since F_{p}^{*} is cyclic, such an element exists iff p = 1 mod 3. So (−3 | p) = (p | 3).

Quote: I must say that I really enjoyed working on this problem! 
And thank you for working on it. But it's not done yet! So, what is the relation between A(p) and A(p^{k}) that led me to this question in the first place?
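[The special case above can be confirmed with Euler's criterion, which says a nonzero a is a square mod an odd prime p iff a^{(p−1)/2} ≡ 1 mod p. A minimal sketch:]

```python
# Euler's criterion check: -3 is a square mod p exactly when p = 1 mod 3,
# for odd primes p > 3.
for p in [5, 7, 11, 13, 17, 19, 23, 29, 31, 37]:
    minus3_is_square = pow(-3, (p - 1) // 2, p) == 1
    assert minus3_is_square == (p % 3 == 1)
print("(-3 | p) = 1 exactly when p = 1 mod 3, for the primes tested")
```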





Barukh


Re: Char poly of binomial matrix
« Reply #30 on: Mar 29^{th}, 2007, 10:10am » 

on Mar 25^{th}, 2007, 2:42pm, Eigenray wrote: But it's not done yet! So, what is the relation between A(p) and A(p^{k}) that led me to this question in the first place? 
Maybe the following is relevant (assuming k = 2 for a moment): from the previously mentioned property, it follows that

A_{mp+i, np+j}(p^{2}) = A_{mn}(p) A_{ij}(p).

Putting it differently, A(p^{2}) may be divided into p^{2} blocks B_{ij} of size p×p each, so that B_{ij} = A_{ij}(p)·A(p). I learned just today that this is called the matrix direct, or Kronecker, product of A(p) with itself. The resulting matrix has eigenvalues equal to all pairwise products of eigenvalues of the base matrices. In our case, the eigenvalues of A(p) form a group under multiplication, therefore A(p^{2}) will have the same eigenvalues. Their multiplicities are easily computed. This is easily extended to any natural k.
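[This block structure, which follows from Lucas' theorem, can be verified directly. A sketch, assuming A(n) is the n×n matrix with entries C(i+j, i) reduced mod p (the helper names are illustrative):]

```python
from math import comb

def binom_matrix(n, p):
    # n x n matrix with entries C(i+j, i) reduced mod p
    return [[comb(i + j, i) % p for j in range(n)] for i in range(n)]

def kron(X, Y, p):
    # Kronecker product of X and Y, entries reduced mod p:
    # K[a*len(Y)+c][b*len(Y)+d] = X[a][b] * Y[c][d]
    nx, ny = len(X), len(Y)
    return [[X[a][b] * Y[c][d] % p for b in range(nx) for d in range(ny)]
            for a in range(nx) for c in range(ny)]

for p in [2, 3, 5]:
    Ap = binom_matrix(p, p)
    Ap2 = binom_matrix(p * p, p)
    assert Ap2 == kron(Ap, Ap, p)   # Lucas' theorem in matrix form
print("A(p^2) = A(p) (x) A(p) for p = 2, 3, 5")
```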





Eigenray


Re: Char poly of binomial matrix
« Reply #31 on: Mar 30^{th}, 2007, 7:07pm » 

on Mar 29^{th}, 2007, 10:10am, Barukh wrote: Putting it differently, A(p^{2}) may be divided into p^{2} blocks B_{ij} of size p×p each, so that B_{ij} = A_{ij}(p)·A(p). 
Yes, precisely. When p = 2, this says that if A = A(q), then A(2q) is

[ A A ]
[ A O ],

which is exactly how Sierpinski's triangle is formed. (In fact, in this case [A(2^{k}) mod 2] and [A(2) mod 2] ⊗ … ⊗ [A(2) mod 2] are equal as integer matrices, not just mod 2.)

Quote: I learned just today that this is called the matrix direct, or Kronecker, product of A(p) with itself. The resulting matrix has eigenvalues equal to all pairwise products of eigenvalues of the base matrices. 
Moreover, if v, w are eigenvectors of A, B, with eigenvalues λ, μ, respectively, then v⊗w is an eigenvector of A⊗B with eigenvalue λμ. And if {v_{i}}, {w_{j}} are bases of V, W, respectively, then {v_{i}⊗w_{j}} is a basis of V⊗W. So we can even give an eigenbasis for A(p^{k}) (for p ≠ 3). Another way to see this: if UAU^{−1}, VBV^{−1} are diagonal (or at least upper triangular), then (U⊗V)(A⊗B)(U⊗V)^{−1} = (UAU^{−1})⊗(VBV^{−1}) is also diagonal (or upper triangular, so we can still read off the eigenvalues).
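[The eigenvector property of the Kronecker product is easy to see on a tiny integer example (all names and the sample matrices below are illustrative):]

```python
def kron(X, Y):
    # Kronecker product of square matrices X and Y
    nx, ny = len(X), len(Y)
    return [[X[a][b] * Y[c][d] for b in range(nx) for d in range(ny)]
            for a in range(nx) for c in range(ny)]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def tensor(v, w):
    # Tensor (Kronecker) product of vectors v and w
    return [x * y for x in v for y in w]

# A has eigenvector (1, 1) with eigenvalue 3; B has (1, -1) with eigenvalue -1.
A = [[2, 1], [1, 2]]
B = [[0, 1], [1, 0]]
v, lam = [1, 1], 3
w, mu = [1, -1], -1

# (A (x) B)(v (x) w) = (lam * mu) * (v (x) w)
t = tensor(v, w)
assert matvec(kron(A, B), t) == [lam * mu * x for x in t]
print("v (x) w is an eigenvector of A (x) B with eigenvalue", lam * mu)
```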





