Author 
Topic: Sum of Matrices (9/3/2002) (Read 1178 times) 

william wu
wu::riddles Administrator
Gender:
Posts: 1291


Sum of Matrices (9/3/2002)
« on: Sep 3^{rd}, 2002, 6:43pm » 

Suppose a set of m real n-by-n matrices has a nonsingular (invertible) sum. Show that some subset of at most n of those matrices also has a nonsingular sum. (Not an easy problem.)
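To get a feel for the statement, here is a quick brute-force check in Python/numpy on a made-up instance (the five rank-one matrix units below are my own example, not part of the problem): each summand is singular, the total is not, and a search over subsets finds a nonsingular sub-sum of size at most n.

```python
import itertools
import numpy as np

def unit(i, j, n=3):
    """Matrix unit: a single 1 in position (i, j); rank 1, hence singular."""
    M = np.zeros((n, n))
    M[i, j] = 1.0
    return M

# Five singular 3x3 matrices whose sum is nonsingular.
mats = [unit(0, 0), unit(1, 1), unit(2, 2), unit(0, 2), unit(1, 0)]
assert abs(np.linalg.det(sum(mats))) > 1e-9

# Search for a subset of size at most n = 3 with a nonsingular sum.
found = None
for r in range(1, 4):
    for idx in itertools.combinations(range(len(mats)), r):
        if abs(np.linalg.det(sum(mats[i] for i in idx))) > 1e-9:
            found = idx
            break
    if found is not None:
        break
print(found)  # (0, 1, 2): the three diagonal units already sum to the identity
```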

« Last Edit: Sep 16^{th}, 2002, 1:57pm by william wu » 
[ wu ] : http://wuriddles.com / http://forums.wuriddles.com



Pietro K.C.
Full Member
Gender:
Posts: 213


Re: Sum of Matrices
« Reply #1 on: Sep 6^{th}, 2002, 12:18pm » 

Still working on it... haven't had a chance to really get going yet.

For the case n = 2, let us represent a matrix by (v1, v2), where v1 and v2 are 2-dimensional column vectors. For instance, if a matrix A is given by

[ a b ]
[ c d ]

then v1 = (a, c) and v2 = (b, d). We are going to induct on m, the number of matrices in the sum. For m = 1, the matrix itself is nonsingular, so the theorem is true.

Now suppose the m matrices A1, ..., Am sum to a nonsingular matrix N, so that

(v(11), v(21)) + ... + (v(1m), v(2m)) = (n1, n2),

where n1 is not a multiple of n2 and neither one is zero. Let Am be the matrix of greatest rank among A1, ..., Am. If its rank is two, then it alone is a nonsingular matrix, and we are done. If its rank is zero, all of A1, ..., Am are the zero matrix and cannot therefore sum to N. We can also assume that none of A1, ..., Am is the zero matrix, since that would reduce the problem to fewer matrices, and the induction hypothesis takes care of it.

So let Am's rank be one, and without loss of generality let v(1m) be nonzero. Let c be the real number such that v(2m) = c*v(1m) (we know that v(1m) is nonzero). If there were no matrix Ak such that v(1k) is not a multiple of v(1m), the sum A1 + ... + Am could not be nonsingular; hence there is at least one such matrix. If Ak + Am is nonsingular, we are done. It is singular if and only if v(2k) = c*v(1k) for the same c as above.

We then divide the set {A1, ..., A(m-1)} into two categories: the matrices Ai for which v(1i) is a multiple of v(1m), and those for which it isn't. As established above, the second category is of the form (v(1i), c*v(1i)), and the first is (k*v(1m), t*v(1m)). If there is some Ai in the first category such that t/k differs from c, then this Ai summed with some Aj in the second category is nonsingular, as can easily be checked.

If there is not, let {(k1*v(1m), t1*v(1m)), ..., (ks*v(1m), ts*v(1m))} be the first category and {(w1, c*w1), ..., (wr, c*wr)} be the second. The sum of both plus Am is the matrix

((1 + sum ki)*v(1m) + sum wi, (c + sum ti)*v(1m) + c*(sum wi)).

Since each ti = c*ki, multiplying the first column by c yields the second, so this matrix is singular, contradicting the nonsingularity of N. Hence at least one of our suppositions is wrong, and they were all necessary for the nonexistence of a pair of matrices with nonsingular sum. So such a pair exists, as we wished to prove.

I know this proof is still clumsy, and worse yet is just for n = 2, but maybe it will help someone figure out a better way.
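The n = 2 statement can also be stress-tested numerically. The randomized check below is my own sketch (small random integer 2x2 matrices, with a determinant threshold standing in for exact arithmetic); it looks for a counterexample and finds none.

```python
import itertools
import numpy as np

rng = np.random.default_rng(2002)
checked = 0
for _ in range(300):
    m = int(rng.integers(2, 7))
    mats = rng.integers(-2, 3, size=(m, 2, 2)).astype(float)
    if abs(np.linalg.det(mats.sum(axis=0))) < 1e-9:
        continue  # the premise (nonsingular total) fails; skip this draw
    # Some subset of size <= 2 should already have a nonsingular sum.
    ok = any(
        abs(np.linalg.det(sum(mats[i] for i in idx))) > 1e-9
        for r in (1, 2)
        for idx in itertools.combinations(range(m), r)
    )
    assert ok, "counterexample found?!"
    checked += 1
print(checked, "instances checked, no counterexample")
```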

« Last Edit: Sep 6^{th}, 2002, 12:27pm by Pietro K.C. » 
"I always wondered about the meaning of life. So I looked it up in the dictionary under 'L' and there it was: the meaning of life. It was not what I expected." (Dogbert)



James Fingas
Uberpuzzler
Gender:
Posts: 949


Re: Sum of Matrices
« Reply #2 on: Sep 9^{th}, 2002, 12:19pm » 

Pietro: sorry about using a very different notation!

I have a solution which doesn't work. But I'm not sure if it's a useful non-working solution, or a (more common) useless non-working solution.

REPLACEMENT VECTORS

We are going to choose at most n of the m matrices to make up a rank-n sum. We will do this by examining the given invertible sum row by row, and choosing matrices to maintain the essential linear independence of rows. Call the m matrices A_{1} .. A_{m}, and let sum_{i=1..m} A_{i} = M. We are going to be generating a matrix S, which is the sum of B_{1} .. B_{r}, where r <= n and all the B's are chosen from the A's.

1) Look at the rows of M, m_{i}, starting with the first row. We know that M is invertible, and thus all rows of M are linearly independent. Find the part of m_{1} which is orthogonal to all of m_{2} .. m_{n}, and call this part mhat_{1}.

2) Now examine the A matrices and find out which ones contribute to this mhat_{1}. All A matrices whose first rows have a component in the direction of mhat_{1} are candidates. Pick any one of these and call it B_{1}; it will become part of the S matrix. Call the first row of this matrix t_{1}.

3) Next, we do the same for row two, except that we generate mhat_{2} to be orthogonal to t_{1} and m_{3} .. m_{n}. Doing so, we generate t_{2}, and decide on our second matrix, B_{2}, to include in the sum. Note that if B_{2} = B_{1}, there is no problem, as it is already included in the solution.

4) Repeat for every row, generating mhat_{i} so that it is orthogonal to t_{1} .. t_{i-1} and m_{i+1} .. m_{n}.

5) Now we have a series of vectors t_{1} .. t_{n} which are linearly independent, one in each of the n rows, and we have the set of at most n matrices, B_{1} .. B_{r}, which we collected to form S.

6) The only problem is that when we form the sum S = sum_{i=1..r} B_{i}, we cannot guarantee that S has linearly independent rows.

There are two problems. First, the row S_{i} is not guaranteed to have any component in the direction mhat_{i}. Even though we introduce a component in direction mhat_{i} by including B_{i} in the sum S, it could be annihilated by the addition of B_{i+1} .. B_{r}. I believe that this deficiency can be overcome through judicious choices in step 2, but there is a further problem.

Second, even if the rows S_{1} .. S_{n} all have the required components in the directions mhat_{1} .. mhat_{n} respectively, they still might not be linearly independent. For example, for n = 3, we could have:

mhat_{1} = [1,0,0]   S_{1} = [1,1,0]
mhat_{2} = [0,1,0]   S_{2} = [1,1,0]
mhat_{3} = [0,0,1]   S_{3} = [0,0,1]

In this case, the S matrix is not invertible, because even though S_{1} points partly in direction mhat_{1}, it still has a component in direction mhat_{2} sufficient to make it the same as S_{2}.

So there's my non-working solution. Maybe that will give somebody an idea?
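The failure mode in the second problem is easy to verify directly; here is a small numpy check of the n = 3 example above:

```python
import numpy as np

# Rows from the example: each S_i has a nonzero component along mhat_i
# (here mhat_i = e_i), yet S is singular because S_1 = S_2.
S = np.array([[1.0, 1.0, 0.0],
              [1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
mhat = np.eye(3)
components = [float(S[i] @ mhat[i]) for i in range(3)]
rank = int(np.linalg.matrix_rank(S))
print(components, rank)  # [1.0, 1.0, 1.0] 2: all components present, S not invertible
```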


Doc, I'm addicted to advice! What should I do?



Pietro K.C.
Full Member
Gender:
Posts: 213


Re: Sum of Matrices
« Reply #3 on: Sep 9^{th}, 2002, 11:00pm » 

VERY beautiful idea about the mhats!!! Too bad it's 3am and I just saw your post... got class at 8am, this'll have to wait a bit.

(13h50) I'm back. I thought today of the following theorem, which, once stated, is easily proved by contradiction: if, to an nxn matrix of rank n, we add another of rank k, the rank of the resulting matrix cannot be less than n - k. I think it can be generalized, but I'm not in the mood to do it; the main idea seems to be captured in the above statement. What interests me now is the problem, which maybe the theorem will help solve. By the way, the mhats do seem promising, but I'm still not quite sure how to fix the proof.
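The theorem stated above (adding a rank-k matrix to a full-rank nxn matrix leaves rank at least n - k) is easy to check numerically; here is a randomized numpy sketch of my own (random integer matrices, with the rank-k summand built as a product of thin factors):

```python
import numpy as np

rng = np.random.default_rng(8)
n = 4
violations = 0
for _ in range(200):
    M = rng.integers(-3, 4, (n, n)).astype(float)
    if np.linalg.matrix_rank(M) < n:
        continue  # we need M of full rank n
    k = int(rng.integers(0, n + 1))
    # A random matrix of rank at most k, built as a product of thin factors.
    A = (rng.integers(-3, 4, (n, k)) @ rng.integers(-3, 4, (k, n))).astype(float)
    if np.linalg.matrix_rank(M + A) < n - np.linalg.matrix_rank(A):
        violations += 1
print(violations)  # 0: rank(M + A) >= n - rank(A) on every sample
```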

« Last Edit: Sep 10^{th}, 2002, 9:57am by Pietro K.C. » 



James Fingas
Uberpuzzler
Gender:
Posts: 949


Re: Sum of Matrices
« Reply #4 on: Sep 11^{th}, 2002, 10:20am » 

I've come up with another proof that doesn't work! Please, hold your applause...

Induction on rank:

1) We will generalize the problem to say: for any m nxn matrices that sum to a rank-p matrix, it is possible to find a subset of size <= p of them which sums to a matrix of rank >= p. (Although this proof doesn't work, this statement may still be true.)

2) For p = 1, it is easy to show. Given m nxn matrices that sum to a rank-1 matrix, at least one of them must have nonzero rank. Take any such matrix, and we have solved the problem.

3) Now for the inductive step. Given m matrices that sum to a rank-(p+1) matrix, we must show that it is possible to find a subset of them, of size <= p+1, which sums to a matrix of rank >= p+1.

4) A little exploration of the problem: we know that various combinations of the m matrices will sum to have different ranks. Now let us form all possible sums (there are 2^{m} of them), and list them in order of decreasing rank.

5) Take the sum with the largest rank less than or equal to p. Call the rank of this sum r, and let A be the set of matrices used to make the sum. By the induction hypothesis, we can find a subset of A (call the subset B) such that the B matrices sum to have rank >= r, with |B| <= r.

6) Since the rank of the sum of B is >= r, there are two possibilities: the sum has rank r, or the sum has rank > p. Note that it is not possible for this sum to have a rank strictly between r and p+1, because we know r is the largest possible rank <= p for the sum of any subset of the original m matrices. If the rank is > p, then the inductive step holds.

7) If the rank is equal to r, then consider adding one more matrix to the set B, to make a set B'. The sum of B' will either have rank <= r, or rank > p. If any possible set B' has a sum of rank > p, then the inductive step holds. If all possible B' sum to rank <= r, then surely that would violate our assumption that all m matrices sum to have rank p+1. Or so I thought!
However, sometimes all possible B' have sums with rank <= r! Take m = 3, p = 2:

A_{1} = [ 1 0 0 ]    A_{2} = [ -1 0 0 ]    A_{3} = [ -1 0 0 ]
        [ 0 0 0 ]            [  0 1 0 ]            [  0 0 1 ]
        [ 0 0 0 ]            [  0 0 0 ]            [  0 0 0 ]

Here's the sum:

[ -1 0 0 ]
[  0 1 1 ]
[  0 0 0 ]

We must find a subset of size <= 2 whose sum has rank >= 2. So we list all the subsets and choose one with rank 1; the subset we choose is just the first matrix. The set B is equal to the set A with no additional work. However, when we make B', we find that neither matrix 2 nor matrix 3 can add any rank to the set B.

Now this example is trivial, but it disproves my proof.

Pietro, I think what you're saying about the ranks is true. I think that in general:

|rank(A) - rank(B)| <= rank(A+B) <= rank(A) + rank(B)
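Here's a quick numpy check of the example (note: I'm reading the leading entries of the second and third matrices as -1; the minus signs appear to have been lost in the forum formatting, and without them the stated ranks don't come out):

```python
import numpy as np

A1 = np.array([[1.0, 0, 0], [0, 0, 0], [0, 0, 0]])
A2 = np.array([[-1.0, 0, 0], [0, 1, 0], [0, 0, 0]])
A3 = np.array([[-1.0, 0, 0], [0, 0, 1], [0, 0, 0]])

r_all = int(np.linalg.matrix_rank(A1 + A2 + A3))  # the full sum has rank p = 2
r12 = int(np.linalg.matrix_rank(A1 + A2))         # adding matrix 2 to B = {A1} gains nothing
r13 = int(np.linalg.matrix_rank(A1 + A3))         # likewise for matrix 3
r23 = int(np.linalg.matrix_rank(A2 + A3))         # but this pair does reach rank 2
print(r_all, r12, r13, r23)  # 2 1 1 2
```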





James Fingas
Uberpuzzler
Gender:
Posts: 949


Re: Sum of Matrices
« Reply #5 on: Sep 11^{th}, 2002, 10:26am » 

I just had a cool thought.

EQUIVALENT PROBLEM: Suppose a set of m real n-by-n matrices sums to the nxn identity matrix. Show that some subset of at most n of these also has a nonsingular sum.

All I did was multiply all the matrices by the inverse of their sum! Maybe this problem is easier...

ADDED: Here's another thing: the sum of the matrices that you don't use doesn't have 1 as an eigenvalue:

(I - sum of the matrices you don't use) v != 0 (for nonzero v)

This is the eigenvalue equation, with lambda = 1.
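The reduction is a one-liner to check numerically. A sketch (the random integer matrices are just a made-up instance): multiplying every summand on the right by S^{-1} makes the new matrices sum to the identity, and since det(T S^{-1}) = det(T)/det(S), the (non)singularity of every subset sum is preserved.

```python
import numpy as np

rng = np.random.default_rng(42)
n, m = 3, 5

# Draw until the total sum is invertible, so the premise holds.
while True:
    mats = [rng.integers(-2, 3, (n, n)).astype(float) for _ in range(m)]
    S = sum(mats)
    if abs(np.linalg.det(S)) > 1e-9:
        break

# Normalize: the new matrices sum to the identity.
S_inv = np.linalg.inv(S)
normed = [A @ S_inv for A in mats]
err = float(np.abs(sum(normed) - np.eye(n)).max())
print(err)  # ~0 up to floating point
```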

« Last Edit: Sep 11^{th}, 2002, 12:36pm by James Fingas » 



ecoist
Senior Riddler
Gender:
Posts: 405


Re: Sum of Matrices (9/3/2002)
« Reply #6 on: Jan 19^{th}, 2007, 3:27pm » 

Here's a proof.

Let A_{i}, i = 1,...,m, be m nxn matrices and assume S = A_{1}+...+A_{m} is nonsingular. Form the n x nm block matrix B = [A_{1} ... A_{m}]. If B has rank less than n, then there is a nonzero 1 x n row vector v^{t} such that v^{t}B = 0. This implies that also v^{t}S = 0, whence S is singular, contrary to our assumption. Thus B has rank n.

Hence there are n columns of B which are linearly independent. These columns are contained in at most n of the A_{i}. The corresponding block matrix B', consisting of those at most n A_{i}, thus has rank n; and so the corresponding sum, S', of these matrices is nonsingular.

A similar argument shows: if a sum of m nxn matrices has rank k, then there exist at most k of the summands whose sum has rank k.
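The first step of the argument, that the block matrix B must have full row rank, can be illustrated in numpy (the four matrix-unit summands are my own toy instance):

```python
import numpy as np

def unit(i, j, n=3):
    """Matrix unit: a single 1 in position (i, j)."""
    M = np.zeros((n, n))
    M[i, j] = 1.0
    return M

# Four singular matrices with a nonsingular sum.
mats = [unit(0, 0), unit(1, 1), unit(2, 2), unit(0, 2)]
S = sum(mats)
assert abs(np.linalg.det(S)) > 1e-9

# Any row vector v with v^t B = 0 would also satisfy v^t S = 0, so a
# nonsingular S forces the 3 x 12 block matrix B to have full row rank.
B = np.hstack(mats)
rank_B = int(np.linalg.matrix_rank(B))
print(rank_B)  # 3
```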

« Last Edit: Jan 20^{th}, 2007, 6:18am by ecoist » 



Eigenray
wu::riddles Moderator Uberpuzzler
Gender:
Posts: 1948


Re: Sum of Matrices (9/3/2002)
« Reply #7 on: Jan 20^{th}, 2007, 11:44am » 

on Jan 19^{th}, 2007, 3:27pm, ecoist wrote:
The corresponding block matrix B', consisting of those at most n A_{i}, thus has rank n; and so the corresponding sum, S', of these matrices is nonsingular.

This doesn't follow. For example, if

B = [ 1 0 | -1 0 | -1 0 ]
    [ 0 0 |  0 1 |  0 1 ],

then most choices of 2 matrices don't work.

Quote:
If a sum of m nxn matrices has rank k, then there exist at most k of the summands whose sum has rank k.

This is equivalent to the given problem, since if the sum has rank >= k, then we may apply the given problem to a nonsingular kxk minor of the sum, and obtain <= k matrices whose sum has rank >= k.
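The counterexample can be checked directly (I'm reading the second and third blocks with leading -1's, since the minus signs seem to have dropped out of the forum text):

```python
import numpy as np

A1 = np.array([[1.0, 0], [0, 0]])
A2 = np.array([[-1.0, 0], [0, 1]])
A3 = np.array([[-1.0, 0], [0, 1]])

B = np.hstack([A1, A2, A3])
rank_B = int(np.linalg.matrix_rank(B))  # 2: B has full row rank

# Columns 1 (from A1) and 4 (from A2) are linearly independent, yet the
# corresponding sum A1 + A2 is singular; that's the gap in the argument.
det_12 = float(np.linalg.det(A1 + A2))
det_13 = float(np.linalg.det(A1 + A3))
print(rank_B, det_12, det_13)
```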





ecoist
Senior Riddler
Gender:
Posts: 405


Re: Sum of Matrices (9/3/2002)
« Reply #8 on: Jan 21^{st}, 2007, 10:47am » 

Ha, ha! What a cretin I am! At least the others knew they had an incomplete, or incorrect, proof! Sigh. Not many arrows left in my quiver.





Eigenray
wu::riddles Moderator Uberpuzzler
Gender:
Posts: 1948


Re: Sum of Matrices (9/3/2002)
« Reply #9 on: Jan 24^{th}, 2007, 10:01am » 

This problem has been bothering me the last few days.

Suppose m > n, and let A_{1},...,A_{m} be given nxn matrices. For a subset S of [m] = {1,...,m}, define the polynomial (in n^{2}m variables)

f_{S} = det( sum_{i in S} A_{i} ).

The claim is then that if, at some point, f_{S} vanishes for all proper subsets S of [m], then f_{[m]} vanishes as well. By the Nullstellensatz, this is equivalent to the statement that some power of f_{[m]} lies in the ideal generated by {f_{S} : S a proper subset of [m]}. In fact, we have

sum_{S} (-1)^{|S|} f_{S} = 0 !

I've never seen this remarkable identity before. For example, it says that for 2x2 matrices,

det(A+B+C) = det(A+B) + det(A+C) + det(B+C) - det(A) - det(B) - det(C),

or for 3x3 matrices, that

det(A+B+C+D) + det(A+B) + det(A+C) + det(A+D) + det(B+C) + det(B+D) + det(C+D)
  = det(A+B+C) + det(A+B+D) + det(A+C+D) + det(B+C+D) + det(A) + det(B) + det(C) + det(D),

an equality of cubics in 36 variables with 384 terms.

Expanding out the formula for the determinant, we need to show that

sum_{S} (-1)^{|S|} sum_{p} sign(p) prod_{i} ( sum_{j in S} A_{j}(i,p_{i}) ) = 0.

To do this, fix a permutation p in S_{n}, and let X_{i,j} = A_{j}(i,p_{i}). By interchanging the order of summation, it suffices to show that

sum_{S} (-1)^{|S|} prod_{i} ( sum_{j in S} X_{i,j} ) = 0.

The LHS above is a sum of monomials of the form X(J) = X_{1,j(1)}*X_{2,j(2)}*...*X_{n,j(n)}. If T = {j(1),...,j(n)}, then X(J) occurs once with coefficient (-1)^{|S|} for each choice of S with T contained in S contained in [m]. Since |T| <= n < m, the total coefficient is thus

sum_{r=0}^{m-|T|} (-1)^{|T|+r} C(m-|T|, r) = (-1)^{|T|} (1-1)^{m-|T|} = 0,

and we are done.
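The identity is easy to test numerically. Here is a small numpy verification of my own over random matrices (not part of the proof): the alternating sum over nonempty subsets vanishes whenever m > n, and generically fails when m <= n.

```python
import itertools
import numpy as np

def residual(mats):
    """sum over nonempty subsets S of (-1)^|S| * det(sum of A_i for i in S)."""
    m = len(mats)
    total = 0.0
    for r in range(1, m + 1):
        for idx in itertools.combinations(range(m), r):
            total += (-1) ** r * np.linalg.det(sum(mats[i] for i in idx))
    return abs(total)

rng = np.random.default_rng(1)
# The identity needs m > n: three 2x2 matrices, and four 3x3 matrices.
res2 = max(residual(rng.standard_normal((3, 2, 2))) for _ in range(10))
res3 = max(residual(rng.standard_normal((4, 3, 3))) for _ in range(10))
# With m <= n it fails: det(A+B) != det(A) + det(B) for generic 2x2 A, B.
res_fail = residual(rng.standard_normal((2, 2, 2)))
print(res2, res3, res_fail)  # first two ~0; the last is generically far from 0
```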





Obob
Senior Riddler
Gender:
Posts: 489


Re: Sum of Matrices (9/3/2002)
« Reply #10 on: Jan 24^{th}, 2007, 4:44pm » 

Nice proof, Eigenray! I just noticed one technicality: we had better be proving the result for complex matrices in order to use the Nullstellensatz. We can then specialize to the real case, if we so desire. We might as well state the result for matrices with entries in an arbitrary field; the proof then just relies on passing to the algebraic closure of the field.

That really is a nice determinant identity. At least at a superficial level, it looks identical to the inclusion-exclusion principle.

« Last Edit: Jan 24^{th}, 2007, 4:45pm by Obob » 



Eigenray
wu::riddles Moderator Uberpuzzler
Gender:
Posts: 1948


Re: Sum of Matrices (9/3/2002)
« Reply #11 on: Jan 25^{th}, 2007, 4:08am » 

The Nullstellensatz isn't necessary for the proof; I only mentioned it by way of motivation. The problem resisted any linear algebra I could throw at it. It was only when I had the idea to think algebro-geometrically (since solving the problem is equivalent to some polynomial identities in the determinants of various subset sums, which isn't otherwise obvious!) that I decided to try to find such an identity. After discovering the n = 2 case (with Groebner bases using Mathematica), the generalization is clear (but still surprising).

I did leave out a step, though. To be precise, what I really showed is that if all subsets of <= k matrices have singular sum, where k >= n, then all subsets of k+1 matrices have singular sum. Taking k = n, n+1, ..., m-1 then gives the result.

As for the identity, I believe the n = 1 case is equivalent to inclusion-exclusion. In fact, there's probably a slick way to prove it by inducting on n, using inclusion-exclusion and/or expansion by minors of something.

« Last Edit: Jan 25^{th}, 2007, 4:34am by Eigenray » 



