# Trace is the derivative of determinant

A question I always had when learning linear algebra is, “what does the trace of a matrix mean conceptually?” For example, the determinant of a matrix is, roughly speaking, the factor by which the matrix expands the volume. The conceptual meaning of trace is not as straightforward, but one way to think about it is

trace is the derivative of determinant at the identity.

Roughly you can think of this in the following way. If you start at the identity matrix and move a tiny step in the direction of a matrix $A$, say to $I + \epsilon A$ where $\epsilon$ is a tiny number, then the determinant changes approximately by $\epsilon$ times $\operatorname{tr}(A)$. In other words, $\det(I + \epsilon A) \approx 1 + \epsilon \operatorname{tr}(A)$. Here $I$ stands for the identity matrix.
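
Here is a quick numerical illustration of this approximation (a minimal check using NumPy; the matrix and the step size below are arbitrary choices, not anything special):

```python
import numpy as np

# Check det(I + eps*A) ≈ 1 + eps*tr(A) for a small eps and an arbitrary A.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
eps = 1e-6

lhs = np.linalg.det(np.eye(4) + eps * A)
rhs = 1 + eps * np.trace(A)

print(lhs, rhs)        # the two values agree to many digits
print(abs(lhs - rhs))  # the error is of order eps**2
```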

One can be very precise about what it means to take the “derivative” of the determinant, so let me do some setup. Let $K$ be either $\mathbb{R}$ or $\mathbb{C}$ (so we are working with real or complex Lie groups; but of course, everything makes sense for algebraic groups over arbitrary fields). Then there is a morphism of Lie groups $\det : GL_n(K) \to K^{\times}$, called the determinant, given by sending a matrix to its determinant. Since we are restricting to invertible matrices, the determinants are nonzero. To check that this is really a morphism of Lie groups (i.e. both a smooth map and a homomorphism of groups), note that the determinant map is a polynomial map in the entries of the matrix (and therefore smooth) and is a group homomorphism by the property that $\det(AB) = \det(A)\det(B)$.

Now, given any smooth map of manifolds $f : M \to N$ which maps a point $p \in M$ to $q = f(p) \in N$, there is an induced linear map $df_p$ from the tangent space of $M$ at $p$ to the tangent space of $N$ at $q$, called the derivative of $f$ at $p$. In particular, if $f$ is a Lie group homomorphism, then it maps the identity point to the identity point, and the derivative at the identity is furthermore a homomorphism of Lie algebras. What this means is that, in addition to being a linear map, it preserves the bracket pairing.

In the case of $GL_n(K)$, the Lie algebra at the identity matrix is called $\mathfrak{gl}_n(K)$. We can think of it as consisting of all $n \times n$ matrices, and the bracket operation is defined by $[A, B] = AB - BA$. The Lie algebra of $K^{\times}$ at $1$ consists of the elements of $K$; since $K^{\times}$ is abelian, the bracket is trivial.

The main claim, which I will prove subsequently, is that this map $d(\det)_I : \mathfrak{gl}_n(K) \to K$, the derivative of the determinant at the identity, is actually the trace. That is, it sends a matrix $A$ to its trace $\operatorname{tr}(A)$, the sum of the entries on the diagonal. Note that since it is a homomorphism of Lie algebras, it preserves the bracket, and we recover the familiar property of trace $\operatorname{tr}(AB - BA) = 0$, so $\operatorname{tr}(AB) = \operatorname{tr}(BA)$.
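
As a quick sanity check of that last identity, here is a symbolic verification with SymPy for generic $3 \times 3$ matrices (the size $3$ is an arbitrary choice):

```python
import sympy as sp

# Verify tr(AB) = tr(BA) for generic 3x3 matrices with symbolic entries.
n = 3
A = sp.Matrix(n, n, lambda i, j: sp.Symbol(f'a{i+1}{j+1}'))
B = sp.Matrix(n, n, lambda i, j: sp.Symbol(f'b{i+1}{j+1}'))

print(sp.expand((A * B).trace() - (B * A).trace()))  # prints 0
```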

We can find the derivative of a smooth map on $GL_n(K)$ directly, since it is an open subset of a vector space (the space of all $n \times n$ matrices). Let $A = (a_{ij})$ be a matrix; then the derivative at the identity evaluated at $A$ is

$$d(\det)_I(A) = \frac{d}{dt}\bigg|_{t = 0} \det(I + tA).$$

$\det(I + tA)$ is a polynomial in $t$, and the number we’re looking for is the coefficient of the $t^1$ term.

We have

$$\det(I + tA) = \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^n (I + tA)_{\sigma(i), i}.$$

Just to get a concrete idea of what this expands to, let’s look when $n = 2$. Then

$$\det(I + tA) = (1 + ta_{11})(1 + ta_{22}) - t^2 a_{12}a_{21} = 1 + t(a_{11} + a_{22}) + t^2(a_{11}a_{22} - a_{12}a_{21}).$$

When $n = 3$,

$$\det(I + tA) = (1 + ta_{11})(1 + ta_{22})(1 + ta_{33}) - t^2 a_{12}a_{21}(1 + ta_{33}) - t^2 a_{13}a_{31}(1 + ta_{22}) - t^2 a_{23}a_{32}(1 + ta_{11}) + t^3 a_{12}a_{23}a_{31} + t^3 a_{13}a_{21}a_{32}.$$

In particular, the coefficient of $t$ is $a_{11} + a_{22} + \cdots + a_{nn} = \operatorname{tr}(A)$. (In fact, see if you can convince yourself that the coefficient of $t^n$ is $\det(A)$.)
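
If you would rather let a computer do the expansion, here is a SymPy sketch of the same check for $n = 3$ (the symbols and the size are only for illustration): it confirms that the coefficient of $t$ is the trace and the coefficient of $t^n$ is the determinant.

```python
import sympy as sp

# Expand det(I + t*A) for a generic 3x3 matrix A and read off coefficients.
n = 3
t = sp.Symbol('t')
A = sp.Matrix(n, n, lambda i, j: sp.Symbol(f'a{i+1}{j+1}'))

p = sp.expand((sp.eye(n) + t * A).det())

print(sp.simplify(p.coeff(t, 1) - A.trace()))  # 0: linear coefficient is tr(A)
print(sp.simplify(p.coeff(t, n) - A.det()))    # 0: top coefficient is det(A)
```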

See some discussion of the meaning of trace.

Acknowledgements: Thanks to Ben Wormleighton for originally telling me the slogan “trace is the derivative of determinant”, and for teaching me about Lie groups and Lie algebras.

To add: discussion of Jacobi’s formula, exponential map

# Determinant of transpose

An important fact in linear algebra is that, given a square matrix $A$, $\det(A) = \det(A^T)$, where $A^T$ is the transpose of $A$. Here I will prove this statement via explicit computation, and I will try to do this as cleanly as possible. We may define the determinant of an $n \times n$ matrix $A = (a_{ij})$ by

$$\det(A) = \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^n a_{\sigma(i), i}.$$

Here $S_n$ is the set of permutations of the set $\{1, 2, \dots, n\}$, and $\operatorname{sgn}(\sigma)$ is the sign of the permutation $\sigma$. This formula is derived from the definition of the determinant via exterior algebra. One can check by hand that this gives the familiar expressions for the determinant when $n = 2$ or $3$.

Now, since $(A^T)_{ij} = a_{ji}$, we have

$$\det(A^T) = \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^n (A^T)_{\sigma(i), i} = \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^n a_{i, \sigma(i)}.$$

The crucial observation here is that we may rearrange the product inside the summation so that the second indices are increasing. Let $\tau = \sigma^{-1}$. Then the product inside the summation is

$$\prod_{i=1}^n a_{i, \sigma(i)} = \prod_{j=1}^n a_{\tau(j), j}.$$

Combining this with the fact that $\operatorname{sgn}(\sigma) = \operatorname{sgn}(\tau)$, our expression simplifies to

$$\det(A^T) = \sum_{\sigma \in S_n} \operatorname{sgn}(\tau) \prod_{j=1}^n a_{\tau(j), j}.$$

Noticing that the sum is the same sum if we replace all $\sigma$s with $\tau$s (as $\sigma$ ranges over $S_n$, so does $\tau = \sigma^{-1}$), we see that this equals $\det(A)$. So $\det(A) = \det(A^T)$.
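
To make the permutation-sum formula concrete, here is a short self-contained Python sketch that computes the determinant directly from the formula and checks $\det(A) = \det(A^T)$ on an arbitrary example matrix:

```python
from itertools import permutations
from math import prod


def sign(perm):
    # Sign of a permutation of (0, ..., n-1): (-1)^(number of inversions).
    inversions = sum(perm[i] > perm[j]
                     for i in range(len(perm))
                     for j in range(i + 1, len(perm)))
    return (-1) ** inversions


def det(A):
    # det(A) = sum over sigma in S_n of sgn(sigma) * prod_i a_{sigma(i), i}.
    n = len(A)
    return sum(sign(p) * prod(A[p[i]][i] for i in range(n))
               for p in permutations(range(n)))


def transpose(A):
    return [list(row) for row in zip(*A)]


A = [[2, 7, 1],
     [0, 3, 4],
     [5, 6, 8]]

print(det(A), det(transpose(A)))  # both print 125
```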

I wonder if there is a more conceptual proof of this? (By “conceptual”, I mean a proof based on exterior algebra, bilinear pairings, etc…)

# More on algebraic numbers

A complex number is algebraic if it is the root of some nonzero polynomial with rational coefficients. $\sqrt{2}$ is algebraic (e.g. the polynomial $x^2 - 2$); $i$ is algebraic (e.g. the polynomial $x^2 + 1$); $\pi$ and $e$ are not. (A complex number that is not algebraic is called transcendental.)

Previously, I wrote some blog posts (see here and here) which sketched a proof of the fact that the sum and product of algebraic numbers is also algebraic (and more). This is not an obvious fact, and to prove this requires some amount of field theory and linear algebra. Nevertheless, the ideas in the proof lead the way to a better understanding of the structure of the algebraic numbers and towards the theorems of Galois theory. In those posts, I tried to introduce the minimum algebraic machinery necessary in order to state and prove the main result; I don’t think I entirely succeeded.

However, there is a more direct approach, one which also allows us to find a polynomial that has $\alpha + \beta$ (or $\alpha\beta$) as a root, for algebraic numbers $\alpha$ and $\beta$. That is the subject of this post. Instead of trying to formally prove the result, I will illustrate the approach for a specific example: showing $\sqrt{2} + \sqrt{3}$ is algebraic.

This post will assume familiarity with the characteristic polynomial of a matrix, and not much more. (In particular, none of the algebra from the previous posts)

## A case study

Define the set $\mathbb{Q}(\sqrt{2}, \sqrt{3}) = \{a + b\sqrt{2} + c\sqrt{3} + d\sqrt{6} : a, b, c, d \in \mathbb{Q}\}$. We will think of this as a four-dimensional vector space, where the scalars are elements of $\mathbb{Q}$, and the basis is $\{1, \sqrt{2}, \sqrt{3}, \sqrt{6}\}$. Every element can be uniquely expressed as $a + b\sqrt{2} + c\sqrt{3} + d\sqrt{6}$, for $a, b, c, d \in \mathbb{Q}$.

We’re trying to prove $\sqrt{2} + \sqrt{3}$ is algebraic. Consider the linear transformation $T$ on $\mathbb{Q}(\sqrt{2}, \sqrt{3})$ defined as “multiply by $\sqrt{2} + \sqrt{3}$”. In other words, consider the linear map $T$ which maps $v \mapsto (\sqrt{2} + \sqrt{3})v$. This is definitely a linear map, since it satisfies $T(v + w) = T(v) + T(w)$ and $T(cv) = cT(v)$ for $c \in \mathbb{Q}$. In particular, we should be able to represent it by a matrix.

What is the matrix of $T$? Well, $T(1) = \sqrt{2} + \sqrt{3}$, $T(\sqrt{2}) = 2 + \sqrt{6}$, $T(\sqrt{3}) = 3 + \sqrt{6}$, and $T(\sqrt{6}) = 3\sqrt{2} + 2\sqrt{3}$. Thus, with respect to the basis $(1, \sqrt{2}, \sqrt{3}, \sqrt{6})$, we can represent $T$ by the matrix

$$M = \begin{pmatrix} 0 & 2 & 3 & 0 \\ 1 & 0 & 0 & 3 \\ 1 & 0 & 0 & 2 \\ 0 & 1 & 1 & 0 \end{pmatrix}.$$

Now, the characteristic polynomial of this matrix, which is defined as $\det(xI - M)$, is $x^4 - 10x^2 + 1$, which has $\sqrt{2} + \sqrt{3}$ as a root. Thus $\sqrt{2} + \sqrt{3}$ is indeed algebraic.
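
Here is the same computation carried out in SymPy, as a check (the matrix is the one written above):

```python
import sympy as sp

# Matrix of "multiply by sqrt(2) + sqrt(3)" in the basis (1, sqrt2, sqrt3, sqrt6).
M = sp.Matrix([
    [0, 2, 3, 0],
    [1, 0, 0, 3],
    [1, 0, 0, 2],
    [0, 1, 1, 0],
])

x = sp.Symbol('x')
p = M.charpoly(x).as_expr()

print(p)                                                # x**4 - 10*x**2 + 1
print(sp.simplify(p.subs(x, sp.sqrt(2) + sp.sqrt(3))))  # 0
```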

## Why it works

The basic reason is the Cayley-Hamilton theorem. It tells us that $M$ should satisfy its characteristic polynomial $p(x)$: $p(M)$ is the zero matrix. But the matrix we get when plugging $M$ into $p$ should correspond to multiplication by $p(\sqrt{2} + \sqrt{3})$; thus $p(\sqrt{2} + \sqrt{3}) = 0$.
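
We can also watch Cayley-Hamilton do its job in this example: plugging the matrix $M$ from above into $p(x) = x^4 - 10x^2 + 1$ really does give the zero matrix (a quick SymPy check):

```python
import sympy as sp

# Cayley-Hamilton in this instance: p(M) = M^4 - 10*M^2 + I is the zero matrix.
M = sp.Matrix([
    [0, 2, 3, 0],
    [1, 0, 0, 3],
    [1, 0, 0, 2],
    [0, 1, 1, 0],
])

print(M**4 - 10 * M**2 + sp.eye(4))  # matrix of zeros
```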

Note that I chose $\sqrt{2} + \sqrt{3}$ randomly. I could have chosen any element of $\mathbb{Q}(\sqrt{2}, \sqrt{3})$ and used this method to find a polynomial with rational coefficients having that element as a root.

At the end of the day, to prove that such a method always works requires the field theory we have glossed over: what $\mathbb{Q}(\alpha, \beta)$ is in general, why it is finite-dimensional, etc. This constructive method, which assumes the Cayley-Hamilton theorem, only replaces the non-constructive “linear dependence” argument in Proposition 4 of the original post.

# Two proofs complex matrices have eigenvalues

Today I will briefly discuss two proofs that every square matrix $A$ over the complex numbers (or more generally, over an algebraically closed field) has an eigenvalue. Notice that this is equivalent to finding a complex number $\lambda$ such that $A - \lambda I$ has nontrivial kernel. The first proof uses facts about “linear dependence” and the second uses determinants and the characteristic polynomial. The first proof is drawn from Axler’s textbook [1]; the second is the standard proof.

## Proof by linear dependence

Let $p(x) = a_nx^n + \cdots + a_1x + a_0$ be a polynomial with complex coefficients. If $T$ is a linear map, we can form the linear map $p(T) = a_nT^n + \cdots + a_1T + a_0I$. We think of this as “$p$ evaluated at $T$”.

Exercise: Show that if $p(x) = q(x)r(x)$, then $p(T) = q(T)r(T)$.

Proof: Let $T$ be a linear map on $\mathbb{C}^n$, and pick a random nonzero vector $v \in \mathbb{C}^n$. Consider the sequence of vectors $v, Tv, T^2v, \dots, T^nv$. This is a set of $n + 1$ vectors in an $n$-dimensional space, so they must be linearly dependent. Thus there exist constants $c_0, c_1, \dots, c_n$, not all zero, such that $c_0v + c_1Tv + \cdots + c_nT^nv = 0$.

Define $p(x) = c_0 + c_1x + \cdots + c_nx^n$, so that $p(T)v = 0$. Then, by the fundamental theorem of algebra, we can factor

$$p(x) = c(x - \lambda_1)(x - \lambda_2)\cdots(x - \lambda_m).$$

By the Exercise, this implies $p(T) = c(T - \lambda_1I)(T - \lambda_2I)\cdots(T - \lambda_mI)$. Since $p(T)$ sends the nonzero vector $v$ to $0$, at least one of the maps $T - \lambda_iI$ has a nontrivial kernel, so $T$ has an eigenvalue.
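
Although this proof is usually called non-constructive, it is fun to act it out numerically. The sketch below (NumPy, with a randomly chosen matrix and vector, so everything is approximate) builds the vectors $v, Tv, \dots, T^nv$, extracts a linear dependence, and checks that at least one root of the resulting polynomial is an eigenvalue of $T$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
T = rng.standard_normal((n, n))
v = rng.standard_normal(n)

# Columns v, Tv, ..., T^n v: n+1 vectors in an n-dimensional space.
cols = [v]
for _ in range(n):
    cols.append(T @ cols[-1])
K = np.column_stack(cols)

# A (numerical) null vector of K gives c_0, ..., c_n with sum_i c_i T^i v = 0.
c = np.linalg.svd(K)[2][-1]

roots = np.roots(c[::-1])        # roots of p(x) = c_n x^n + ... + c_1 x + c_0
eigs = np.linalg.eigvals(T)

# At least one root of p must be an eigenvalue of T.
print(min(abs(r - e) for r in roots for e in eigs))  # tiny, up to rounding error
```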

## Proof by the characteristic polynomial

Proof: We want to show that there exists some $\lambda \in \mathbb{C}$ such that $A - \lambda I$ has nontrivial kernel: in other words, that $A - \lambda I$ is singular. A matrix is singular if and only if its determinant is zero. So, let $p(\lambda) = \det(A - \lambda I)$; this is a polynomial in $\lambda$, called the characteristic polynomial of $A$. Now, every nonconstant polynomial has a complex root, say $\lambda_0$. This implies $\det(A - \lambda_0 I) = 0$, so $A - \lambda_0 I$ is singular and $A$ has an eigenvalue.
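
Here is the determinant-based proof as a small SymPy computation; the $2 \times 2$ matrix below is an arbitrary example, picked because its eigenvalues are not real:

```python
import sympy as sp

A = sp.Matrix([[0, -1],
               [1,  0]])
lam = sp.Symbol('lambda')

p = (A - lam * sp.eye(2)).det()  # characteristic polynomial: lambda**2 + 1
roots = sp.solve(p, lam)         # [-I, I]

# Each root lambda_0 makes A - lambda_0*I singular, hence is an eigenvalue.
for r in roots:
    print(r, (A - r * sp.eye(2)).det())  # determinant is 0 for each root
```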

## Thoughts

To me, it seems like the determinant-based proof is more straightforward, although it requires more machinery. Also, the determinant-based proof is “constructive”, in that we can actually find all the eigenvalues by factoring the characteristic polynomial. On the subject of determinant-based vs. determinant-free approaches to linear algebra, see Axler’s article “Down With Determinants!” [3].

There is a similar situation for the problem of showing that the sum (or product) of two algebraic numbers is algebraic. Here there is a non-constructive proof using “linear dependence” (which I attempted to describe in a previous post) and a constructive proof using the characteristic polynomial (which will hopefully be the subject of a future blog post). A further advantage of the determinant-based proof is that it can be used more generally to show that the sum and product of integral elements over a ring are integral. In this more general context, we no longer have linear dependence available.

## References

1. Sheldon Axler, Linear Algebra Done Right, Springer, 2017.
2. Evan Chen, An Infinitely Large Napkin, available online.
3. Sheldon Axler, “Down with Determinants!”, The American Mathematical Monthly, 102(2), 139, 1995. doi:10.2307/2975348, available online.

# Algebro-geometric proof of Cayley-Hamilton

Here is a sketch of a proof of the Cayley-Hamilton theorem via classical algebraic geometry.

The set of $n \times n$ matrices over an algebraically closed field $k$ can be identified with the affine space $\mathbb{A}^{n^2}$. Let $V$ be the subset of matrices that satisfy their own characteristic polynomial. We will prove that $V$ is in fact all of $\mathbb{A}^{n^2}$. Since affine space is irreducible, it suffices to show that $V$ is closed and contains a non-empty open set.

Fix a matrix $A$. First, observe that the coefficients of the characteristic polynomial of $A$ are polynomials in the entries of $A$. In particular, the condition that a matrix satisfy its own characteristic polynomial amounts to a collection of polynomials in the entries of $A$ vanishing. This establishes that $V$ is closed.

Let $U$ be the set of matrices that have $n$ distinct eigenvalues. A matrix has distinct eigenvalues if and only if its characteristic polynomial has no double roots when it splits. This occurs if and only if the discriminant of the characteristic polynomial is nonzero. The discriminant is a polynomial in the coefficients of the characteristic polynomial, and hence a polynomial in the entries of the matrix. Thus the condition that a matrix have distinct eigenvalues amounts to a polynomial in the entries of the matrix not vanishing. Thus $U$ is open; it is also non-empty (for instance, it contains the diagonal matrix with diagonal entries $1, 2, \dots, n$).
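
To see the discriminant as a polynomial in the matrix entries, here is a small SymPy computation for a generic $2 \times 2$ matrix (the case $n = 2$ is just for illustration):

```python
import sympy as sp

# Discriminant of the characteristic polynomial of a generic 2x2 matrix.
a, b, c, d, x = sp.symbols('a b c d x')
A = sp.Matrix([[a, b],
               [c, d]])

p = A.charpoly(x).as_expr()
disc = sp.discriminant(p, x)

print(sp.expand(disc))  # a**2 - 2*a*d + 4*b*c + d**2, i.e. (a - d)**2 + 4*b*c
```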

Finally, we have to show $U \subseteq V$. A matrix with $n$ distinct eigenvalues is diagonalizable, and it is easy to check that a diagonal matrix satisfies its own characteristic polynomial. The general result follows from the fact that the determinant, and thus the characteristic polynomial, is basis-invariant.
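
Both ingredients of this last step can be checked symbolically; here is a quick SymPy sketch for $n = 2$ (the generic matrices below are just placeholders):

```python
import sympy as sp

x, l1, l2 = sp.symbols('x l1 l2')

# (1) A diagonal matrix satisfies its characteristic polynomial (x - l1)(x - l2).
D = sp.diag(l1, l2)
print(sp.expand(D**2 - (l1 + l2) * D + l1 * l2 * sp.eye(2)))  # zero matrix

# (2) The characteristic polynomial is invariant under conjugation.
a, b, c, d, p, q, r, s = sp.symbols('a b c d p q r s')
A = sp.Matrix([[a, b], [c, d]])
P = sp.Matrix([[p, q], [r, s]])  # assumed invertible, i.e. p*s - q*r != 0

diff = (P * A * P.inv()).charpoly(x).as_expr() - A.charpoly(x).as_expr()
print(sp.simplify(diff))  # 0
```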