Author 
Topic: Lecture 01 (8/26/2002) 

william wu
wu::riddles Administrator
Posts: 1291


Lecture 01 (8/26/2002)
« on: Aug 27^{th}, 2002, 1:53am » 

8/26/2002, Paraphrased Math H90 Notes by William Wu. W. Kahan XXX Evans / XXX Soda XXXXXXX www.cs.berkeley.edu/~wkahan/MathH90

There are a lot more people here than I expected. I suspect that someone has been ... advertising. There is no standard grading scheme. Grades are based on your participation, on how good of a try you give.

Many people in math classes are refugees from English classes. However, knowing the solution to a problem is useless if you can't express the solution such that another person can understand it. Notice that I speak very slowly and precisely. You must learn to articulate clearly.

Now, it may be possible that all the problems I give you are so baffling that you won't be able to solve any of them. And then you'll look at the solutions. More on how to profit from that later. However, whenever you see a solution to a problem you could not solve, what is the most important question that you must ask yourself? "What did the solver know that I didn't know?" Once you can answer that, then you can study what you need to know.

* * *

Some problems have no solutions. Here's an example. Say you have two sets, one comprised of letter variables and the other of numbers:

a, b, c, ...
0, 1, 2, ...

and you can apply operators such as +, -, x, /, ^, () to elements of these two sets to make well-formed expressions. Then you construct a set of equations:

(well-formed expression 1) = 0
(well-formed expression 2) = 0
(well-formed expression 3) = 0
...

The question is: does there exist an assignment of integers to the letter variables in these expressions such that all the equations are satisfied? Can you write a program that determines whether such an assignment exists? These equations are called Diophantine equations: equations whose solutions demand integer values. It has been proven that a program that can solve any set of Diophantine equations can also solve the halting problem.
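[To make the question concrete, here is a minimal Python sketch of the obvious brute-force attempt: search ever-larger boxes of integers for an assignment satisfying every equation. The example system and the bound are invented for illustration. Note what it can and cannot do: it can confirm a solution exists, but when none exists an unbounded search would run forever, and the result cited below says no cleverer program can close that gap.]

```python
from itertools import product

def search_diophantine(equations, num_vars, max_bound=50):
    """Look for an integer assignment satisfying every equation.
    Each equation is a function of num_vars integers that returns 0
    when satisfied. Searches the boxes [-b, b]^num_vars for growing b."""
    for b in range(max_bound + 1):
        for xs in product(range(-b, b + 1), repeat=num_vars):
            if all(eq(*xs) == 0 for eq in equations):
                return xs      # found one: solvability is confirmed
    return None                # not found within the bound: proves nothing

# Hypothetical example system: x^2 + y^2 = 25 and x*y = 12.
sol = search_diophantine([lambda x, y: x * x + y * y - 25,
                          lambda x, y: x * y - 12], num_vars=2)
```

This is only a semi-decision procedure: a "yes" answer eventually surfaces, but there is no general way to return a definite "no".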
But we know from basic computer science courses that the halting problem is unsolvable. Thus, no such program which solves Diophantine equations can exist.

What if we relax the requirements a little bit? Let's say that now the letter variables in our well-formed expressions can take on values in the rationals, rather than just the integers. The rationals are simply ratios of integers. Now, does there exist a program that can solve this? The answer is: we don't know. So a very small change in the problem has thrown us into the unknown.

It's always possible that we are doomed to forever try solving problems that can't even be solved. What's an example of a problem that can't be solved? Well, I can't even give you an example. Why? Consider Goldbach's Conjecture: every even integer > 4 is the sum of 2 odd primes. Currently we don't know the answer. All we can do is get a computer to look at every even integer > 4 and ask it "Are you the sum of 2 odd primes?" until the end of time. We could have the NSA put a supercomputer in a cave and compute these sums of odd primes forever. It would be a great contribution to humanity.

Let's suppose the conjecture is false. Then what does that imply? It implies that there exists an even integer greater than 4 which is NOT the sum of 2 odd primes. Thus, if the conjecture is false, it's verifiably false, because we can present a counterexample. So:

If the conjecture is false, it's verifiably false: some even 2K != p1 + p2 for any odd primes p1, p2.
If the conjecture is true, it may be provable (finite proof) or unprovable (infinite proof).

Note that an "infinite proof" is not a proof, because it's so long that there's no place for you to put the "QED". Now I claim that you can't even prove to me that this problem is unprovable. Why? (pause for a minute or two) Because if you prove that the conjecture is unprovable, then you've proved that the conjecture is true: if the conjecture were false, its counterexample would be a finite proof of falsehood, so an unprovable conjecture cannot be false.

* * *

So some problems are impossible. Others are impractical. For instance, the proof of the 4-color theorem is impractical.
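[The "supercomputer in a cave" described above can be sketched in a few lines of Python; the helper names are mine, and the search is bounded here only so that it terminates. Like the cave machine, this can only ever report a counterexample; no amount of successful checking proves the conjecture.]

```python
def is_prime(n):
    """Trial division -- slow but sufficient for a sketch."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def goldbach_witness(n):
    """For even n > 4, return odd primes (p, q) with p + q = n,
    or None. A None return would disprove the conjecture."""
    for p in range(3, n // 2 + 1, 2):
        if is_prime(p) and is_prime(n - p):
            return p, n - p
    return None

# The cave computer's loop, truncated: check each even integer > 4.
for n in range(6, 10_000, 2):
    assert goldbach_witness(n) is not None
```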
The four color theorem asserts that any map can be colored using only four colors in such a way that adjacent regions (i.e. those sharing a common boundary segment, not just a point) receive different colors. The proof by Appel and Haken in 1976 used a computer to enumerate all the possible 1,476 map configurations, and to show that a 4-coloring exists for each of these configurations. It is a terribly long proof that requires combing through reams of code.

Here's another example of an impractical proof. Consider the following problem: you have a real equation "f(z) = 0" to be solved for a real z. [Draws a curvy graph y = f(x) on the board against an xy plane, and labels one of the zeroes as "z".] How do you solve the equation? (student responses)

One way is to graph it, but that could take too long. Another idea is to apply bounds to portions of the graph: if the signs of the curve differ at the two bounds, you know a zero must exist in between them. However, this method can fail to detect solutions, because the signs of the curve can be identical at both bounds when an even number of zeroes lies in between. Another idea is Newton's method. [Demonstrates.] Another idea is the secant method. [Demonstrates.]

Something very interesting is that no computer program can find a maximum for a general function f(x) within given bounds. Not even approximately! [I don't remember how this comment relates to this train of thought.]

So here's a theorem. If Newton's method converges no matter where you start, then the secant method will also converge. However, the converse of this statement is not always true. This is a very simple theorem. However, the proof is incredibly long. I wrote it, and it is so long, I'm embarrassed to publish it. You can see it here: http://www.cs.berkeley.edu/~wkahan/Math128/RealRoots.pdf

In the 1960s, Michael Rabin and some other smart people discovered something rather dreadful.
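[To make the three root-finding ideas above concrete, here is a minimal Python sketch. The example equation, starting points, and step counts are my own choices, not from the lecture.]

```python
def bisect(f, a, b, tol=1e-12):
    """Sign-change bracketing: if f(a) and f(b) have opposite signs,
    a zero lies in between. If the signs agree, the method reports
    nothing, even though zeroes may exist (as noted above)."""
    if f(a) * f(b) > 0:
        return None
    while b - a > tol:
        m = (a + b) / 2
        if f(a) * f(m) <= 0:
            b = m
        else:
            a = m
    return (a + b) / 2

def newton(f, df, x, steps=60):
    """Newton's method: repeatedly follow the tangent line at x
    down to its x-intercept."""
    for _ in range(steps):
        x = x - f(x) / df(x)
    return x

def secant(f, x0, x1, steps=60):
    """Secant method: like Newton's, but the derivative is replaced
    by the slope of the line through the last two iterates."""
    for _ in range(steps):
        f0, f1 = f(x0), f(x1)
        if f1 == f0:
            break              # secant is flat, or iteration has converged
        x0, x1 = x1, x1 - f1 * (x1 - x0) / (f1 - f0)
    return x1

# A hypothetical test equation (not from the lecture): x^3 - 2x - 5 = 0,
# which has exactly one real root, lying between 2 and 3.
f_ex = lambda x: x**3 - 2*x - 5
df_ex = lambda x: 3*x**2 - 2
```

The secant method's appeal is that it needs no derivative, which is why the theorem above relating its convergence to Newton's is of practical interest.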
Take any sufficiently interesting theory, like number theory or geometry. Order all of its theorems by length [how many characters are required to write down the theorem], from least to greatest. Now try proving each of these theorems, from shortest to longest. So we start with some axioms and gradually work toward longer and longer theorems, and prove these theorems with the restriction that in each proof, you may only cite shorter theorems which you have already proved. The question is: how rapidly does the length of these proofs grow? The answer is exponentially. Meaning that the length grows very quickly. Meaning that most mathematical theorems have very long proofs. So when you take math classes, you are only exposed to relatively simple and short proofs. You do not learn long proofs, which comprise the majority of proofs in mathematics. [Most problems are "impractical".]

* * *

There is no textbook for this course. But here are books you might want to check out. Here is a book by Paul Zeitz called "The Art and Craft of Problem Solving" (1999). He tries to teach people how to solve problems. It's a pretty good book as these books come. However, I can't really recommend it. He tries to teach problem solving by showing how to solve lots of problems, and also by giving some strategic hints for problem solving in general. However, in my experience, I have found that books like these don't help that much. People who are already good at problem solving will read the hints and just find them obvious. But people who are bad at problem solving will just memorize those hints. So all they can do is recite these strategies and hints. And yet, when a different kind of problem appears to which a hint is applicable, they often fail to recognize that it applies.

I went into mathematics because I thought I had a bad memory. I couldn't remember anything. At times I thought my mind was like swiss cheese.
So I thought it would be better to go into a field where you just have to memorize a few things and you could figure everything else out yourself, instead of going into a field like biology, where there's just no hope at all. Well, I was wrong. It turns out that my memory was better than 99% of the population. And even in my old age now, my memory is still better than about 90% of the population. And memorizing solutions actually isn't a bad way to solve problems. When your professors give you problems, or when the administrators of the Putnam exam are deciding what problems to put on there, they all know that these problems have solutions, unless someone has made a terrible mistake along the way.

The best way to learn problem-solving is to do problems. You can either submit your proofs to me, or proofread them amongst yourselves for faster response. This is important because it allows you to see different ways of thinking. I'm a continuum man. I like to think of things as plastic, stretchable structures that I can contort and twist. I like differential equations and approximations. However, other people like to think in terms of arrangements. They like to count all the possibilities, arrange things in different ways. I'm not very good at thinking like that, because I always tend to forget some cases or count something twice. Yet that doesn't mean that I don't try to think in a combinatorial way. Every field is now moving from continuum math to combinatorial (discrete) math. I was responsible for siccing the math department onto Rosen's Discrete Math book when it was in its 2nd edition. Now it's in its 4th edition. And we now have strong math courses in combinatorics, lower division and upper division.

Other books recommended:

Polya & Szego: Problems and Theorems in Analysis. Covers the classical math curriculum (104, 110, 115, 185, ...). A great way to teach yourself math, IF you can teach yourself math.
Most people just get discouraged when they have to look in the back of the book too much, or when they see a solution that's just too brief for them, and they spend a lot of time trying to understand why it actually is the solution.

Hadwiger, Debrunner & Klee: Combinatorial Geometry in the Plane. There are no classes at Berkeley on this. But it sure is interesting.

Kazarinoff: Geometric Inequalities.

Kazarinoff: Analytic Inequalities. Inequalities are difficult for most mathematicians, and very hard for computers. In my 48 years of teaching, all the students who were good at inequalities were never unemployed, except for one who was unemployed for about 2 weeks. He was the kind of guy who didn't like people telling him what to do and needed a special kind of boss. Anyway, if you are good at inequalities, your employment is secure.

I only brought enough books that I could fit in one hand. I could have easily brought 100 books. If you really want a textbook for this class, just get any interesting math book (it could be a textbook) and make sure it has interesting problems. You'll have to sample them a bit, because you don't want a book of drill problems. Most linear algebra books are just drill problems. Get a book with interesting problems, like Rosen's Discrete Mathematics.

* * *

Americans are sloppy in their language use. Most Americans can't tell the difference between these sentences:

Only I like peanut butter.
I only like peanut butter.
I like only peanut butter.

How many meanings do you see? [There are three. Some students only saw two at first.]

Grammar used to be taught as an analytical subject. But it is no longer taught that way, and that's unfortunate. English has tremendous advantages over other languages for expressing technological things precisely. We have tenses. We have a pluperfect to ... [something something something] ... to differentiate between what happened, and what's happening now, and what will happen [etc.].
And it's all logical. Chinese is not a good language for precise expression. Their tenses are often ambiguous. Russian is an even better language than English, because it has even more tenses. So how can you learn to express yourself precisely? Well, you can practice by doing math proofs. But this admittedly can be very frustrating and tiresome. An alternative is to read good English literature. I liked Winston Churchill. Another is Len Deighton, who writes spy novels. His diction is perfect.

I talk this way to provide a model for you to express yourself precisely. However, my language is very archaic. I'm 113 years outdated. If you were to talk like this in the street, people would ask you why you were so stilted. People in the South think I'm a really nasty fellow, because I can't imitate a drawl. I used to be able to imitate a Southern drawl. Actually I still can, but I don't do that anymore, because it makes me sound too much like the President. And that's something I cannot bear.

* * *

Problems:

1. A column vector x is distributed randomly and uniformly over the surface of the unit sphere in an N-dimensional Euclidean space. What is the expected (mean) value of the squared length of x's projection onto a K-dimensional plane through the sphere's center?

2. The complex Principal Square Root sqrt(z) of a complex number z is the square root (one of two if z != 0) whose Imaginary Part Im(sqrt(z)) has the same sign as Im(z). (Yes, zero has a sign.) Now let

f(z) := sqrt(1 - z^2),    g(z) := sqrt(1-z) * sqrt(1+z)
F(z) := sqrt(z^2 - 1),    G(z) := sqrt(z-1) * sqrt(z+1)

Over what region in the complex plane does f(z) = g(z)? Over what region in the complex plane does F(z) = G(z)?
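[A sketch for experimenting with the two problems numerically, in Python. Two choices here are my assumptions, not part of the problems: for Problem 1, the K-dimensional plane is taken to be the span of the first K coordinate axes (by symmetry, any such plane should do); for Problem 2, cmath.sqrt is used as the principal square root, and its behavior exactly on the branch cuts need not match Kahan's signed-zero convention. The code explores the questions; it does not give away the answers.]

```python
import cmath
import random

# Problem 1: Monte Carlo estimate of the mean squared projection length.
# A spherically symmetric Gaussian vector, once normalized, is uniform
# on the unit sphere; we project onto the first K coordinate axes.
def mean_sq_projection(N, K, trials=20000):
    rng = random.Random(0)
    total = 0.0
    for _ in range(trials):
        x = [rng.gauss(0.0, 1.0) for _ in range(N)]
        r2 = sum(c * c for c in x)                 # squared length before normalizing
        total += sum(c * c for c in x[:K]) / r2    # squared projection of the unit vector
    return total / trials

# Problem 2: the four functions, ready to compare at sample points.
f = lambda z: cmath.sqrt(1 - z * z)
g = lambda z: cmath.sqrt(1 - z) * cmath.sqrt(1 + z)
F = lambda z: cmath.sqrt(z * z - 1)
G = lambda z: cmath.sqrt(z - 1) * cmath.sqrt(z + 1)

# Spot check at z = -2: f and g agree there, while F and G do not,
# since F(-2) = sqrt(3) but G(-2) = sqrt(-3) * sqrt(-1) = -sqrt(3).
```

Sweeping a grid of z values and marking where |f(z) - g(z)| and |F(z) - G(z)| are negligible gives a picture of the two regions the problems ask about.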

« Last Edit: Sep 17^{th}, 2002, 7:20am by william wu » 
[ wu ] : http://wuriddles.com / http://forums.wuriddles.com



