|
||||||||||||||||||||||||||||||
Title: Some abstract algebra Post by malchar on Feb 12th, 2012, 10:19pm I just finished my abstract algebra homework, and I thought the problems were kind of interesting. They're not too hard either. The spoilers contain some definitions/hints. (To prove that this isn't a "do my homework" thread, I will post the full solutions in a few days, after I get them checked.) 1. Prove that any finite integral domain must be a field. [hide]We define an integral domain to be a system with all the properties of a field except that it does not necessarily have multiplicative inverses. However, it does have the cancellation property: a ≠ 0 and (a*b = a*c or b*a = c*a) implies b = c, for a, b, c elements of the integral domain. So it suffices to show that if an integral domain is finite, then it must have multiplicative inverses.[/hide] 2. Prove that in an associative ring, if each element is equal to its square, then multiplication must be commutative. [hide]We define an associative ring to have all the properties of a field except that multiplication does not necessarily have an identity, have inverses, or commute.[/hide] |
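Before diving into proofs, both claims can be sanity-checked on small examples. The sketch below (not a proof, just a brute-force check) verifies that Z/5Z, a finite integral domain, is a field, and that in the ring (Z/2Z) x (Z/2Z) with componentwise operations every element is idempotent and multiplication commutes:

```python
# Sanity check of both claims on small examples (not a proof).
# 1. Z/5Z is a finite integral domain; verify every nonzero element
#    has a multiplicative inverse, i.e. the ring is a field.
p = 5
for a in range(1, p):
    assert any((a * b) % p == 1 for b in range(1, p)), a

# 2. (Z/2Z) x (Z/2Z) with componentwise operations satisfies
#    x*x = x for every x; verify that multiplication commutes.
elems = [(i, j) for i in range(2) for j in range(2)]
mul = lambda x, y: ((x[0] * y[0]) % 2, (x[1] * y[1]) % 2)
assert all(mul(x, x) == x for x in elems)            # every element idempotent
assert all(mul(x, y) == mul(y, x) for x in elems for y in elems)
print("both checks pass")
```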
||||||||||||||||||||||||||||||
Title: Re: Some abstract algebra Post by Michael Dagg on Feb 17th, 2012, 10:49pm Both of your problems are fundamental theorems found in most modern algebra books, so finding solutions would be quite easy. Your #1 is better stated by saying that "every" finite integral domain is a field. Note of course that in conjunction with that statement is another one that says that every field is an integral domain. I won't spoil your pondering of #2, but you might find it makes complete sense that if a = a*a for all a ∈ D, then the operation * is commutative. After all, "all a ∈ D" is, well, all a, and then of course all of D. |
||||||||||||||||||||||||||||||
Title: Re: Some abstract algebra Post by Michael Dagg on Mar 9th, 2012, 6:02pm Hint: [hideb]If a = a*a for all a ∈ D, then it looks like there is some sort of special identity relation taking place.[/hideb] |
||||||||||||||||||||||||||||||
Title: Re: Some abstract algebra Post by malchar on Mar 19th, 2012, 1:20pm Solutions: 1. [hide]Let R be any finite integral domain. Fix a nonzero x ∈ R and define a function f(a) = a*x. The function f is injective from R to R: assume f(a) = f(b); then a*x = b*x, so a = b by cancellation. Since R is finite, an injective map from R to itself must also be surjective, so f is bijective. In particular, there is some a with f(a) = a*x = 1, so a is an inverse of x. Then x⁻¹ exists for every nonzero x in R, so R is a field.[/hide] 2. [hide]I actually got this one wrong because I indirectly assumed that cancellation works in general, which it doesn't. Anyway, the correct proof can be found on Wikipedia by searching for Boolean rings and looking at the "Properties of Boolean Rings" section.[/hide] A new problem (which I have already solved): 3. Any number that ends with the digits "13" is an "unlucky number". Any number that ends with the digits "7,777,777" is a "lucky number". For example: 87,867,564,513 is unlucky, and 12,347,777,777 is lucky. Show that every unlucky number must have a positive integer multiple which is a lucky number, without using brute force to calculate what the multiple is. Hint: [hide]Use Bezout's identity.[/hide] |
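The bijection argument in solution 1 can be watched in action in a small field. A quick sketch in Z/7Z: for each nonzero x, the map a ↦ a*x permutes the ring, and reading off which a lands on 1 produces x⁻¹:

```python
# Illustrating the bijection argument from solution 1 in Z/7Z,
# a finite integral domain: for each nonzero x, the map a -> a*x
# is injective by cancellation, hence a permutation of the finite
# ring, so some a satisfies a*x = 1 -- and that a is x's inverse.
p = 7
for x in range(1, p):
    image = {(a * x) % p for a in range(p)}
    assert image == set(range(p))            # f(a) = a*x is a bijection
    inv = next(a for a in range(p) if (a * x) % p == 1)
    print(f"{x}^-1 = {inv} (mod {p})")
```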
||||||||||||||||||||||||||||||
Title: Re: Some abstract algebra Post by Grimbal on Mar 26th, 2012, 5:34am [hide]Since unlucky numbers are odd and not a multiple of 5, you can compute the inverse mod 10000000 of the number and multiply by 7777777.[/hide] |
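Grimbal's construction is easy to run concretely. A minimal sketch, using the unlucky example number from the thread (and assuming Python 3.8+ for the modular-inverse form of `pow`): an unlucky number ends in 13, so it is odd and not divisible by 5, hence coprime to 10^7 and invertible mod 10^7.

```python
# A concrete run of Grimbal's construction. gcd(n, 10^7) = 1 because
# n ends in 13 (odd, not a multiple of 5), so n is invertible mod 10^7.
n = 87_867_564_513                 # unlucky: ends in ...13
m = 10**7
inv = pow(n, -1, m)                # n^-1 mod 10^7 (Python 3.8+)
k = (inv * 7_777_777) % m          # then n*k = 7777777 (mod 10^7)
assert k > 0
lucky = n * k
assert lucky % m == 7_777_777      # the multiple ends in 7,777,777
print(lucky % m)                   # prints 7777777
```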
||||||||||||||||||||||||||||||
Title: Re: Some abstract algebra Post by malchar on Mar 28th, 2012, 7:43pm That looks about right. Here's the only other problem that I have for now. Complete the following multiplication table (fill in the blanks) so that the calculational system so defined is closed and associative: [the table itself did not survive in this archive; from the replies, it is a 4x4 table over {a, b, c, d} with row d left blank]
Note that it's not commutative. There may be a few different methods to get a solution rather than just "guess and check". |
||||||||||||||||||||||||||||||
Title: Re: Some abstract algebra Post by pex on Mar 29th, 2012, 4:29am [hideb]We have cb = d, so to get associativity we must have dx = (cb)x = c(bx) for all x: da = cb = d; db = ca = c; dc = cc = c; dd = cd = d.[/hideb] |
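Since the original table image was lost, here is a completed table consistent with the entries quoted in this thread (cb = d, ca = c, cc = c, cd = d, a = bb, d = cb, and pex's filled-in d row); rows a and b are a reconstruction, so treat them as an assumption. Brute force over all 64 triples confirms the completed system is closed, associative, and non-commutative:

```python
# Reconstructed table (rows a and b inferred from the replies, an
# assumption); row d is pex's answer. Brute-force verification.
T = {
    'a': {'a': 'a', 'b': 'b', 'c': 'c', 'd': 'd'},
    'b': {'a': 'b', 'b': 'a', 'c': 'c', 'd': 'd'},
    'c': {'a': 'c', 'b': 'd', 'c': 'c', 'd': 'd'},
    'd': {'a': 'd', 'b': 'c', 'c': 'c', 'd': 'd'},   # pex's filled-in row
}
mul = lambda x, y: T[x][y]
E = 'abcd'
assert all(mul(x, y) in E for x in E for y in E)                 # closed
assert all(mul(mul(x, y), z) == mul(x, mul(y, z))
           for x in E for y in E for z in E)                     # associative
assert mul('b', 'c') != mul('c', 'b')                            # not commutative
print("closed, associative, non-commutative")
```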
||||||||||||||||||||||||||||||
Title: Re: Some abstract algebra Post by malchar on Mar 30th, 2012, 10:24am Pretty nice one there. That method is even more streamlined than the one that I used. |
||||||||||||||||||||||||||||||
Title: Re: Some abstract algebra Post by pex on Mar 31st, 2012, 12:38am on 03/30/12 at 10:24:12, malchar wrote:
Thanks. I saw one d outside the 'd' column, so I just figured I'd try to use it. How did you do it? |
||||||||||||||||||||||||||||||
Title: Re: Some abstract algebra Post by malchar on Apr 2nd, 2012, 12:02pm on 03/31/12 at 00:38:46, pex wrote:
I used [hide]Light's associativity test, which I discovered while browsing Wikipedia. {b, c} is a generating set for {a, b, c, d}, since every letter can be created using only b and/or c (a = bb, d = cb). Therefore, it is sufficient to show that b and c associate with everything else. I create functions: f1(x,y) = x*(b*y), f2(x,y) = (x*b)*y, g1(x,y) = x*(c*y), g2(x,y) = (x*c)*y. If you put each of {a, b, c, d} in for x and y, you get a table for each function. If the tables of f1 and f2 are equal, then everything associates with b; similarly for g1, g2, and c. The difficulty is that I am still left to guess and check different values for the original multiplication table; this method only serves to test whether the guesses are in fact solutions. You also end up doing a lot of extra work double-checking whether the existing portion of the table is associative, which is unnecessary.[/hide] |
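Light's test, as described above, is short to implement. A sketch on a reconstruction of the thread's table (rows a and b are inferred from a = bb and d = cb, so treat them as assumptions): it suffices to compare the tables x*(g*y) and (x*g)*y for each generator g in {b, c}.

```python
# Light's associativity test: check x*(g*y) == (x*g)*y for every
# generator g, instead of all 64 triples. The table is reconstructed
# from the thread (rows a and b are an assumption).
T = {
    'a': {'a': 'a', 'b': 'b', 'c': 'c', 'd': 'd'},
    'b': {'a': 'b', 'b': 'a', 'c': 'c', 'd': 'd'},
    'c': {'a': 'c', 'b': 'd', 'c': 'c', 'd': 'd'},
    'd': {'a': 'd', 'b': 'c', 'c': 'c', 'd': 'd'},
}
mul = lambda x, y: T[x][y]
E = 'abcd'

def lights_test(generators):
    """True if x*(g*y) == (x*g)*y for all x, y and each generator g."""
    for g in generators:
        left  = {(x, y): mul(x, mul(g, y)) for x in E for y in E}
        right = {(x, y): mul(mul(x, g), y) for x in E for y in E}
        if left != right:
            return False
    return True

assert lights_test('bc')    # {b, c} generates: a = bb, d = cb
print("associative by Light's test")
```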
||||||||||||||||||||||||||||||
Title: Re: Some abstract algebra Post by pex on Apr 2nd, 2012, 12:13pm Ah. Nice. And while it does lead to some extra work, it also gives you something that my method won't: a confirmation that the resulting system is, in fact, associative. (Of course, I did check before posting - but still...) |
||||||||||||||||||||||||||||||
Title: Re: Some abstract algebra Post by Michael Dagg on Apr 17th, 2012, 7:18pm So for (2) how many rings do you think there are for which all elements are their own squares? |
||||||||||||||||||||||||||||||
Title: Re: Some abstract algebra Post by 0.999... on Apr 29th, 2012, 7:45am In reference to counting the number of associative rings D which satisfy x ∈ D → x*x = x: [hideb]First we prove an identity for x ∈ D, that x+x = 0. For, x+x = (x+x)(x+x) = x+x+x+x. [Add -(x+x) to both sides to get the identity.] It follows that x+y = 0 if and only if x = y. [Since x+y = x+x, add -x to both sides.] x+y = (x+y)(x+y) = x+y+xy+yx, so that xy = yx. Define x ≤ y iff x+xy = 0. x ≤ y and y ≤ z → x ≤ z, because 0 = x+xy = x+xyz = x+xz. x ≤ y and y ≤ x → x = y, because 0 = x+xy = x+y. If ax = 0, then x ≤ a+x, so define Z(x) = {a ∈ D: ax = 0} = {y+x: y ∈ D and x ≤ y}. x ≤ y iff Z(x) ⊇ Z(y). For every nonidentity x ∈ D, there exists y ∈ D such that x < y: let z be such that z+xz ≠ 0. Then 0 = xz+xz = xz+xxz = x(z+xz). Let y = x+z+xz.
If for some nonidentity x, y ∈ D, Z(x) ∩ Z(y) = {0}, then x+y is an identity. For, b(x+y) = 0 iff bx = by, in which case we have bx ≤ a implies b ≤ a, so that b ≤ bx and likewise b ≤ by. Then b = bx = by, but Z(bx) = Z(b) ∪ Z(x) and Z(by) = Z(b) ∪ Z(y), so Z(x) = Z(y) = {0}, a contradiction. Therefore, there exists no b such that b(x+y) = 0, so x+y = 1. On the other hand, if ⋂{Z(x): x ∈ D} contains a ≠ 0, then ax = 0 for all x ∈ D, but 0 = 0a = aaa = a. Therefore, 1 ∈ D. I (now) claim that for any partial ordering P = (X, ≤) such that inf X and sup X exist (label them 0 and 1) and for all x ∈ X, 0 ≤ x ≤ 1, there is a ring D in which every element is idempotent, which has the same underlying set as X, and, when ≤ is defined on D as above, there is an isomorphism between P and D. For x ∈ X, define Z(x) = {y ∈ X: inf{x,y} = 0}.
Then X - Z(x) = {y ∈ X: 0 < inf{x,y} ≤ x}, so x = sup{inf{x,y}: y ∈ X - Z(x)}. Then we can define multiplication and addition: for x, y ∈ X, let xy = inf{x,y} and x+y = z where Z(z) = Z(x) ∩ Z(y).[/hideb] Edit: Corrected major oversight. Luckily the result is still intact. Edit2: I have to be running out of luck soon. Edit3: There's a more sensible result; perhaps that is correct. :-[ |
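The prototypical example of the post's correspondence is the power set of a finite set, with symmetric difference as + and intersection as *. A quick check of the identities derived above (x*x = x, x+x = 0, xy = yx), and that the order x ≤ y iff x+xy = 0 comes out as subset inclusion:

```python
# The power set of {1,2,3} under symmetric difference (+) and
# intersection (*) is a ring in which every element is idempotent.
# Verify the identities from the post and the induced partial order.
from itertools import combinations

S = {1, 2, 3}
elems = [frozenset(c) for r in range(len(S) + 1)
         for c in combinations(S, r)]
add = lambda x, y: x ^ y          # symmetric difference
mul = lambda x, y: x & y          # intersection
zero = frozenset()

for x in elems:
    assert mul(x, x) == x                     # x*x = x
    assert add(x, x) == zero                  # x+x = 0
    for y in elems:
        assert mul(x, y) == mul(y, x)         # xy = yx
        # x <= y  iff  x + x*y = 0  iff  x is a subset of y
        assert (add(x, mul(x, y)) == zero) == (x <= y)
print("Boolean ring identities verified on P({1,2,3})")
```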
||||||||||||||||||||||||||||||
Powered by YaBB 1 Gold - SP 1.4! Forum software copyright © 2000-2004 Yet another Bulletin Board |