

Title: Differentiation Disaster Post by D on Jul 25^{th}, 2002, 3:05pm Differentiation Disaster (http://www.ocf.berkeley.edu/~wwu/riddles/hard.shtml#differentiationDisaster) It's been way too long since I took Calculus. This problem is killing me. I think the problem lies in converting x*x -> x + x + ... + x (x times). This method of evaluation would work for constants (e.g., 3x -> x + x + x). But I don't really know; anyone have ideas? D / link added and title cleaned up by moderator /

Title: Re: HARD: Differentiation Post by guest on Jul 25^{th}, 2002, 4:39pm One way to look at it would be: from the first principle of differentiation, let delxsum = (x+delx) + (x+delx) + ... repeated (x+delx) times, and let xsum = x + x + x + ... repeated x times. Then we have d(x*x)/dx = lim_{delx -> 0} (delxsum - xsum)/delx. In other words, the "x number of times" operator is a function of x and hence we can't just treat it as a constant while differentiating. We should differentiate this "number of times" operator also.

Title: Re: HARD: Differentiation Post by David on Jul 25^{th}, 2002, 9:24pm Yeah, the trick is that there is no such thing as just saying x + x + x ... (x many times): this is a function, thus the chain rule must be applied. The error would be much more apparent if you wrote out the summation in big sigma notation.

Title: Re: HARD: Differentiation Post by Bojinov on Jul 26^{th}, 2002, 2:30pm The point you guys are missing is that the xes themselves don't matter when you do the differentiation. First of all, you can only say "x+x+...+x (x times)" if "x" is an integer. If it is not, let's say x=y+d, where y is an integer, and 0<d<1. Then x*x is "(y+d)*(y+d)" and if you write it out, you will see it is equal to "y+...+y (y times) +2dy+d^2". Now, the point is that when you differentiate, the "y+...+y" term disappears. So in this case, as with calculus in general, the small terms happen to be the ones that really matter. 
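Bojinov's decomposition is easy to sanity-check numerically. A small sketch (the helper name `decompose` is mine, not from the thread): for x = y + d with integer y and 0 <= d < 1, the claim is x^2 = (y added y times) + 2dy + d^2.

```python
# Hypothetical check of Bojinov's split: x = y + d, y integer, 0 <= d < 1,
# and x*x = "y + ... + y (y times)" + 2*d*y + d^2.
def decompose(x):
    y = int(x)                            # integer part
    d = x - y                             # fractional part
    repeated = sum(y for _ in range(y))   # "y + y + ... + y" (y times)
    return repeated + 2 * d * y + d * d

for x in (2.75, 5.1, 7.0):
    assert abs(decompose(x) - x * x) < 1e-9
```

The point of the post survives the check: the `repeated` term is the "big" part of x^2, but it is the small cross terms 2dy + d^2 that carry the derivative.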

Title: Re: HARD: Differentiation Post by william wu on Jul 26^{th}, 2002, 6:15pm Another observation is that the x^{2} = x + x + x ... transformation fails if x < 0, even if x is an integer. But that's kind of minor; I would just say that the key observation is that x^{2} can only be transformed into such a sum if x is a positive integer. While it is true that 4^{2} = 4 + 4 + 4 + 4, what if x = 2.34661? Then you write 2.34661^{2} = 2.34661 + 2.34661 + uh oh (you can't write .34661 of a number). By applying this transformation, my function is no longer defined on a continuous domain, but on a discrete domain, with isolated points at the positive integer values. Think of the derivative operator as simply returning a slope: change in y / change in x. Now let's say that K, A, and B are points in the domain of some function f, where A < K < B. If you want to find the slope of f at K, you can approximate it by finding the slope between two points A and B. The closer A and B are together, the more accurate your approximation becomes. Eventually, when A and B are an infinitesimally small distance apart, you have the actual slope at K. Now, if my function is discrete, I can't evaluate such limits, because no two points are an infinitesimally small distance apart from each other! All defined points are one integer unit apart. Bottom line: differentiation is not defined for discontinuities. P.S. I stumped a mathematics Master's degree student with this riddle once. It was quite amusing to watch him suffer. His whole universe was temporarily shattered by four lines of pencil scribble. 0wn4g3! 8)
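wu's secant-slope picture can be sketched in a few lines. A minimal sketch (helper names are mine, not from the thread): approximate the slope of f at K by the slope through two nearby points, and tighten them toward K.

```python
# Secant-slope approximation of the derivative: slope of f between A and B.
def secant_slope(f, a, b):
    return (f(b) - f(a)) / (b - a)

f = lambda x: x * x
K = 3.0
# Symmetric secants through K-h and K+h for shrinking h.
slopes = [secant_slope(f, K - h, K + h) for h in (1.0, 0.1, 0.001)]
# For x^2 the symmetric secant is exactly 2K = 6.0 at every h,
# so the limit (the derivative) is visible immediately.
```

On a discrete domain this game is impossible, which is wu's point: there are no points of the domain arbitrarily close to K to build the secants from.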

Title: Re: HARD: Differentiation Post by NickH on Jul 27^{th}, 2002, 4:09am I don't think William's response quite gets to the heart of the matter. It's true, of course, that differentiation is not defined for a discontinuous function. Consider, though, what happens if we extend the additive notation to cover positive real x. For example, if x = 2.4, we write x + x + 0.4x. If x = 2.5, we write x + x + 0.5x, and so on. Now, having restored continuity, we can again pose the question: why is the derivative of this function at x = 2.4 not equal to 1 + 1 + 0.4? The reason, as guest says, is that we are ignoring the fact that the number of x's being added is also changing. To make this even clearer, consider that the above extension is equivalent to the following definition: f(x) at 2.4 is defined as 2.4x, at 2.5 it is defined as 2.5x, and so on. The fallacy lies in assuming that, when we calculate from first principles the derivative at x = 2.4, f(x + deltax) = 2.4(x + deltax). The correct formulation is f(x + deltax) = (2.4 + deltax)(x + deltax). Nick 
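NickH's extension can be checked numerically. A sketch, assuming "x times" is read as floor(x) whole copies of x plus the fractional remainder of a copy (helper names `extended_sum`, `central_diff` are mine); the naive "differentiate each summand" answer 1 + 1 + 0.4 disagrees with the actual slope of the extended function:

```python
import math

def extended_sum(x):
    # "x + x + ... (x times)" for real x > 0: floor(x) copies plus frac(x)*x.
    n = math.floor(x)
    return sum(x for _ in range(n)) + (x - n) * x   # equals x*x

def central_diff(f, x, h=1e-6):
    # standard symmetric difference approximation to f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

x = 2.4
naive = math.floor(x) * 1 + (x - math.floor(x))   # "1 + 1 + 0.4" = 2.4
true = central_diff(extended_sum, x)              # ~ 4.8 = 2x
```

The gap between `naive` and `true` is exactly the contribution of the changing "number of x's", which the fallacious proof drops.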

Title: Re: HARD: Differentiation Post by william wu on Jul 27^{th}, 2002, 4:33am I now see that I never truly understood why the proof was fallacious. Indeed, my post did not get to the heart of the matter at all ... it's funny, prior to reading this thread, I had thought I really understood the theory behind this riddle. But now I believe I do. Many thanks!

Title: Re: HARD: Differentiation Post by phil m on Aug 8^{th}, 2002, 10:08am x^{2} = sum_{(0 to x)}[x] - sum_{(0 to 0)}[x] = x*(sum_{(0 to x)}[1] - sum_{(0 to 0)}[1]) = x*(sum_{(0 to x)}[1] - 1), since sum_{(0 to 0)}[z] = z. Note that x*(sum_{(0 to x)}[1] - 1) = x*sign(x)*(sum_{(0 to abs(x))}[1] - 1) = abs(x)*(sum_{(0 to abs(x))}[1] - 1). This takes care of x < 0. Also, if abs(x) is non-integer, and n is the max integer < abs(x), then sum_{(0 to abs(x))}[1] - 1 = {sum_{(0 to n)}[1] + [abs(x) - n]} - 1 = abs(x), taking care of non-integer abs(x). So, x^{2} = x*(sum_{(0 to x)}[1] - 1), and d/dx(x*(sum_{(0 to x)}[1] - 1)) = d/dx(x)*(sum_{(0 to x)}[1] - 1) + x*(sum_{(0 to d/dx(x))}[1] - 1) = (sum_{(0 to x)}[1] - 1) + x*(sum_{(0 to 1)}[1] - 1) = x + x*(2 - 1) = 2x

Title: Re: HARD: Differentiation Post by Mongolian_Beef on Aug 13^{th}, 2002, 10:43pm I think the problem is that x + x + x ... (x number of times) cannot be equated to x^2. I'm not sure how you have a negative number of times, but furthermore I think that the reason differentiation gets so screwy is that "x number of times" is in itself like a mini-function, so you have to apply the chain rule. And how would you take the derivative of something like that anyway? Perhaps I just don't know enough calculus, but the fallacy really appears to be (if not equating the functions) equating the derivative of two functions simply because the functions produce the same output.

Title: Re: HARD: Differentiation Post by Ken Plochinski on Aug 16^{th}, 2002, 8:11am Here's an explanation which works if x is a positive number that is not an integer. If we use the notation: [x] = the integer part of x, and {x} = the fractional part of x (So e.g., [4.2]=4, and {4.2}=0.2) Then we can write x^{2} = x * ([x] + {x}) = (x + x + . . . + x) + x*{x} (where there are [x] terms in the first sum). Taking derivatives, using the product rule for the last term, and noting that the derivative of {x} is 1 (except where x is integral in which case the derivative does not exist), the derivative of the right hand side is: (1 + 1 + . . . + 1) + (1*{x} + x*1) (with [x] 1's in the first sum) which is [x] + {x} + x = x + x = 2x. It's interesting that "most" of the derivative is in that last fractional term. 
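Ken's split is easy to verify at a non-integer point. A small sketch (function names are mine): write x^2 = ([x] copies of x) + x*{x}, then the derivative pieces are [x] from the sum and ({x} + x) from the product rule on the last term.

```python
import math

def split_square(x):
    # x^2 written as ([x] copies of x) + x * {x}, for positive non-integer x
    ipart = math.floor(x)       # [x], the integer part
    fpart = x - ipart           # {x}, the fractional part
    return sum(x for _ in range(ipart)) + x * fpart

x = 4.2
ipart, fpart = math.floor(x), x - math.floor(x)
# derivative of the first sum is [x]; product rule on x*{x} gives {x} + x
derivative = ipart + (fpart + x)     # 4 + 0.2 + 4.2 = 8.4 = 2x
assert abs(split_square(x) - x * x) < 1e-9
```

As the post observes, most of the derivative (here 4.4 of the 8.4) comes from the small fractional term, not from the pile of whole copies.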

Title: Re: HARD: Differentiation Post by zarathustra on Aug 24^{th}, 2002, 8:58pm how about presenting the problem in an even simpler way d/dx[ x ] = 1 but... d/dx[ x ] = d/dx[ 1 + 1 + 1 + ... (x times) ] = d/dx[1] + d/dx[1] + d/dx[1] ... (x times) = 0 + 0 + 0 + ... (x times) = 0 

Title: Re: HARD: Differentiation Post by Mukul Joshi on Aug 28^{th}, 2002, 4:00am The problem is that "x times" is not independent of x. If you say 5x = x + x + x + x + x, it will work. But the "x times" is not accounted for during differentiation.

Title: Re: HARD: Differentiation Post by Mukul on Aug 29^{th}, 2002, 10:23pm Is x + x + x + ... ( x times ) continuous? ::) 

Title: Re: HARD: Differentiation Post by local on Nov 17^{th}, 2002, 3:59am hi, my English is not pro, and I'm not a real expert at these things, but I think I know what the problem is. x is not accounted for during differentiation, that's for sure, but let's look at the problem more algebraically. Differentiation is a linear transformation. We all know the differentiation rules: with l a scalar (number), f and g functions, and ' the derivative, linearity means (l*f)' = l*(f)' and (f+g)' = f' + g'. When differentiating, the scalar l is understood as a constant, and we can move such constants out of the derivative. If we sum up d/dx[x] + d/dx[x] + d/dx[x] ... (x times), we get x * d/dx[x], so that x is being treated as a constant, and you cannot move it inside to get d/dx[x^2]!! You change the function. You cannot mix objects from different algebraic structures: a function is a function, scalars are scalars. The function x^2 is not the same as "a constant x times x" (the first x is a constant, and we could rename it k): in one case you actually differentiate x^2, in the other you differentiate kx. So writing d/dx[x^2] = x * d/dx[x] is wrong (first rule): you can only move constants out of the derivative, and x is not a constant. This is because + doesn't behave the same as * under differentiation: (f+g)' = f' + g', but (f*g)' != f'*g'; as we all know, (f*g)' = f*(g)' + (f)'*g, so d/dx[x^2] = d/dx[x*x] = x*(x)' + (x)'*x = x + x = 2x. cu

Title: Re: HARD: Differentiation Post by dogfriend_ltk on Jan 30^{th}, 2003, 2:06am I've read the initial riddle and the answers above, but still some questions have come to mind. Ok, so basically there are two problems: the function that takes x as a parameter and returns x + x + ... + x (x times) can only be defined on N (the nonnegative integers). But one can define the limit of a function at a point if and only if that point is a point of accumulation for the function's domain [x is a point of accumulation for D <=> for any r > 0, (x-r, x+r)\{x} intersected with D != void]. But the only accumulation point for N is infinity, so the problem ends there. Furthermore, if we were to "bend" the function so that it would become continuous, william's formula remains bogus, as "x times" isn't taken into account in differentiation. There is still a question worth asking. Let's define as "countable" a domain D with the property that one can define a one-to-one function between D and N (or Z, it's the same). Let x be a member of D. Can one define the limit of a function f : D -> R at x? What about differentiability?

Title: Re: HARD: Differentiation Post by Icarus on Jan 30^{th}, 2003, 6:33pm on 01/30/03 at 02:06:34, dogfriend_ltk wrote:
Easily. After all the set of rational numbers is also countable. 

Title: Re: HARD: Differentiation Post by dogfriend_ltk on Feb 3^{rd}, 2003, 12:55pm Yeah, you're kinda right... I'm afraid the way I put the problem was wrong. Try this one... Let f : [0,1] -> [0,1] be a continuous, one-to-one function. Let A = { f(x) - f(y) : x, y members of [0,1]\Q }. Determine A.

Title: Got IT Post by Nerd on Mar 30^{th}, 2003, 7:30am ;D Here's the solution: 2x = x + x, 3x = x + x + x, x*x = x + x + x ... (x times), where the x in "x times" is constant. Therefore, d/dx (x*x) = 1*x = x. The derivative of a constant is constant, so that x does not change. Nerd

Title: Re: HARD: Differentiation Post by Icarus on Mar 30^{th}, 2003, 11:10am Sorry Nerd, but you haven't got it. "Constant" is a relative term. It is defined in situations where you have multiple variables which are not independent of each other. To say that a particular variable is "constant" in such a situation means that its value does not change when the values of the other variables are changed. In this case the only variable around is x. And since the differentiation is with respect to x, x changes. The idea that "x is constant with respect to x" is nonsensical. Besides which, d(x*x)/dx = 2x, not x. The problem here, as you will see if you read through the thread, is that the puzzle tries to mix treating x as a discrete variable, defined only for integers (in the formula x*x = x + x + x + ... + x, x times), with treating x as a continuous variable, defined on the whole real numbers (in the differentiation). If you extend the concept of "adding x times" in the only reasonable fashion to include non-integer values of x, then you get the correct formula for d(x*x)/dx. If you instead define a discrete version D of d/dx, you discover that for it the formula D(x*x) = x is correct. It is only when you try to mix discrete and continuous in the same formula that you get garbage.

Title: Re: HARD: Differentiation Post by nerde on Mar 30^{th}, 2003, 2:58pm Hey, that's what I am saying... d(x(variable)*x(constant))/dx is x(constant) * d/dx(x), i.e. x. 1 + 1 + 1 ... (n times) = 1*n (here n is a variable and 1 is constant). Similarly 1 + 1 + 1 ... (x times) = 1*x, and x + x + x + ... (x times) = x * x. The first x is constant and the second x is not. I hope I could have been a bit more clear. nerd.

Title: Re: HARD: Differentiation Post by Icarus on Mar 30^{th}, 2003, 8:17pm Please read my whole post and not just the last 2 lines. You cannot have 1 x constant and the other not! They are the same thing! When you vary the value of x, EVERY instance of x changes - not just the ones that are convenient! Also d(x*x)/dx = 2x. IT DOES NOT = x, EVER! I wrote an equation saying you could define a discrete equivalent operation D so that D(x*x) = x, but there are two things you should note: (1) D is NOT the same thing as d/dx. The latter is not even definable for discrete variables. (2) My equation was wrong. I was misremembering something from the calculus of finite differences, and it was only after I posted that I really thought about it and realized my mistake. There are actually three such operators D, the formulas for them are: Forward Difference: D(x*x) = 2x + 1. Backward Difference: D(x*x) = 2x - 1. Middle Difference: D(x*x) = 2x. (The middle difference is the average of the forward and backward differences.)
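Icarus's three difference operators are easy to tabulate for f(n) = n*n. A minimal sketch (helper names are mine, not from the post):

```python
# The three discrete difference operators on integer arguments.
def forward(f, n):
    return f(n + 1) - f(n)

def backward(f, n):
    return f(n) - f(n - 1)

def middle(f, n):
    # average of forward and backward differences
    return (forward(f, n) + backward(f, n)) / 2

sq = lambda n: n * n
n = 7
assert forward(sq, n) == 2 * n + 1    # 15
assert backward(sq, n) == 2 * n - 1   # 13
assert middle(sq, n) == 2 * n         # 14
```

So even the "best" discrete analogue of d/dx gives 2x, not x; the fallacy's answer of x matches none of the three.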

Title: Re: HARD: Differentiation Post by Ahmed on May 5^{th}, 2003, 1:36pm Hum. isn't Nerd right though, the fact that we are treating one of the x*x as a constent is where the error comes from? ??? 

Title: Re: HARD: Differentiation Post by THUDandBLUNDER on May 5^{th}, 2003, 2:04pm on 03/30/03 at 20:17:12, Icarus wrote:
Yeah, I think Icarus was a bit hard on Nerd there. He didn't even allow that d(x*x)/dx = x when x = 0 :o 

Title: Re: HARD: Differentiation Post by davut on Aug 16^{th}, 2003, 7:18pm what's wrong is the following: d/dx[ x + x + x + ... (x times) ] IS NOT EQUAL TO d/dx[x] + d/dx[x] + d/dx[x] ... (x times), because x is a variable. Let's write the first expression as d/dx[ x + x + x + ... (x times) ] = d/dx[ SUM (from i=1 to x) of x ], which is NOT equal to SUM (from i=1 to x) of 1, because the upper limit of the sum is not a constant but x itself.

Title: Re: HARD: Differentiation Post by davut on Aug 16^{th}, 2003, 7:22pm somehow the formatting in my previous post was messed up. What I meant by the sums and limits are the following: SUM (from i=1 to x) of x, and SUM (from i=1 to x) of 1.

Title: Re: Differentiation Disaster Post by anton on Dec 23^{rd}, 2004, 10:10pm Let f(x) = x + x + x + ... + x (x times) for integer x >= 0. Using the definition of the derivative: f'(x) = lim_{h -> 0} (f(x+h) - f(x))/h = lim_{h -> 0} (((x+h) + (x+h) + ... x times, since h -> 0 ... + (x+h)) - (x + x + ... x times ... + x))/h = lim_{h -> 0} (h + h + ... x times ... + h)/h = 1 + 1 + ... x times ... + 1 = x

Title: Re: Differentiation Disaster Post by Grimbal on Dec 24^{th}, 2004, 7:31am The problem is that you don't say how you treat non-integers. If the "x times" includes fractions, you also must compute "(x+h) times", and you end up with f(x) = x^2, and the derivative is 2x. If the "x times" actually means "floor(x) times", the derivative is also floor(x), but you have a problem with a discontinuity if x is an integer. If you replace floor(x) by round(x), then your formula is true for integers.
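Grimbal's "floor(x) times" reading can be checked numerically. A sketch, assuming x is not an integer (so the function is smooth near x; helper names are mine):

```python
import math

def g(x):
    # "x + x + ... (floor(x) times)", i.e. x * floor(x)
    return x * math.floor(x)

def central_diff(f, x, h=1e-6):
    # standard symmetric difference approximation to f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

x = 2.4
slope = central_diff(g, x)   # ~ floor(2.4) = 2, not 2x = 4.8
```

Between consecutive integers g(x) is just the straight line floor(x)*x with floor(x) frozen, so its slope really is floor(x) there; the quadratic behavior of x^2 only appears if the "number of times" is allowed to grow with x.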

Title: Re: Differentiation Disaster Post by anton on Dec 24^{th}, 2004, 2:12pm Yeah, I see now that there is a flaw in my argument; the mistake was in assuming that (x+h) is added "x times, since h -> 0". Rounding x in "x times" resolves this, but then the formula is not exactly helpful, since x*round(x) = x^{2} only for integers; in general the function is completely different (thus the difference of derivatives).

Title: Re: Differentiation Disaster Post by SomeGuy on May 19^{th}, 2005, 8:22pm I just wanted to point out that the 4th post in this thread gave an excellent, concise reason, and that every other post proceding it has thus been rendered redundant :/ 

Title: Re: Differentiation Disaster Post by Icarus on May 19^{th}, 2005, 9:11pm You could use a brushup on the meanings of both "preceding" and "redundant". Since many later posts concern matters not discussed or even suggested in Bojinov's post (#4), they are hardly redundant. In the 6th post, Nick points out a more fundamental understanding of the problem than Bojinov's explanation. 

Title: Re: Differentiation Disaster Post by prabhakar_misra on Jun 23^{rd}, 2006, 9:43am I think the heart of the matter is that we forget that d/dx is the rate of change of f(x) with respect to x; it's basically the slope of the graph of x^2. Thus you must note that while the function x^2 might be equal to x.x, the slope of x^2 is not x times the slope of x, because the x that you think of multiplying in after finding the slope of x has to contribute to the very finding of the slope. What I write next might not be very mathematical, but it is worth reading. When you make a dish that involves oil, you can't sprinkle the oil on after the dish is made, as it has a role to play in the very making of the dish. I hope you all understand 8)

Title: Re: Differentiation Disaster Post by Icarus on Jun 25^{th}, 2006, 12:41pm No one forgot the meaning of d/dx. We all know that the calculation is flawed, which is all your argument says. Yet the calculation appears to involve only basic rules concerning the derivative. The question was not "is this true?". The question was: "what went wrong in the calculation that resulted in this obviously false result?" 

Title: Re: Differentiation Disaster Post by Vespero on Aug 22^{nd}, 2006, 7:11am Uhm... x^2 = x*x = 'x times'(x) and (f(g(x)))' = f'(g(x))*g'(x) 'x times' represents the 'f' function. Deriving x*x banally as a sum of xes doesn't take care about composite functions' derivation. Right? 

Title: Re: Differentiation Disaster Post by Icarus on Aug 22^{nd}, 2006, 4:26pm "banally" and "doesn't care about" are not mathematical terminologies, and so it is unclear to me what you mean by this. 

Title: Re: Differentiation Disaster Post by THUDandBLUNDER on Aug 22^{nd}, 2006, 8:21pm on 08/22/06 at 16:26:15, Icarus wrote:
I have obtained a provisional decipherment: I think he is saying that d[x*x]/dx != d[x + x + x + ...(x times)]/dx because the latter does not give you the same answer as the product rule for differentiation when applied to d[x*x]/dx. [But the reason why not seems to have been lost in transmission.]

Title: Re: Differentiation Disaster Post by Vespero on Aug 23^{rd}, 2006, 3:16am Actually I did not know I had to use precise mathematical terminology ;) I just thought I had to express myself with my logical and linguistic 'weapons' :) Uhm... I've got no time right now to explain clearly what I mean, since I'm @ work. I'll try to take some moment later. A quick but unclear description of what I mean is that if you treat x no more as a variable, but as an actual parameter (the 'x times' statement implies you are kind of fixing x), the differentiation rules you apply will automatically keep track of it. Sorry, it is almost an intuition more than a demonstration... An example of it could be f(x) = nx (n being a parameter); obviously f'(x) = f'(x + x + x + x + x + ... (n times) + x) = f'('n times'(x)) = n. With f(x) = x*x, treating the first x as a parameter we would have f'(x) = f'(x + x + x + ... (x times) + x) = f'('x times'(x)) = x. Notice that the first x in the one above is treated as a parameter, and no longer as part of the function, and coherently the rules of differentiation will output the parameter value. Is it more 'mathematically' comprehensible?

Title: Re: Differentiation Disaster Post by Grimbal on Aug 23^{rd}, 2006, 8:43am on 08/23/06 at 03:16:11, Vespero wrote:
You might feel better once you have noticed how much the activity on this forum slows down on weekends. ;) 

Title: Re: Differentiation Disaster Post by Icarus on Aug 23^{rd}, 2006, 6:13pm on 08/23/06 at 03:16:11, Vespero wrote:
Logical and linquistic weapons are blunt and dull when no one can figure out what it is you are trying to say. Others have to be aware of the meanings you are applying with your words for the words to have power. Thus, when speaking about a mathematical conundrum, it is needful to speak in words that have a mathematical meaning known to your audience.;) But it appears you have the gist of some of the problem. When you differentiate, you vary the value of x, and this means the value of all x's in the expression. However, the "x times" calculation varies the value of one x while leaving the other constant (i.e., as a "parameter"). There is more, though. The calculation also treats one x as if it were discrete (i.e., an integer) while treating the other as continuous. But this can be overcome, as NickH demonstrates on an earlier page. The key failure is the "one x is variable, the other is a parameter" problem you've described. 

Title: Re: Differentiation Disaster Post by LaCiTy on Sep 4^{th}, 2006, 7:18am This pseudo-riddle is a high offence to mathematics. This should not be in the hard section.

Title: Re: Differentiation Disaster Post by Sjoerd Job Postmus on Sep 4^{th}, 2006, 2:02pm on 09/04/06 at 07:18:21, LaCiTy wrote:
Actually, somehow it should... finding the fault in an erroneous proof isn't that easy. And pointing the fault out is even harder. Just saying "you can't do that" is easy. Finding out why is the interesting part. Alright, I'll see if I can think about it. We have a function f(x) = x^2... Now, we have another function. Let's split it up. g(x) = x * x, h(x) = x. [g(h(x))]' = g'(h(x)) * h'(x) = 2(h(x)) * 1 = 2x. So just plainly using the chain rule doesn't change the outcome. Let's restate it again (just to get my mind thinking): f(x) = x^2 = x + x + x + x + x ... (x times), f'(x) = x' + x' + x' ... = 1 + 1 + 1 + 1 (x times), f'(x) = x. Which is clearly erroneous, as we know the hoped-for outcome, 2x... Problem 1: in one reading, x is discrete. Solution: add frac(x) to the equation: f(x) = x + x + x ...(* floor(x))... + frac(x). Problem 2: one x is a parameter, one x is a variable. Solution: none. We should actually write f_{x}(x) = x + x + x ...(* floor(x))... + frac(x). Which raises questions... Let's call the parameter x p(arameter) instead: f_{p}(x) = x + x + x ...(* floor(p))... + frac(p), which is equal to f_{p}(x) = p*x. The new formulation is totally different from the x^2 one. This causes us to get another result: f_{p}'(x) = p. We exchanged the parameter x with p, so let's restore it: f_{p=x}'(x) = x (I don't know if the previous equation is legal math notation, the p=x part, or if I should just use x). Let's recap the problem: when going from x*x -> SUM(1,x)(x) + frac(x), we have to do with two types of x's. One is a parameter (re: constant, like a and b in y = ax + b) and the other is a variable. Instead of looking at one function, we are looking at a set of functions, f_{p}(x)... and are causing a lot of confusion by saying p=x...

Title: Re: Differentiation Disaster Post by Icarus on Sep 8^{th}, 2006, 6:48pm on 09/04/06 at 07:18:21, LaCiTy wrote:
Mathematics is not a "person" to be offended or to take offence. And as a mathematician, I do not find this offensive at all. Instead, it should be taken in the same spirit as the various 2=1 "proofs" (in which form it could also be cast). By making use of a poor conception to produce an obviously bogus result, it challenges us to figure out what is poor about the conception, and therefore to correct our understanding. Indeed, when NickH first suggested the interpretation I will reproduce below, it actually surprised William Wu and several others who never thought about the solution in this fashion. As for being a "pseudopuzzle", it has puzzled many, so there is nothing "pseudo" about it. on 09/04/06 at 14:02:13, Sjoerd Job Postmus wrote:
No. This too has a solution, most of which you have arrived at (see NickH's post where he does the same thing). You can treat the other x as a variable too. What you get can be expressed this way, using your notations: f(x) = f_{p}(x)_{p=x}. More particularly, define the two-variable function F(x,p) = floor(p)*x + frac(p)*x to explain what we mean by "adding p x's together" when p is not integer (note that this differs from your expression by multiplying the frac(p) by x). In addition to the floor(p) x's everyone knows adds up to floor(p)*x, we add a fraction of another x according to the size of the fractional part of p. Then we find that f(x) = F(x,p)_{p=x}. To differentiate, we turn to the two-variable chain rule, df/dx = (dF/dx)*(dx/dx) + (dF/dp)*(dp/dx), where the derivatives of F should be partials. dF/dx = floor(p)*1 + frac(p)*1 = p, and dF/dp = 0*x + 1*x = x. Since p=x, dx/dx = dp/dx = 1. Hence df/dx = p*1 + x*1 = p + x = x + x = 2x. Hence the reason the problem fails is that it fails to account for the contribution of frac(p) in differentiating x + x + ... + x (p times).
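Icarus's two-variable computation can be approximated with numeric partials. A sketch at the non-integer point x = p = 2.4 (the helper names `F` and `partial` are mine, not from the post):

```python
import math

def F(x, p):
    # "p copies of x": floor(p) whole x's plus frac(p) of another x
    return math.floor(p) * x + (p - math.floor(p)) * x

def partial(f, args, i, h=1e-6):
    # symmetric-difference estimate of the partial derivative in argument i
    lo, hi = list(args), list(args)
    lo[i] -= h
    hi[i] += h
    return (f(*hi) - f(*lo)) / (2 * h)

x = p = 2.4
dF_dx = partial(F, (x, p), 0)   # ~ p   = 2.4
dF_dp = partial(F, (x, p), 1)   # ~ x   = 2.4
total = dF_dx + dF_dp           # ~ 2x  = 4.8, since dx/dx = dp/dx = 1
```

The `dF_dp` term is exactly the contribution the fallacious proof throws away: the "number of times" p is itself moving with x.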

Title: Re: Differentiation Disaster Post by LaCiTy on Sep 22^{nd}, 2006, 7:01am I mean one can easily find more subtle apparent paradoxes with more precise formalisation. Ex.: Let (Bt, t>0) be a standard brownian motion. 1) Bt is also a martingale; in particular, for all bounded stopping times T we have E[BT] = E[Bt] = E[B0] = 0, by the optional stopping theorem. (E denotes expectation of a random variable.) 2) Now consider the stopping time T := inf { t : Bt > a }, for fixed a > 0. 3) One has, almost surely, BT = a, by continuity of the paths (and the fact that T is finite almost surely), and thus E[BT] = a. 4) But applying the optional sampling theorem at time T yields E[BT] = E[B0] = 0, and thus a = 0, which is absurd. Rem.: 1), 2), 3) are true statements. Rem.: All the information is provided with accepted formalisation. You do not need any particular background to suspect where the problem should come in. Modif.: Ok, forget about it. Statement 4 is not true because the stopping time is not bounded; even if it is almost surely finite, that's different. So the optional sampling theorem does not apply. I think this would have been simpler with a random walk. Nevermind. What I wanted to say is that I don't like to try to explain why something does not work when the true question is why it should work. Most of the time a counterexample suffices without any far-fetched explanation trying to create a deep reason when it is clear that theorems or definitions have been misused.

Title: Re: Differentiation Disaster Post by Greg on Feb 18^{th}, 2007, 2:16am Let's review the statement of the riddle and point out where it went wrong. Then I'll offer an alternative to the faulty notation we'll discover. dx^{2}/dx = d(x_{1} + x_{2} + ... + x_{x})/dx = (dx_{1}/dx + dx_{2}/dx + ... + dx_{x}/dx) = (1_{1} + 1_{2} + ... + 1_{x}) = x. Note that I distinguish the various xes and the number of xes in the proof by applying subscripts, and that the subscripts don't change the value of the xes. Also, I changed the notation of the derivative of something with respect to x. For example, d/dx[x] from the question would now read dx/dx, which is more accurate to normal American mathematical notation. The problem is in assuming the second line. The derivative of x^{2} is not (dx_{1}/dx + dx_{2}/dx + ... + dx_{x}/dx). Many people probably infer the second line to be true based on the following accepted notation: x*n = (x_{1} + x_{2} + ... + x_{n}), where n is a constant. When we differentiate the function x*n with respect to x, we are in fact right in assuming that it is equal to the following: (dx_{1}/dx + dx_{2}/dx + ... + dx_{n}/dx) = (1_{1} + 1_{2} + ... + 1_{n}) = n. However, we cannot substitute x for n in such logic, because this notation of deducing a derivative is not true if n is a variable that changes as x does. The basic notation for describing n*x as a series of sums may sometimes be convenient for simple proofs involving arithmetic (or any other mathematical technique that it does support), but that does not mean it can be treated like an ordinary operation or term. In fact, perhaps the notation we see above in d(x*n)/dx is simplified. By that I mean that the notation works only for constants because it is only part of the notation necessary for any term. Let's try to expand our notation to work for both variables and constants in place of n.
We'll start by using the chain rule on the derivative of the function x*z: d(x*z)/dx = dx/dx*z + x*dz/dx = z + x*dz/dx. Then: dx/dx*z = (dx_{1}/dx + dx_{2}/dx + ... + dx_{z}/dx) = (1_{1} + 1_{2} + ... + 1_{z}) = z. And similarly: dz/dx*x = (dz_{1}/dx + dz_{2}/dx + ... + dz_{x}/dx) = x*dz/dx. So that: d(x*z)/dx = d(x_{1} + x_{2} + ... + x_{z})/dx = (dx_{1}/dx + dx_{2}/dx + ... + dx_{z}/dx) + (dz_{1}/dx + dz_{2}/dx + ... + dz_{x}/dx) = (1_{1} + 1_{2} + ... + 1_{z}) + x*dz/dx = z + x*dz/dx. If we allow z to equal x, we get the final set of equations: x + x*dx/dx = x + x = 2x. Or, from the error in the original problem: d(x_{1} + x_{2} + ... + x_{x})/dx = (dx_{1}/dx + dx_{2}/dx + ... + dx_{x}/dx) + (dx_{1}/dx + dx_{2}/dx + ... + dx_{x}/dx) = (1_{1} + 1_{2} + ... + 1_{x}) + (1_{1} + 1_{2} + ... + 1_{x}) = (x) + (x) = 2x. I propose that our initial assumption in going from line 1 to 2 of the proof of the riddle (going from the original proof's 2nd and 3rd equation) is wrong, and that it should instead be replaced by my above equation. Notice how it also yields the correct derivative of x*z with respect to x if z is a constant (simply z). I don't see how this problem touches upon the heart of any problem central to calculus. The notation used to represent and differentiate x^{2} was simply faulty. When corrected, the main feature of calculus that comes to my mind is the simplicity of the power rule.

Title: Re: Differentiation Disaster Post by deolig on Jul 16^{th}, 2007, 11:31pm An alternative way of thinking of this problem is the following. The function f(x) = x^{2} is nonlinear (i.e., when you plot x^{2} versus x, you obtain a parabola passing through the origin). However, in going from f(x) = x^{2} to f(x) = x + x + ... + x (x times), we have transformed the original function into a sum of linear functions, since by definition g(x) = x is linear. A sum of linear functions is itself linear, and thus there is a problem with this transformation. As has been pointed out in previous posts, this transformation may be correct for positive integers, but it cannot be applied anywhere else. Hence the error occurs in assuming that x^{2} = x + x + ... + x (x times) is valid for all real numbers.

Title: Re: Differentiation Disaster Post by Sameer on Jul 17^{th}, 2007, 10:50am on 07/16/07 at 23:31:19, deolig wrote:
What if you let those linear functions approach to zero and sum over it, will it be still linear? 

Title: Re: Differentiation Disaster Post by abcbcdcdef on Aug 24^{th}, 2007, 6:59pm It's quite simple actually, if I haven't got it wrong. d/dx(c) = 0, where c is a constant. d/dx(x^2) =/= d/dx(x + x + x ... (x times)). This (x + x + x ... (x times)) can only work if x is a constant, but if it is, its derivative immediately becomes 0. x^2 can only be a function and cannot be written as (x + x + x ... (x times)). I know many people have made this statement already, I just want to help make it clearer. :P

Title: Re: Differentiation Disaster Post by srn347 on Aug 28^{th}, 2007, 12:15pm d/dx(x^2)=d/dx(x)...(x times). Applying it to the terms individually doesn't work. 

Title: Re: Differentiation Disaster Post by pex on Aug 28^{th}, 2007, 12:31pm on 08/28/07 at 12:15:36, srn347 wrote:
Clearly; but the point of the riddle is figuring out why it doesn't work. 

Title: Re: Differentiation Disaster Post by Gary on Dec 8^{th}, 2008, 3:59pm Anton was nearly correct. He was the only one who went to the definition of the derivative: f'(x) = lim_{h -> 0} [f(x+h) - f(x)]/h. This definition doesn't fail you, not even here (integer or not is not under consideration). f(x) = x^2 = x + x + x + ... + x (x times); f(x+h) = (x+h)^2 = (x+h) + (x+h) + ... + (x+h) (x+h times). The "x times" that Anton had in his first post was incorrect, and he caught it but didn't review it. f(x+h) - f(x) = [(x+h) - x] + [(x+h) - x] + ... (x times), plus (x+h) + (x+h) + ... (h times) = h + h + h + ... + h (x times), plus (x+h) + (x+h) + ... + (x+h) (h times). We're smart enough to fold that back together: f(x+h) - f(x) = h*x + (x+h)*h = 2h*x + h^2 (shoot, we could have guessed that). Divide by h, let h go to zero, and we get 2x. Go back to the definitions and they'll work for you. Gary
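Gary's folded bookkeeping gives (f(x+h) - f(x))/h = (h*x + (x+h)*h)/h = 2x + h, which tends to 2x. A quick numeric sketch of that quotient (the helper name `quotient` is mine):

```python
# Difference quotient of f(x) = x*x, written via Gary's folding:
# f(x+h) - f(x) = h*x + (x+h)*h, so the quotient is 2x + h.
def quotient(x, h):
    fx = x * x                 # "x added x times", folded
    fxh = (x + h) * (x + h)    # "(x+h) added (x+h) times", folded
    return (fxh - fx) / h

x = 5
vals = [quotient(x, h) for h in (1.0, 0.1, 1e-4)]
# each value equals 2x + h, approaching 2x = 10 as h shrinks
```

The extra "+ h" is exactly the remnant of the (x+h) copies that the fallacious proof forgot to count; it vanishes in the limit, leaving 2x.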

Powered by YaBB 1 Gold - SP 1.4! Forum software copyright © 2000-2004 Yet another Bulletin Board