Calculus/Analysis proof

I was thinking about an interesting question:
If two functions f and g are equal at a point x, does that mean their derivatives are equal?

At first I thought yes, this is intuitive, but then I realized that if the two functions are only equal at x, then this doesn't hold. Then I thought: what if they are equal on an interval, say [a,b]? Will it hold then? Well, I haven't done analysis, but I have taken calculus, so I wanted someone to check my proof.

Theorem: Let f and g be two functions that are equal on the interval [a,b] and differentiable on (a,b). Then their derivatives are also equal at every point of (a,b).

Proof: First, let's pick an arbitrary c inside (a,b). We know that the following limit must exist:
[math] \lim_{h\to 0}\frac{f(c + h) - f(c)}{h} = k[/math]

As f(c) = g(c), I can immediately replace it to get:
[math] \lim_{h\to 0}\frac{f(c + h) - g(c)}{h} = k[/math]

Now this, by definition, means:
[math] \forall \epsilon > 0 \exists \delta > 0[/math] such that [math] 0 < |h| < \delta \implies | \frac{f(c + h) - g(c)}{h} - k | < \epsilon [/math]

So this means that there exists some mapping from epsilons to deltas. Let's say
[math] \delta = w( \epsilon ) [/math]
Now consider the set:
[math]E = \{ \epsilon [/math] such that [math] w( \epsilon) < min(|-c+b|,|-c+a|) \} [/math] where min takes the minimum of the two numbers.

And define the new mapping
[math] w_2 ( \epsilon ) =
\left\{
\begin{array}{ll}
w( \epsilon ) & \mbox{if } \epsilon \in E \\
min(|-c+b|,|-c+a|) & \mbox{if } \epsilon \notin E
\end{array}
\right. [/math]

From this definition we get that [math] w_2 ( \epsilon ) \leq w ( \epsilon ) [/math] for every epsilon, and since c is strictly inside (a,b), [math] w_2 ( \epsilon ) > 0 [/math].

This means we can freely replace the original deltas with the (possibly smaller) ones given by this mapping, and the limit statement will still hold. So:

[math] \forall \epsilon > 0 \exists \delta = w_2 ( \epsilon ) > 0[/math] such that [math] 0 < |h| < \delta \implies | \frac{f(c + h) - g(c)}{h} - k | < \epsilon [/math]
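A rough sketch of this delta-capping construction in code (purely illustrative; `make_w2` and the toy mapping `w` are my own names, assuming some `w` witnessing the limit is given):

```python
# Illustrative sketch of the w_2 construction: cap the given
# epsilon -> delta mapping w so that c + h always stays inside [a, b].
def make_w2(w, a, b, c):
    cap = min(abs(b - c), abs(c - a))  # the min(|-c+b|, |-c+a|) from the proof
    def w2(eps):
        # keep w(eps) when it is already small enough, otherwise use the cap
        return w(eps) if w(eps) < cap else cap
    return w2

# Toy example: pretend delta = epsilon works for some limit.
w2 = make_w2(lambda eps: eps, a=0.0, b=1.0, c=0.25)
print(w2(0.1))  # 0.1: the original delta is already below the cap
print(w2(5.0))  # 0.25: capped at min(|1 - 0.25|, |0.25 - 0|)
```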

Cont.


Cont:
But if |h| is smaller than delta, and the new deltas are defined precisely so that c + h stays inside [a,b], then for every such h we have f(c+h) = g(c+h). So:

[math] \forall \epsilon > 0 \exists \delta = w_2 ( \epsilon ) > 0[/math] such that [math] 0 < |h| < \delta \implies | \frac{g(c + h) - g(c)}{h} - k | < \epsilon [/math]

But what this proves is that:

[math] \lim_{h\to 0}\frac{g(c + h) - g(c)}{h} = k[/math]

Proving that the derivatives of the two functions are exactly the same at every point of (a,b). [math] \blacksquare [/math]

So, how good is this? Is it logically too weak? Is it too much machinery for such a simple result? Please tell.

Let f (x) = x
Let g (x) = -x

They have the same value at (0,0)
The slope at (0,0) and all other points is 1 and -1 respectively.
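A quick numeric check of this counterexample (forward difference quotients as a stand-in for derivatives; `diff_quotient` is just a name I picked):

```python
# f and g agree only at x = 0, yet their slopes there are 1 and -1.
def diff_quotient(f, x, h=1e-6):
    # forward difference quotient, approximating f'(x)
    return (f(x + h) - f(x)) / h

f = lambda x: x
g = lambda x: -x

assert f(0) == g(0)           # equal at the single point x = 0
print(diff_quotient(f, 0.0))  # 1.0
print(diff_quotient(g, 0.0))  # -1.0
```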

Nice job reading literally only the first sentence when I bothered to latex all that shit.

Anyways, I realized that too but I used instead x and 2x in my mind.

I am more concerned about the case when the functions are equal not only at an isolated point, but in an interval around a point.

Bump.

I am mostly confident that this proof is alright, but at the same time it is pretty long for such an obvious result. What is the trick that will do this in like 3 lines?

if f,g are equal in [a,b] then for x in [a,b] and h small enough so that x+h is in [a,b] you have
[f(x+h)-f(x)]/h=[g(x+h)-g(x)]/h

taking the limit gives f'(x)=g'(x)
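A rough numerical illustration of this short argument (the piecewise f and g here are my own toy example): two functions equal on [0,1] but different outside it have identical difference quotients at an interior point for every small enough h.

```python
# f = g on [0, 1]; the values outside [0, 1] are arbitrary and different.
def f(x):
    return x**2 if 0 <= x <= 1 else 100.0

def g(x):
    return x**2 if 0 <= x <= 1 else -7.0

x = 0.5
for h in (0.1, 0.01, 0.001):
    lhs = (f(x + h) - f(x)) / h
    rhs = (g(x + h) - g(x)) / h
    assert lhs == rhs  # identical, not merely close, since f = g on [0, 1]
    print(h, lhs)
```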

>h small enough so that x+h is in [a,b]

Yeah, this is the core idea, but I wonder if it is okay to do it as quickly as you did. To just say "let h be small enough, then f(x+h) = g(x+h)".

First, because depending on how close x is to the endpoints of [a,b], h may need to be a lot smaller than if x were at the midpoint of [a,b].

Also I worry because f(x+h), in the context of a limit, is not really a number. It is an expression.

I can replace f(c) with g(c) without a second thought because in reality, f(c) is no more than a constant. So there is no problem with calling that constant by its other name, g(c).

But when you have the variable of the limit inside I feel like you have to be more careful. Or not?

>First because depending on how close x is to the limits of [a,b], h may need to be a lot smaller than if x was in the midpoint of [a.b].
if f=g in [a,b] then [f(x+h)-f(x)]/h=[g(x+h)-g(x)]/h for all h with 0 < |h| < minimum{|x-a|, |x-b|}

f(x+h) really is a number since x and h are numbers and f is a function

>Then I thought, what if they are equal inside of an interval. Say [a,b]. Will it hold.
at the very least the FIRST derivative should be equal, anything past that is not guaranteed.

>First because depending on how close x is to the limits of [a,b], h may need to be a lot smaller than if x was in the midpoint of [a.b].

Assuming you are just working in R or something, this is always possible by the Archimedean property.

>Also I worry because f(x+h), in the context of a limit, is not really a number. It is an expression.

You are already assuming that f(x) is defined in [a,b]. So if h is small enough such that x+h is in [a,b], then f(x+h) is defined. This works for all h and in particular as we take h -> 0.

>I can replace f(c) with g(c) without a second thought because in reality, f(c) is no more than a constant. So there is no problem with calling that constant by its other name, g(c). But when you have the variable of the limit inside I feel like you have to be more careful. Or not?

Nope, x+h is just a number the same way x is.

>for all h
For all h such that x+h is in the interval

edit: can you find a pair of functions that are EXACTLY equal to each other on an interval [a,b] but are NOT piecewise?

I think that's a key to solving this problem.

>f(x+h) really is a number since x and h are numbers and f is a function

But h is not a number. h is a variable. I mean, if h were a number, which one would it be? Is it 0? Not really. If you just plug in h=0 you get 0/0.

>You are already assuming that f(x) is defined in [a,b]. So if h is small enough such that x+h is in [a,b], then f(x+h) is defined. This works for all h and in particular as we take h -> 0.

Okay, I get this but how would you execute this then. If you were to write this proof how would you do it?

>at the very least the FIRST derivative should be equal, anything past that is not guaranteed.

But if, by this theorem, all the first derivatives are equal at all points of (a,b), then we can find a closed interval contained in (a,b) and do the same for the second derivative, then the third, then the fourth, etc.

Idiotic.
>Let f and g be two functions that are at least equal in the interval [a,b] and differentiable in the interval (a,b).
This is equivalent to saying that there exist two functions, say f and g, such that for all x in the closed interval [a,b] we have f(x)=g(x). If that's the case, it's already given to us that f'=g' on (a,b).

>edit: can you find a pair of functions that is EXACTLY equal to each other within an interval [a,b] that are NOT piecewise?

Yes, the typical:

(x+1)(x+2)/(x+1)

and x+2

They are equal everywhere except at x=-1 where one doesn't exist.

Still, I don't see the problem with the functions being defined piecewise. Functions are just mappings.
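For what it's worth, a quick numeric check of that example (`r` and `s` are hypothetical names for the two expressions):

```python
# (x+1)(x+2)/(x+1) agrees with x+2 everywhere except x = -1,
# where the quotient is undefined.
def r(x):
    return (x + 1) * (x + 2) / (x + 1)

def s(x):
    return x + 2

for x in (-3.0, 0.0, 2.5):
    assert r(x) == s(x)

try:
    r(-1.0)  # 0.0 / 0.0 raises ZeroDivisionError for floats in Python
except ZeroDivisionError:
    print("r is undefined at x = -1")
```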

> If thats the case, it's already given to us that f'=g' on (a,b).

How is it already given to us? Nothing is already given to us. Source?

The last part of that post is meant for you.

>But h is not a number. h is a variable. I mean, if h was a number then which one is it? Is it 0? Not really. If you just replace h=0 then you get 0/0
if you want to think of h as a variable: given x in (a,b), let m = minimum{|x-a|, |x-b|}.

then for any h satisfying 0 < |h| < m, the point x+h is in (a,b), so f(x+h) = g(x+h)

>Yes, the typical:
>(x+1)(x+2)/(x+1)
>and x+2
>They are equal everywhere except at x=-1 where one doesn't exist.
are you retarded?

it's supposed to be equal ONLY within a certain interval.

not equal everywhere except an interval.

>then for any h satisfying 0

>not equal everywhere except an interval.

But it is the same.

If the two functions are equal at either side of -1 then you can pick the interval [0,infinity] and apply the theorem there.

I am just giving an example, man. Jesus.

>By if by this theorem, all the first derivatives are equal in all points (a,b) then we can now find a closed interval contained in (a,b) and do the same for the second derivative, then for the third, then for the fourth, etc.
doesn't work, because as long as the two functions are unequal outside of that interval [a,b], eventually the derivatives will no longer be equal.

unless we are dealing with a piecewise function.

>But it is the same.
no it's not, stop being dense.

i don't know what your w is in w(epsilon)

>Is it okay to simply talk about possible h's without talking about the delta that bounds them?
I don't know what you mean by this, but as long as you can find an interval around x so that [f(x+h)-f(x)]/h=[g(x+h)-g(x)]/h for every x+h in that interval, you're good to go to claim the derivatives are equal

as long as you can find an open interval *

I found a discussion of this topic here:
math.stackexchange.com/questions/53209/can-two-functions-have-identical-first-and-second-derivatives-but-different-hig

They say that yes, the higher order derivatives will keep being equal, but the comments are pretty interesting. You might want to read.

It is a trivial case of the theorem.

the w is just a name I picked for the function.

I originally said: let delta = w(epsilon). As in, w is the mapping from epsilons to deltas that "proves" the limit exists (which is a given). You know, when you want to prove that a limit exists, you do epsilon-delta and find that delta = epsilon/2 or something like that. Just a generalization of that.

>i dont know what you mean by this, but as long as you can find an interval around x so that [f(x+h)-f(x)]/h=[g(x+h)-g(x)]/h for every x+h in that interval, you're good to go to claim the derivatives are equal

I am sure this is true, I just wonder if it is rigorous. Specifically:

Would a proof that does not invoke the epsilon delta definition of limits be rigorous?

>They say that yes, the higher order derivatives will keep being equal
lower derivatives can be numerically equal as you stated but become unequal later on.

the stackexchange you linked is if the lower order derivatives are EXACTLY the same which is not what you were asking.

good try though.

Yeah, I see.

It is weird though.

It should be true if you just keep applying the original theorem.

As in, imagine the functions were equal on an interval larger than [a,b]. Say, [a-1, b+1].

Then this would mean that the first derivatives are equal in (a-1,b+1), which contains the interval [a - 0.5,b + 0.5]. Reapply the theorem to get that the second derivatives are equal in (a - 0.5, b + 0.5). Then get the interval [a - 0.25, b + 0.25] and repeat.

Keep repeating, keep repeating and then by induction you could prove that all derivatives are equal in [a,b]. All it takes is that the original functions are equal in a set a little bit bigger than [a,b]. Could even be [a - 0.00000000000001, b + 0.00000000000001]
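A rough numeric sketch of this reapply-and-shrink idea (central differences as stand-ins for the first and second derivatives; the concrete functions are my own toy example):

```python
import math

# f = g on the slightly larger interval [-1.1, 1.1], different outside it.
def f(x):
    return math.sin(x) if -1.1 <= x <= 1.1 else 999.0

def g(x):
    return math.sin(x) if -1.1 <= x <= 1.1 else -999.0

h = 0.01
x = 1.0  # a point of the smaller interval [-1, 1]

# first and second central difference quotients
d1f = (f(x + h) - f(x - h)) / (2 * h)
d1g = (g(x + h) - g(x - h)) / (2 * h)
d2f = (f(x + h) - 2 * f(x) + f(x - h)) / h**2
d2g = (g(x + h) - 2 * g(x) + g(x - h)) / h**2

assert d1f == d1g and d2f == d2g  # agree exactly, since f = g near x
print(d1f, d2f)
```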

>Would a proof that does not invoke the epsilon delta definition of limits be rigorous?
yes because you already have
[f(x+h)-f(x)]/h=[g(x+h)-g(x)]/h

so lim [f(x+h)-f(x)]/h = f'(x)

so lim [f(x+h)-f(x)-hf'(x)]/h=0

so lim [g(x+h)-g(x)-hf'(x)]/h=0

so lim [g(x+h)-g(x)]/h=f'(x)

but lim [g(x+h)-g(x)]/h=g'(x)

so f'(x)=g'(x)

Are you saying:
Let f,g be functions such that f(x)=g(x) for all x inside [a,b].
Then f'(x)=g'(x) for all x inside (a,b).

Because that is how I understand what you were saying. But this is obviously true, if you just consider:

[eqn]\lim_{x\rightarrow x_0}{\frac{f(x)-f(x_0)}{x-x_0}}[/eqn]

which for all x_0 inside (a,b) is, by the assumption, equal to

[eqn]\lim_{x\rightarrow x_0}{\frac{g(x)-g(x_0)}{x-x_0}}[/eqn]

Which obviously implies g'(x_0)=f'(x_0).

I guess you could use this to approximate trig functions.

But most of the time, I think this just shows that f = g

f = g is the trivial case. As in, they are equal in all the real numbers.

I am considering a tighter case in which they are only equal inside an interval [a,b] and are completely different outside of [a,b]

Then you just introduce f1 and g1, the restrictions of f and g to [a,b].

So f1 = g1, and everything follows.

saged

This is trivial: if the derivatives were different at some x, then there would be some small h (the same value for both functions) at which f(x+h)=g(x+h) doesn't hold. And thus f and g would not be equal on the interval.

>sage is a downvote
>saging one of the few threads here with an actual discussion
okay

Brainlet here, but i think this is what is missing here

[math]\epsilon[/math]