What's the point of analytic differentiation when you can just use numerical differentiation on basically anything without thinking about it? Shouldn't it be retired along with other outdated shit like encyclopedias or the abacus?

Computers are dumb. They'll analyze anything you throw at them. You have to know your input is legit.

Nobody's asking computers to come up with your input for you; I'm talking about output, i.e. solving.

"What's the derivative of x^2?" "2x."

vs.

"What's the derivative of x^2?" "For which value of x?"

The first question isn't necessary if you can answer the second for every x. The whole point of having an answer to the first question is being able to answer the second, and numerical differentiation means you never need to know what the exact derivative function is.
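
Concretely, here's a minimal sketch of what I mean (my own toy code, names made up):

def numeric_derivative(f, x, h=1e-6):
    # Central-difference approximation of f'(x); truncation error is O(h^2).
    return (f(x + h) - f(x - h)) / (2 * h)

# "What's the derivative of x^2 at x = 3?" -- answered without ever knowing f'(x) = 2x.
print(numeric_derivative(lambda x: x**2, 3.0))  # ~6.0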

Lel, you will fail basic integration with that mindset.

Literally just use trapezoids.

>he doesn't use tai's method

It's ridiculously computationally expensive to re-evaluate the difference quotient of x^2 over and over rather than just calculating 2x.

It depends on what you're using it for. Unless you need it to constantly execute once a second or something it probably won't matter.

You have to understand what it's doing for you, otherwise it's just a black box. If something goes wrong, how will you troubleshoot it if you don't understand the basic subroutines? How will you know something is wrong with your output in the first place?

Aside from that, there are situations where you're not working with any particular set of numerical values but you'll still need to compute derivatives (e.g. the derivations of various scientific laws, transformations of differential equations, etc.)

>You have to understand what it's doing for you, otherwise it's just a black box.
So it's only as good as all the recent wildly successful artificial neural network applications, including self-driving cars?

While finite differences of order n reconstruct polynomials of degree n perfectly (much like a truncated Taylor series), they don't lend themselves very well to proving analytic results.
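
To see the polynomial claim concretely (toy demo of my own):

# The 3rd forward difference of x^3 is constant, which is the fact that
# pins a degree-n polynomial down from its n-th differences.
vals = [x**3 for x in range(8)]
for order in range(1, 4):
    vals = [b - a for a, b in zip(vals, vals[1:])]
    print(order, vals)  # order 3 prints all 6s (= 3! * h^3 with h = 1)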


Just use Gaussian quadrature.
en.wikipedia.org/wiki/Gaussian_quadrature#Gauss.E2.80.93Legendre_quadrature

(midpoint is better than trapezoid btw)
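
Quick comparison backing the midpoint aside, on the integral of e^x over [0,1] (my own sketch, names made up):

import math

def midpoint(f, a, b, n):
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

def trapezoid(f, a, b, n):
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

exact = math.e - 1  # integral of e^x over [0, 1]
print(abs(midpoint(math.exp, 0, 1, 100) - exact))   # ~7.2e-6
print(abs(trapezoid(math.exp, 0, 1, 100) - exact))  # ~1.4e-5, about twice the error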

Extrapolation methods do wonders.
Check out Shanks transformation.
en.wikipedia.org/wiki/Shanks_transformation
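
In case anyone wants to play with it, the transformation is only a few lines (my own sketch):

def shanks(a):
    # S(a)_n = (a_{n+1} * a_{n-1} - a_n^2) / (a_{n+1} + a_{n-1} - 2 * a_n)
    return [(a[i+1] * a[i-1] - a[i]**2) / (a[i+1] + a[i-1] - 2 * a[i])
            for i in range(1, len(a) - 1)]

# Partial sums of the Leibniz series 4*(1 - 1/3 + 1/5 - ...) -> pi
partial, s = [], 0.0
for k in range(10):
    s += 4 * (-1)**k / (2 * k + 1)
    partial.append(s)

print(partial[-1])                  # ~3.0418, still far from pi
print(shanks(shanks(partial))[-1])  # two passes get several digits of pi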

>Check out Shanks transformation.
Pretty cool, thanks user.
>(midpoint is better than trapezoid btw)
Yeah, I never thought trapezoid was the best, I just liked that it was easy enough for me to understand and implement from scratch.

If you're doing something like game graphics, calculus is being done nonstop. Finding the normal vector on a curved object is pretty much just finding two different tangents and taking their cross product.
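
Something like this (my own NumPy sketch; surface_normal is a made-up name):

import numpy as np

def surface_normal(f, u, v, h=1e-5):
    # Two numerical tangents along u and v, then their cross product.
    du = (np.array(f(u + h, v)) - np.array(f(u - h, v))) / (2 * h)
    dv = (np.array(f(u, v + h)) - np.array(f(u, v - h))) / (2 * h)
    n = np.cross(du, dv)
    return n / np.linalg.norm(n)

# Sanity check on the unit sphere: the normal should be the point itself (radial).
sphere = lambda u, v: (np.cos(u) * np.cos(v), np.sin(u) * np.cos(v), np.sin(v))
print(surface_normal(sphere, 0.5, 0.3))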

An n-point Gauss-Legendre rule tells you at which points you must sample to integrate any polynomial of degree up to 2n-1 exactly (your interval must be rescaled to [-1,1] of course).
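
For anyone following along, the whole thing is a few lines with NumPy's built-in nodes and weights (my own sketch):

import numpy as np

def gauss_legendre(f, a, b, n):
    x, w = np.polynomial.legendre.leggauss(n)  # nodes and weights on [-1, 1]
    t = 0.5 * (b - a) * x + 0.5 * (a + b)      # rescale nodes to [a, b]
    return 0.5 * (b - a) * np.sum(w * f(t))

# 3 points are enough for any polynomial of degree <= 5:
print(gauss_legendre(lambda t: t**5, 0, 2, 3))  # 64/6 = 10.666..., exact up to rounding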

I don't know how common it is for people to reinvent low level game graphics methods anymore.

Satanic trips, but also what about something like differential equations? What if you're given something like y = f(y') and y(t0) = C, and they want you to evaluate y(t0 + 100000)? Are you going to use Newton's method instead of just solving it and jumping right to the answer?

What's wrong with just having a program that implements Newton's method and then applying it twice? Whatever you lose in computational efficiency for that specific problem you more than gain in time saved by automating the entire universe of problems of that class.

If you're okay with the "throw shit at a wall until it sticks approach", then yes.

Because
1.) Newton's method introduces error that could make your calculation useless. Suppose you tried it on y = -y'', given y(0) = 0, y'(0) = 1: could you really calculate y(5000*pi) to any meaningful accuracy? By proving y = sin(x), you have a closed formula. (Sketch below.)

2.) Extrapolation is incredibly computationally intensive compared to simply plugging into the solved formula. Moreover, a given problem usually has a somewhat constrained problem set to deal with, so it's much easier to just solve the given set (aka copy textbook information into a coding language).
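
Sketch for point 1 (my own toy code; I'm using forward Euler as the simplest stand-in for a numerical stepper):

import math

# y'' = -y, y(0) = 0, y'(0) = 1; exact solution is y = sin(t).
h = 1e-3
t_end = 100.5 * math.pi            # exact answer here is sin(100.5*pi) = 1
y, v = 0.0, 1.0
for _ in range(int(t_end / h)):
    y, v = y + h * v, v - h * y    # plain Euler step; old y feeds the v update
print(y)  # ~1.17: the amplitude has already drifted ~17% from the true 1.0
# The drift grows like e^(h*t/2), so at t = 5000*pi the numerical answer is
# off by a factor in the thousands, while sin(t) stays exact forever.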

Check out the Pixar papers - chock-full of all sorts of lovely calculus

graphics.pixar.com/library/

We wouldn't have such good graphics today without those deliberate attempts to model the real world using calculus

Analytic solutions will always be useful because you can build new mathematics on top of them.

Usually that new math exists to describe some physics or engineering problem.

With that mindset, I guess finding any roots of the zeta function is impossible now. Thanks for hindering mathematical progress.

For applied purposes numerical differentiation is fine, and I'd say it's an interesting subject everyone should study. But it's impractical for pure math. Here is an easy example:

Suppose you have two functions f(x) and g(x) defined around 0, with f(0) = g(0) and (g - f)'(0) > 0. Those two conditions imply that there exists some interval to the right of 0 where g(x) > f(x).

This may be trivial but it's really useful. With the exact derivative you can conclude it immediately. But with a numerical derivative you would first have to bound its error and check that you computed the derivative "well enough" to actually conclude the inequality. Not practical.
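
To make that concrete (toy numbers of my own):

# (g - f)(x) = 1e-9*x - x^3, so the exact derivative at 0 is +1e-9.
d = lambda x: 1e-9 * x - x**3
diff = lambda h: (d(h) - d(-h)) / (2 * h)  # central difference, truncation error O(h^2)

print(diff(1e-3))  # ~ -1.0e-6: wrong sign, the O(h^2) error swamps the signal
print(diff(1e-5))  # ~ +9.0e-10: right sign, but only because h^2 < 1e-9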