How do you understand calculus? I can use the equations but I don't get the concept at all

How do you understand calculus? I can use the equations but I don't get the concept at all.


By learning it in a pure maths analysis course instead of a brainlet engineer course.

Integration is just adding up little squares under the curve on graph paper -- but a lot faster, and with the squares made as thin as you like.
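If a concrete version helps, here's a minimal sketch in Python (the function and interval are just made-up examples) of that "adding up little squares" idea, i.e. a left Riemann sum:

```python
def riemann_sum(f, a, b, n):
    """Approximate the integral of f over [a, b] by summing n thin rectangles."""
    width = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + i * width       # left edge of the i-th rectangle
        total += f(x) * width   # rectangle area = height * width
    return total

# Example: the integral of x^2 from 0 to 1 is exactly 1/3.
for n in (10, 100, 1000, 100000):
    print(n, riemann_sum(lambda x: x**2, 0.0, 1.0, n))
# The sums creep toward 0.3333... as the rectangles get thinner.
```

The only trick is that more rectangles means thinner rectangles, which is exactly where calculus takes over.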

Is it integration or differentiation which you "don't get"?

take classical mechanics.

He wants to learn something for practical use not fairytales.

Mathematicians only learn calc to enlighten themselves about how their own farts don't smell. People with a use in life, like engineers, learn calc and use it to understand how things change.

you plug and chug and pass until you can take analysis

Differentiation is like more precise division.
Think about speed as you're driving. If you're "going 50 miles per hour," that doesn't mean you've actually spent an hour driving at the same speed and have now covered 50 miles, it just means if you were to maintain that speed for an hour then you would cover 50 miles.
Another way of writing "miles per hour" is "miles / hour," as in distance divided by time.
To find out exactly how fast you were going at any given moment of the drive, you need something better than plain division, which only tells you your average speed given a total distance and a total time. So instead of just knowing you must have averaged 35 mph because you covered 35 miles in 1 hour, you'd be able to tell that 5 minutes in you were going 45 mph, or that 20 minutes in you were stopped in traffic with a speed of 0 mph.
So if you already have all the information of distance covered at each point in time, you can call that information a function named f(t) that returns a distance for each input t of time e.g. f(5 minutes) = 2 miles if you were 2 miles into the trip after 5 minutes.
A derivative function for f(t) could be written as f'(t); it takes the time and returns the speed at that moment, e.g. f'(5 minutes) = 45 miles / hour.
The relationship of speed to the original function of distance covered over time is that of the slope. If you measure the slope of the original distance function at the 5 minute mark, the amount of distance (y axis movement) divided by the amount of time (x axis movement) is your speed, and the shorter the interval between the two points you use to measure the slope, the more information you have about exactly what the speed was at every point along the way.
The special innovation of calculus was finding out you can work with the concept of this interval between points being small enough to approach 0 i.e. maximum precision.
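To see that "shrink the interval" idea in action, here's a small Python sketch (the distance function is invented purely for illustration) that estimates the speed at one moment by measuring the slope over smaller and smaller time intervals:

```python
def distance(t):
    """Made-up trip: miles covered after t hours (just for illustration)."""
    return 30 * t + 10 * t**2   # a trip that keeps speeding up

def average_speed(f, t, h):
    """Rise over run: average speed over the small interval from t to t + h."""
    return (f(t + h) - f(t)) / h

# Shrink the interval h toward 0: the average speed settles on the
# instantaneous speed at t = 0.5 hours (exactly 30 + 20 * 0.5 = 40 mph here).
for h in (1.0, 0.1, 0.01, 0.0001):
    print(h, average_speed(distance, 0.5, h))
```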

youtube.com/playlist?list=PLZHQObOWTQDMsr9K-rj53DwVRMYO3t5Yr

Watch this guy.

>How does one understand a broad field with different techniques, principles and approaches?
If you can't pinpoint what exactly you don't understand, it just means you were a lazy fuck who didn't study enough.

PS: The reason why this was an innovation and not just obvious is that you can't treat the vanishingly small interval as literally equal to 0 or else you run into problems right away with the undefined division by 0 operation. Your slope calculation (rise over run) is:

(f(x + h) - f(x)) / ((x + h) - x)

If you treat h (the interval between the points you're trying to find a slope for) as 0 to begin with, you get:

0 / 0

Which of course isn't helpful. Instead, you maintain the notion h has a nonzero value, but it's such a small nonzero value that you can eventually throw it out and end up with your answer. e.g. if f(x) = x^2 then you can substitute the f(x)s above with (x^2)s and your slope calculation becomes:

((x + h)^2 - x^2) / h

((x + h) * (x + h) - x^2) / h

(x^2 + 2xh + h^2 - x^2) / h

(2xh + h^2) / h

2xh / h + h^2 / h

2x + h

And at that last step now you *can* safely treat the h as 0 since now you'll just be adding 0 instead of trying to divide by it. Meaning the derivative function of f(x) = x^2 is 2x.
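If you want a machine to double-check that algebra, here's a short sketch using sympy (assuming you have it installed); it builds the same difference quotient and takes the limit as h goes to 0:

```python
import sympy as sp

x, h = sp.symbols('x h')
f = x**2

# The difference quotient (f(x + h) - f(x)) / h from the steps above.
quotient = (f.subs(x, x + h) - f) / h

print(sp.simplify(quotient))     # 2*x + h, matching the hand calculation
print(sp.limit(quotient, h, 0))  # 2*x, the derivative of x^2
print(sp.diff(f, x))             # 2*x again, sympy's own derivative as a cross-check
```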

Calculus Made Easy

>using 'h' and not 'dx'

>>using 'h' and not 'dx'
>thinking 'dx' is a variable and not an actual mathematical object

Using dx in that example would've been fucking stupid. God I hate you physishits.

You can use whatever the fuck you want, it's literally the symbol for "negligibly small amount" you're talking about here.

>"negligibly small amount"
So OP's dick size?

what an absolutely useless post. you should be executed by firing squad for making it

t. Brainlet/dicklet hybrid.

do you actually know how to form sentences or do you just stitch memes together that you found on reddit? are you a real person? am i talking to a bot?

r/funny seems more your speed

He clearly used the definition of the derivative as a limit of difference quotients. You don't use whatever the fuck you want there, you use a fucking variable, h being the standard.

found the brainlet. hint: it's you.
dx is not "negligibly small amount" nor "infinitesimal" nor any other handwavy crap that was used 300 years ago. dx is a differential form, educate yourself before speaking pls

>being this new
Christmas vacation is hell when it comes to redditors coming here.

dude your iq is literally sub 100. everything you have said is a fucking lazy rehash of some shitty meme.

>heheh someone said "small" ill seize this golden opportunity to make a dick joke!
>and at OP's expense too! Op is a fag XD
>t.
>*let
>hehehe le being this new (this is what everyone says to me so ill try using this epic meme now!)
>haha lol christmas vaction! i hate summer fags!!1 fuck reddit!!

there is a mentally retarded paste eater collaging different memes together he found to write these posts. can you please show me your attempt at formulating a sentence without copy pasting memes so i can laugh at you?

I never understood why anyone cared this much about limits vs. infinitesimals. It seems like a lot of effort just to say the same thing in a slightly different way. Also nobody even seriously argues that infinitesimals are a problem anymore, you can use them rigorously, it's just that they weren't used in a rigorously defined way earlier in history.

It's called banter, you fucking faggot. Lay off the autismo and relax a little. It's Christmas Eve, people are literally shitposting tipsy here (I am).

I’ve always found that knowing the difference between incorrectly applying a concept and making a simple error during a calculation is very helpful in learning calculus.

You may know every one of calculus’s concepts, or even have memorized every formula, definition and theorem, but something as simple as untidy handwriting or a clumsy way of expressing the same equation can trip you up.

A lot of mistakes and misconceptions come from bad notation and hand-wavy intuitive arguments. If you are having trouble with some argument, try putting everything in good notation. Properly knowing how to use the chain rule and what u-substitution is helps a lot when talking about general differentiable functions. Deriving a general representation of a curve (not necessarily parametrized by arc length) in its own Frenet-Serret frame will make absolutely no sense if you don't take a bit of time to actually write everything out properly with correct notation.
>who cares about that lmao just give the formula XD
Anyone with the slightest interest in math cares about proper arguments. A lot of engineers and chemists who enjoy math have asked me about rigorous proofs and constructions, since anyone who pays attention to this shit can detect problems arising. I mean, I'm particularly fond of rigorous mathematics, but almost everyone in my QM class called bullshit when they were told everything was just lin alg XD. That doesn't mean you shouldn't learn it, considering functional analysis is such a fucking mess, but being interested in formalism should come naturally.

But you can rigorously define infinitesimal based calculus, Abraham Robinson proved that like 50 years ago.

But it's an algebraic mess, defined precisely so that calculus works out, while the epistemological grounding is still in something you can, in some sense, geometrically construct. It's like trying to teach differential forms straight away. Algebraic constructions are easier, but they lack intuition.

You can use infinitesimal rigorously but limits and infinitesimals are completely different beings. Infinitesimals work fairly well for "intuitive" calculus, but there are tons of pathological examples where infinitesimals really shit the bed.

So what's the problem? It really fucking sucks when you come up with a nice intuitive and useful theorem that seemingly works for everything that actually matters, but then some Weier fucking strass slaps your theory gf's ass with some nonsense counterexamples and you're left there crying unable to do anything, while everyone is too afraid to use your results to derive any further results and only some physishits use the theorem to approximate some pointless trivial shit.

Doesn't dx just imply an infinitely small change in x? Hence why a derivative is an infinitely small change in the output (f(x+dx)-f(x) or df) divided by an infinitely small change in the input (dx)?

How can you rigorously define infinitely small? The way Leibniz did it, it's just a "number" that is greater than 0 but smaller than every other positive number (which doesn't exist among the reals). It's also nilpotent (its square is 0), plus other weird shit. This works, but why not properly define what it means for an indexed sequence to approach a value, using inequalities that translate perfectly to the plane?
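Side note: that "square is 0" arithmetic can at least be played with on a computer. Here's a toy Python sketch (the Dual class and example function are mine, not from anyone in the thread) of dual numbers, where the infinitesimal part eps is carried along and eps^2 is simply dropped, which spits out exact derivatives:

```python
class Dual:
    """A number a + b*eps where eps*eps is treated as 0 (nilpotent, not idempotent)."""
    def __init__(self, real, eps=0.0):
        self.real = real
        self.eps = eps

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.real + other.real, self.eps + other.eps)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps + bd*eps^2, eps^2 term dropped.
        return Dual(self.real * other.real,
                    self.real * other.eps + self.eps * other.real)

def f(x):
    return x * x + x * 3   # f(x) = x^2 + 3x, so f'(x) = 2x + 3

# Evaluate f at x + eps: the real part is f(5), the eps coefficient is exactly f'(5).
result = f(Dual(5.0, 1.0))
print(result.real, result.eps)   # 40.0 13.0
```

This is basically forward-mode automatic differentiation; it doesn't rescue Leibniz's foundations, it just shows the bookkeeping can be made consistent.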

The hyperreals are rigorous so long as the reals are.

t. Piper Harron

Yea, but their construction is abstract, and is done in order to recover previous calc results. Try explaining the transfer principle to some poor fuck who still doesn't understand trig.

>delta/epsilon and open balls over the reals aren't abstract and are easily understood by a trig student
get out

Watch Professor Leonard

>painting open intervals (open balls) on a fucking board is not intuitive
Are you fucking retarded?

You are obviously so handicapped that you can’t distinguish between what you and I understand and what the aforementioned trig student can understand.
Perhaps that’s because you’re a winter break underage b& larper pretending (pathetically) to be a college student.

What the fuck are you talking about? I don't expect to teach them topological metric spaces, just to follow something similar to Spivak...

>Spivak
>I am superirorrr to leibniz

Calculus deals with limits as you try to multiply 0 by infinity or divide 0 by 0.
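Crude, but there's something to it: "0 times infinity" and "0 over 0" don't have single answers, which is exactly why a limit is needed to resolve each case. A quick sympy sketch (assuming sympy is available, with the example expressions picked arbitrarily):

```python
import sympy as sp

x = sp.symbols('x', positive=True)

# Each of these is a "0 * infinity" shape as x -> 0+, yet the limits all differ.
print(sp.limit(x * (1 / x), x, 0, '+'))       # 1
print(sp.limit(x * (1 / x**2), x, 0, '+'))    # oo
print(sp.limit(x**2 * (1 / x), x, 0, '+'))    # 0
```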

...

Well, he is not really wrong you know

Fucking rekt. Kek

That wojak is excessive

OP [pic related] is all you need to know for the rest of your life. It helped me so much.

nah, this is just called giving up and becoming a soulless stamp collector.

literally who ? and what a shitty quote.

0/10 GET OUT

>Alexander Grothendieck
>literally who?

Understanding basic concepts deeply and being able to explain them with ease.

>h being the standard
Brainlets ITT don't know about them hyperreals

It's called non-standard analysis for a reason. If you don't explicitly state that you're now going to suck Abraham Robinson's cock, everyone expects you're doing standard real analysis, no hyperreals allowed.

No it's not you dumb fuck

>it’s even easier to trigger Veeky Forums than /pol/

I don't think that quote is about shallow understanding of math. What do we usually mean by understanding anyway? I think it's no more than reducing some abstract concept to something that we "experienced" in some elementary way (through the senses). For example, you "understand" what a flower is because you saw it many times, you smelled it, etc. Now I ask you what a "flowdudo" is (I just made this word up). You don't know, but I explain to you that it's just a dude that has flowers growing from his head instead of the usual hair. So now you "understand" what a flowdudo is.

But what happens when a new irreducible concept like a flower is introduced (for the sake of the argument let it be irreducible -- or maybe we are just not creative enough to decompose it -- but assume it's irreducible)? What you do is you smell it, observe it, play with it, in other words get SO familiar with it that you feel like you "understand" it, and later you can understand something more abstract ("flowdudo") in terms of it.

So I guess math objects can get very abstract, and it might not always be best to try to understand something abstract in terms of something you already "understand" (which is nothing glorified -- you are just very familiar with it). Instead you should observe its properties and play with it logically (the mathematical equivalent of the senses in the physical world).

This is such a pointless conflict, they both say and mean the same basic things. Limits have to be one of the most autistic mathematical ideas ever.
>We can get useful information by working with infinitesimals, but infinitesimals don't exist because reasons so let's write the same exact statement and call the infinitesimal part a small but totally still real distance that's only "approaching" 0, but wait infinitesimals were fine all along and now we're stuck with limits because everyone's used to them.

Describe to me how to prove the mean value theorem without limits, using infinitesimals instead. Are you man enough to prove it without looking up how Robinson did it?

Do you believe many people would be able to independently come up with the mean value theorem *with* limits?

You've confused the utility and meaning of calculus with its rigor.
You can make it rigorous with epsilon/delta or with infinitesimals. Either way, it's rigorous and even in the absence of that rigor, its utility is unmatched.

No doubt. The mean value theorem was well known long before it was proven rigorously. Many people tried to prove it with the at-the-time definition of infinitesimals, but none succeeded, because some asshole always came up with a pathological counterexample that broke the argument. Once the concept of limits became a thing, it was proven very quickly. It took over a century for someone to be able to prove it with infinitesimals, and even then they had to lay out a new foundation for it to work.
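For anyone who hasn't seen it stated concretely: the mean value theorem says a smooth function attains its average slope somewhere inside the interval. A tiny sympy sketch (the function and interval are arbitrary choices of mine) that finds such a point:

```python
import sympy as sp

x = sp.symbols('x')
f = x**3 - x          # arbitrary smooth example
a, b = 0, 2

# Mean value theorem: some c in (a, b) satisfies f'(c) = (f(b) - f(a)) / (b - a).
avg_slope = (f.subs(x, b) - f.subs(x, a)) / (b - a)
candidates = sp.solve(sp.Eq(sp.diff(f, x), avg_slope), x)

print(avg_slope)                              # 3
print([c for c in candidates if a < c < b])   # [2*sqrt(3)/3], about 1.155
```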

The rigour is very important. We knew about the mean value theorem for a long time, but because it wasn't proven rigorously, no one wanted to derive any further theoretical results based on it. Thus only physishits and other shitters used it for mundane shit. Once it was proven rigorously, analysis leapt forward like never before.

Limits literally saved analysis, while infinitesimals shat the bed and made it a laughing stock.

>best self-refuting comment of the day

Fuck off, Abraham. Infinitesimals were shit back then and the utility of calculus was geometry-tier before limits.

Infinitesimals before the 20th century were a fucking mistake and if you defend them, you're a brainlet with the brainsize of an infinitesimal.

>Thus only physishits and other shitters used it for mundane shit.
So everyone who did useful work with real-world applications was able to use it?
And only people who focused on pure maths without real world applications had a problem with it?
So literally the only use for limits was to satisfy autism?

Hyperbolic geometry wouldn't be a thing without the autistic search for rigorous geometry.

What specific practical technology would be impossible to implement without hyperbolic geometry?

ITT: bikeshedding

Partial list of major scientific achievements between the invention of calculus and Cauchy's Cours d'Analyse (1821):
>1821: Seebeck observes semiconductor behavior
>1800: Volta invents the battery
>1789: Lavoisier discovers the law of conservation of mass, founding modern chemistry
>1763: Bayes' Theorem
>1687: Theory of gravity
Between Cours d'Analyse and Weierstrass (1854):
>1827: Ohm's law
>1830: Non-Euclidean geometry
>1838: Cell theory (biology)
>1842: Doppler effect
>1846: Discovery of Neptune
Between Weierstrass and Dedekind (1872):
>1859: Evolution by natural selection
>1861: Germ theory
>1864: ***user totally BTFO***: Maxwell's theory of electromagnetism
>1869: Periodic table

Yeah, science was at a standstill until the last demands of rigor in calculus were met!

All of that shit was just pseud-bullshit no one gave a shit about before our god and saviour, Cauchy, saved the day.

>Lavoisier discovers the law of conservation of mass, founding modern chemistry
Nope. Mikhail Lomonosov in 1756.
Libtards like to take credit for the earlier discoveries made by better men. See the list of Einstein plagiarisms for another common example of leftists stealing the work of others.

Nobody prior actually worked out what Einstein did with special relativity and general relativity. The "plagiarism" examples all involve shit like Poincaré writing in a non-formal philosophy paper about the concept of relativity, which is completely different from doing the work of discerning and publishing the actual equations that define it.
Mathematically, Poincaré never abandoned the use of "aether," and while he was clearly aware it was a sloppy band-aid fix that would go away in the future when someone figured out the real answers, he was also pretty clear that that someone wouldn't be himself:
>It matters little whether the ether really exists: that is the affair of the metaphysicians. The essential thing for us is that everything happens as if it existed, and that this hypothesis is convenient for us for the explanation of the phenomena. After all, have we any other reason to believe in the existence of material objects? That too, is only a convenient hypothesis; only this will never cease to do so, whereas, no doubt, some day the ether will be thrown aside as useless.
And that someone in reality was of course Einstein.

i enjoyed this user, well said

>t. baby first analysis class veteran

Wait until you discover Asymptotic Methods and Perturbation Theory and learn to accept the unrigorousness.

>How do you understand calculus?
I don't know. I understood the fundamental theorem of calculus, but I memorize shit really fast. So I would sleep in Calc 1 and still easily did well. The worst I did was in Calc 3 at UC Irvine. I just never went to class and crammed for the final, and I got a C. I don't understand shit in calc. Most of it seemed like methods, heuristics, and other shit that wasn't real, so I just memorized it right before tests.