Mathematics general

What are you guys up to?

Exams are soon, so I am just studying away and (trying to) finish up some assignments early so I can do some extra exercises. In particular, Lie algebras. Looking forward to the break when I can do a bit more on what I want to learn.


What are Morse functions again? I remember hearing about them years ago in a lecture on the proof of the Gauss-Bonnet theorem.

>mathematics general on a math board
fuck off with your thinly veiled blog thread

Morse functions are functions which have only non-degenerate critical points. We use them to study the topology of smooth manifolds using Morse theory.
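A textbook example (my own illustration, not from the thread): [math]f(x,y) = x^2 - y^2[/math] near the origin.

```latex
f(x,y) = x^2 - y^2, \qquad
\nabla f = (2x,\, -2y), \qquad
\operatorname{Hess} f = \begin{pmatrix} 2 & 0 \\ 0 & -2 \end{pmatrix}
% det(Hess f) = -4 \neq 0, so the only critical point (0,0) is
% nondegenerate: f is Morse, with a saddle of index 1 at the origin.
```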

the other 'math threads' on here are like borderline grade school material

the complex analog is nice too
en.wikipedia.org/wiki/Picard–Lefschetz_theory

Indeed it is! I was going to work through a book on Morse homology and then read one on Picard-Lefschetz theory! Know any good books for the latter?

I learned from this one

help

Thanks! I will check it out then. Is Volume 1 also a decent read?

Yep, just about anything with Arnold's name on it is worth a read

Perhaps I will read them both then, thanks for the suggestion!

this is gibberish

nope
msri.org/people/members/defthy07/exercises/exercisesday4.pdf

anyone know math problems with cash rewards (or something else) other than the millennium prize problems?

i've also found the Erdos problems (en.wikipedia.org/wiki/Paul_Erdős#Erd.C5.91s.27_problems) and the Hutter prize (en.wikipedia.org/wiki/Hutter_Prize)

Working on translating motivic integration into something school children can understand.

1. Working on a few rote problems in communication complexity.

2. Working to be able to use standard cryptographic primitives as a form of subliminal channel.

3. Developing a practical compression scheme that doesn't rely on stochastic assumptions.

It's not strictly mathematics (I'm a graduate student in computer science), but I'm genuinely enjoying my work right now.

Well my root-finding numerical code for a system of gap equations isn't converging, and there's literally nothing that can be extrapolated from the analytic expressions since they're in a form as open as my asshole after a night out.

On the other hand the paper relating the holonomy formalism and the principal bundle formalism of gauge theory turns out to be not what I had anticipated. The author defines the generalized holonomy map [math]H: L_x \rightarrow G[/math] as a homomorphism on path-equivalent curves [math]L_x \ni \gamma: [0,1]\rightarrow M[/math] based at [math]x[/math] to a Lie group [math]G[/math], which makes this rather distinct from the braid group approach. I don't think the connection between the braid group and Chern-Simons theory approach of particle statistics can be established just by this paper. There will still be work needed to be done in order to relate what the generalized holonomy map describes to what the braid group describes.
I never had high hopes for this project desu. If a world-renowned physicist tells you that your idea is a dead-end, he might be right.

I heard Naive Lie Theory is good... very visual using only matrices to build intuition for the big theorems. What are you using?

I'm searching for nice applications of nonstandard analysis.

Pissing off Halmos et al is a great application.

The Bernstein-Robinson theorem is of course on the top of my list.

Why are L^p spaces interesting to consider beyond L^1, L^2 and L^inf ? I mean I guess it's natural to ask about their properties since L^1, L^2 and L^inf are nicely behaved and I know that they're interesting from a functional analytic standpoint (Banach spaces, reflexive, blabla) but I have never concretely used them for anything.
Are there any problems that naturally call for L^p regularity for p other than 1,2, inf ?

Reading a book about Analysis and another about Proofs in parallel. Too bad there are no problems, just exercises.

I have a somewhat dumb question about the Bolzano-Weierstrass theorem.

It says that if, for every n, we have [math]a \leq u_n \leq b[/math], then there is a subsequence of [math](u_n)[/math] converging to some [math]l \in [a,b][/math]. Can there be several different values of [math]l[/math] for the same sequence?

Of course. If your sequence [math](u_n)[/math] is convergent, then each of its subsequences converges to the same limit [math]l[/math].

Now, if you were asking if the number of different limits can be infinite, the answer is still affirmative, as you can see by taking [math](u_n)[/math] to be an enumeration of the rationals between a and b.
This sequence will have, associated to each number x in [math][a,b][/math], a subsequence converging to x.

The point is that this sequence might not be convergent

[math]u_n = (-1)^n[/math] satisfies the theorem: we have [math]-1.5 < u_n < 1.5[/math], and there are two values, [math]l_1 = 1[/math] and [math]l_2 = -1[/math], that satisfy:

For any [math]\epsilon > 0[/math], I can find a [math]p[/math] such that there is at least one [math]n \geq p[/math] with [math]|u_n - l| < \epsilon[/math].

This holds for both [math]l_1[/math] and [math]l_2[/math].
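A quick numerical illustration of those two subsequential limits (my own sketch in Python, not from the thread):

```python
# u_n = (-1)^n is bounded but not convergent; its even- and
# odd-indexed subsequences converge to l_1 = 1 and l_2 = -1.

def u(n: int) -> int:
    return (-1) ** n

even_subseq = [u(n) for n in range(0, 100, 2)]   # u_0, u_2, u_4, ...
odd_subseq = [u(n) for n in range(1, 100, 2)]    # u_1, u_3, u_5, ...

assert all(x == 1 for x in even_subseq)    # constant at l_1 = 1
assert all(x == -1 for x in odd_subseq)    # constant at l_2 = -1
```

So boundedness gives you subsequential limits, not a single limit for the whole sequence.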

Ok, thanks. That was what I was asking.

In general [math]L^p[/math] spaces for [math]p\neq 2[/math] are uninteresting as they don't have a natural inner product structure on them.
However, the Sobolev space [math]H^1 = W^{1,2}[/math] is useful for analyzing the stability of solutions of the equations of motion in quantum and classical field theories. The solutions [math](\psi,\psi_0)[/math] are required to lie in [math]H^1 \oplus L^2[/math], or else the Cauchy problem given by the equations of motion wouldn't be well-posed and existence wouldn't be guaranteed for arbitrary times (Strocchi 2008). This gives rise to the Swieca regularity conditions that the quantum fields need to satisfy for your QFT to be defined, which leads to the famous Haag's theorem.

I think Sobolev's embedding theorems are an example.
PDEs are not my thing, but some well-posedness theorems might require different regularity for the initial data to be applied.
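Concretely (standard statement, quoted from memory, so double-check the constants): the Sobolev embedding theorem already forces exponents other than 1, 2, [math]\infty[/math] on you:

```latex
W^{1,p}(\mathbb{R}^n) \hookrightarrow L^{p^*}(\mathbb{R}^n),
\qquad \frac{1}{p^*} = \frac{1}{p} - \frac{1}{n},
\qquad 1 \le p < n.
% Even for p = 2 the embedding lands in L^{p^*} with
% p^* = 2n/(n-2), e.g. p^* = 6 when n = 3.
```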

trying to pass intro to analysis 1
thought i was good at calc
turns out im not lol

Thanks guys, I had forgotten about Sobolev spaces, but it's actually a very good point. L^p spaces aren't all that useful by themselves, but Sobolev spaces have very strong properties.

I don't think that's a good example, since H^1 is THE Sobolev space and doesn't really have to do with any L^p other than p=2.
Does quantum mechanics ever deal with any vector spaces other than Hilbert spaces?
A more general space [math] W^{1,p} [/math] (once weakly differentiable, with the weak derivative in L^p)
is needed, for example, for PDEs where some of the terms don't have L^2 regularity.

Is it necessary to be well trained in rigorous proofs before reading algebra on your own, or is a vague EE background enough?

>Does quantum mechanics ever deal with any vector spaces other than Hilbert spaces?
Nope. The principle of quantization a la Dirac will naturally give you a Hermitian line bundle [math]B \rightarrow M[/math] over the classical symplectic phase space, the sections of which are square-integrable complex-valued functions.

>Does quantum mechanics ever deal with any vector spaces other than Hilbert spaces?

Not as the "space of states".

I'm studying the properties of the Haar integral at the moment. This is pretty cool, to be honest.

If you take a bunch of line segments and connect their midpoints, you get a new set of segments with more segments than you started with. Turns out the whole process is just a really weird kind of vector addition, and the curve you end up with after a shitload of cycles is a really weird derivative of the set of segments you started with, if you treat them like vectors.
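A minimal sketch of that midpoint construction (assuming a closed polygon; all the names here are my own):

```python
# Repeatedly replace each edge of a closed polygon by its midpoint
# and watch the vertices collapse toward the centroid.
from statistics import mean

def midpoint_step(points):
    """One pass of the construction: the midpoint of each edge."""
    n = len(points)
    return [((points[i][0] + points[(i + 1) % n][0]) / 2,
             (points[i][1] + points[(i + 1) % n][1]) / 2)
            for i in range(n)]

poly = [(0.0, 0.0), (4.0, 0.0), (5.0, 3.0), (1.0, 4.0), (-1.0, 2.0)]
cx, cy = mean(p[0] for p in poly), mean(p[1] for p in poly)

for _ in range(50):
    poly = midpoint_step(poly)

# The centroid is preserved at every step (the average of the
# midpoints equals the average of the vertices), and the vertices
# shrink toward it.
assert all(abs(x - cx) < 1e-3 and abs(y - cy) < 1e-3 for x, y in poly)
```

The "weird vector addition" view: each step is the same linear (circulant) map applied to the vector of vertices, which is why the limiting behavior is so regular.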

Pic related

Very cool but fuck you for distracting me from studying for finals. Now I have to try and program this.

I'll give you $10 to prove I'm gay

Easy.

Try to post a new thread on Veeky Forums. If it goes through, you are now OP and therefore a faggot.

QED

What's your PayPal?

[email protected]

heh, had the same idea and did it in 3d
only problem is creating a big enough data set of starting points without it being too "regular"

Damn I don't think Mathematica's Manipulate function does 3D locators.

Fuck man you stole my idea, haha. I was going to use a 3d rendering tool I wrote to do some 3d stuff. Cool pic anyway though, if anything it's basically a spoiler.

I hope you realize that 3d plotting is a basic out-of-the-box functionality provided by both Mathematica and Matlab.

Not to discourage you or anything--small projects like these are good practice--but you don't need to reinvent the wheel unless you really want to.

Nah i was just going to plug in my program to the tool. Reinventing the wheel is a hobby of mine and it's pretty fun if you appreciate really digging into how the 'basic functionality' of many programs and tools actually works and is implemented.

Also it seems that if you calculate this continuous midpoint transformation on an open polygon with height 2A and width A, you get a cycloid.

Randomly came across this video. Thought this guy was really clear and concise.

youtube.com/watch?v=sMqI1esWpEs

I'm an undergraduate self studying out of this bad boy. None of my professors or peers do logic, so I'm completely alone.

Skipping around a bit, but I just did the proof that if a is a measurable cardinal, then a is the a'th inaccessible cardinal.

Why don't more people love this stuff? The rest of math is utter trash in comparison to logic.

>Self-studying

I'm amazed that some people manage to do this. I never managed to learn a subject from a book alone, without at least a bit of explanation from a teacher.

Fuck being dumb.

MUH DICK

I share your pain. I think it takes a certain level of stubbornness that I just don't have.

hook a nigga up?

...

Is it normal to not have the slightest fucking idea what you guys are talking about most of the time? I don't bother to look these things up anymore because I at most only ever get a superficial understanding that I forget not long after.

Am I just especially impaired or what?

1. is literally just limit arithmetic

>the rest of math is utter trash in comparison to logic.
With opinions like that, now we know why you are completely alone.

I am glad. :)

It's normal not to understand the majority of the mathematics we talk about here. Though don't let that stifle your learning in math. You should definitely be eager to learn more, despite the buzzword-dropping that occurs in threads on Veeky Forums.

>The rest of math is utter trash in comparison to logic.
Model theory is the subject I want to study the most but don't have the time. But having bad attitudes might be related to why mathematicians don't get along with logicians.

>Is it normal to not have the slightest fucking idea what you guys are talking about most of the time?
Yes, especially when they're using categorical language.

Reading about formal logic right now, loving it. Though it might not be as technical, it has some interesting historical backgrounds and quotes.

Most people here don't know shit about shit, don't worry about it.

Studying my ass off in algebra. I'm taking graduate algebra as an undergraduate and while it's hard, the thing that's killing me is just the volume of stuff I have to know.

You can't even begin to comprehend why you don't understand shit about mathematics until you've studied abstract algebra and analysis.

Aren't those just Bézier curves with factor 1/2?

HELP ME DO STOCHASTIC PROCESSES ITT:

I thought Bezier curves were supposed to match the derivatives and the end points.

These are all pretty easy. Problem 3 follows from problem 2, problem 4 is just testing definitions, and 5 is just conformal transformations.
>due in April

First time on this board. You guys are a bunch of fucking nerds, I need help. What are good books to study Algebra 2 and Trigonometry.

Can you give rough outline of those courses?

Only the original guy's curve was technically correct. The others didn't continue the transformation at the end points.

But his doesn't have derivatives matching the endpoints either.

how do i make myself like other areas of mathematics
i really enjoy algebra, favorite is probably ring theory/commutative algebra
but i absolutely fucking hate analysis and analysis-related things
it's not that i'm bad at it, in fact, i'm quite good with most analysis and topology stuff
but i still hate it

can i train or trick myself to like analysis?

Not that good of a mathematician, but I really like Analysis for some reason.
I find it really intuitive; it usually combines a lot of math domains and helpful "tricks" (like how the n-th root of x is actually x^(1/n), trigonometry, Taylor series, ...)

Find a textbook with funny functions/series to study. In one of my tests in Analysis, we had to study the nature of the series with general term 1/n, where n ranges over the integers not containing the digit 9. I genuinely enjoyed that test.
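That's a Kempner-type series, and it actually converges, unlike the full harmonic series. A brute-force look at the partial sums (my own sketch; `kempner_partial` is a made-up name):

```python
# Partial sums of sum(1/n) over n whose decimal digits avoid 9.
# The full harmonic series diverges, but this one converges
# (to roughly 22.92), because almost all large n contain a 9.

def kempner_partial(limit: int, banned: str = "9") -> float:
    return sum(1.0 / n for n in range(1, limit + 1)
               if banned not in str(n))

s1 = kempner_partial(10_000)
s2 = kempner_partial(100_000)

assert s1 < s2 < 23.0   # growing, but bounded above by the true sum
```

Convergence is extremely slow, which is part of what makes it a fun exam problem.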

>analysis
>intuitive

analysis is like 90% counterexamples to things that you previously thought were intuitive

This

The continuum was a mistake

I mean that even if you don't have the rigorous proofs, you can still deduce things with accuracy.

Like if I know a sequence/series converges as n -> +infinity, it will also converge as 2n -> +infinity.

You just think you are good at analysis. You are not actually.

>i'm quite good with most analysis and topology stuff
>but i still hate it
You make topology cry. Why don't you like it? Maybe you would like algebraic topology.

>About to get my ass kicked by Functional Analysis because I can't remember basic definitions

How do you guys do it? I'm finding it harder and harder to focus these days

I haven't studied motives much. Is it just a port of fibre integration into some category of motives?

currently writing a paper on orthogonal art galleries
>the whole art gallery problem thing

Checked. I rate your repeating digits 3/10 because your post was boring. Better luck next time!

Writing an introductory section of my thesis on quantum walks. Currently cherrypicking terminology from Markov chains and quantum probability to give background for my arguments.

This is the kind of stuff I'm building up to:
youtube.com/watch?v=zzASv4G9bNA

That's when they teach it to you, to teach you caution. But after that you can go back to working with your intuition, keeping in mind that it can deceive you.

Be curious and try to think of real-life phenomena that you can model with math or try to find proofs of interesting results that use analytic tools (if you like number theory or geometry, it shouldn't be hard to find interesting results that require analysis).
Also, as a general rule, don't say you "don't like" areas of mathematics until you're at least in grad school. You are barely scraping the surface of each topic and you are going to need everything you are currently learning, and then some, so you can't afford to be picky about what you learn at the moment. It's like a grade schooler saying they like to add but not multiply.

Which of the basic calculus texts cover big-O and little-o notation? Last week we covered this in a lecture on infinitesimally small quantities and limits, but standard texts like Spivak don't cover it.
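For reference, the definitions themselves are short (stated for [math]x \to a[/math], with [math]g[/math] nonvanishing near [math]a[/math]):

```latex
f(x) = O\bigl(g(x)\bigr) \iff \exists\, C > 0 :\ |f(x)| \le C\,|g(x)| \ \text{for } x \text{ near } a,
\qquad
f(x) = o\bigl(g(x)\bigr) \iff \lim_{x \to a} \frac{f(x)}{g(x)} = 0.
% Little-o is the stronger statement: f = o(g) implies f = O(g).
```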

What is the hardest part of Calc II?
How can I make it easier?

Oh I see what was going on. You have to include the endpoints in the new list.

I thought it was already solved for orthogonal galleries. Did I mistake it for a more concrete case or what?

Also I tried doing something other than the midpoint (replacing "Mean" with "((1 - t) #1 + t #2) & @@ # &") and this is what it looks like as t goes from 0.01 to 0.95.

Hosted on pomf because slightly too big for Veeky Forums:
a.pomf.cat/jaxtzr.gif

There's a weird optical illusion that makes it look like the points jump at the end when they loop around but I've compared the two frames and they're in the same locations.
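Here's roughly what that "Mean" replacement does, translated into Python (my own rendering of the idea, not the original Mathematica):

```python
# Generalize the midpoint step: take the point a fraction t of the
# way along each edge instead of the midpoint (t = 0.5).

def interp_step(points, t=0.5):
    """Replace each edge of a closed polygon by its t-interpolant."""
    n = len(points)
    return [((1 - t) * points[i][0] + t * points[(i + 1) % n][0],
             (1 - t) * points[i][1] + t * points[(i + 1) % n][1])
            for i in range(n)]

square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]

assert interp_step(square, t=0.5)[0] == (0.5, 0.0)    # midpoint case
assert interp_step(square, t=0.25)[0] == (0.25, 0.0)  # quarter point
```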

Really not that much to cover about it, once you know the definitions.

The points along the curve seem to follow the same trajectory when adjusting the interpolation distance as they would when adjusting how many iterations are done.

Compare:
a.pomf.cat/xmtveg.gif
a.pomf.cat/qyzslx.gif

Okay I'm done.

>favorite is probably ring theory/commutative algebra
>but i absolutely fucking hate analysis and analysis-related things


Try combining them.

anagrams-seminar.github.io/hdr/kahler.pdf

ihes.fr/~celliott/D_modules.pdf

math.harvard.edu/~gaitsgde/grad_2009/

etc.

I've been studying for my diff eq final. Series solutions about ordinary points and regular singular points are fucking annoying. It's not a hard concept, but I can't find a source that explains it clearly.

This. Most algebraists think they're hot shit for proving equalities, but they have no idea how complicated the objects analysts deal with are.

From my research, an upper bound has been proven, but the minimum question remains open.
I'm a 2nd-year undergrad, so it's really just a research-and-recite kind of thing

It looks like deviating the 'interpolation' from .5 just reduces the effectiveness of the continuous transformation.