What's the most black magic shit you've seen in mathematics?

what's the most black magic shit you've seen in mathematics?

Other urls found in this thread:

youtube.com/channel/UCYO_jab_esuFRV4b17AJtAw
en.wikipedia.org/wiki/Banach–Tarski_paradox
en.wikipedia.org/wiki/Tupper's_self-referential_formula
en.wikipedia.org/wiki/Borwein_integral
en.wikipedia.org/wiki/Long_line_(topology)
youtube.com/watch?v=kbKtFN71Lfs&t=7s
en.wikipedia.org/wiki/Feigenbaum_constants
en.wikipedia.org/wiki/Logistic_map
amzn.com/0520023560

and this one too

...

I hear if you read Principia Mathematica by Russell and Whitehead aloud over a piece of graph paper a ghost will spin your pencil.

[math] \int\limits_{0}^{\pi} \sin x \, dx = 2 [/math]

I refer specifically to the simple fact that the area under such a curve should turn out to be a natural number at all, and not yet another crazy, irrational number. And that such a quantity, in relation to such a geometric object, should be knowable.
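
For reference, the computation itself is one line of calculus:

[eqn] \int\limits_{0}^{\pi} \sin x \, dx = \big[ -\cos x \big]_{0}^{\pi} = (-\cos \pi) - (-\cos 0) = 1 + 1 = 2 [/eqn]

The magic is that the transcendental pieces cancel and leave a plain integer.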

for me, it's TREE(3)

>tfw I have a hard copy of PM and also of the Simonomicon with its shit sigils on my shelf

get on my level, faggot. Though, I fear that the Reals, despite our sincere efforts to Banish them, are coming for me, and so I type in haste. I have been careful, but

...

...

...

wait what?
Isn't sin supposed to be the y coordinate of a unit circle?
if so the area underneath that y is 2.. that means that the area of a circle is 4?
Am I retarded?
I know it's pi * 1 in unit circle, but.. I don't see the error.

i realize the significance of the second one is that cool shapes form, but what is so significant about the first one

the area under [math] \sin x [/math] is not the area of a circle. The function you're thinking of is probably [math] \sqrt{1 - x^2} [/math]
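
If you integrate the semicircle function instead, you do recover the circle area:

[eqn] \int\limits_{-1}^{1} \sqrt{1 - x^{2}} \, dx = \frac{\pi}{2} [/eqn]

which is half of [math] \pi \cdot 1^{2} [/math], exactly what you'd expect for the upper half of the unit disk.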

>sin supposed to be the y coordinate of a unit circle?
y = 1·sin a
x = 1·cos a
0

afaik it is an example of a function that is continuous everywhere but differentiable nowhere

The first one is Weierstrass's function; it was the first example of a function that is continuous everywhere but differentiable nowhere. Before that it was assumed that every continuous function was differentiable, except possibly at certain isolated points.
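
For reference, the standard form Weierstrass gave is

[eqn] W(x) = \sum\limits_{n=0}^{\infty} a^{n} \cos( b^{n} \pi x ) [/eqn]

with [math] 0 < a < 1 [/math], [math] b [/math] a positive odd integer, and [math] ab > 1 + \tfrac{3\pi}{2} [/math]: every term is perfectly smooth, but the sum wiggles at every scale.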

I've always thought linearity of expected value is like black magic (particularly for dependent variables).
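
You can watch it hold even for strongly dependent variables. A throwaway simulation sketch (names and distributions made up purely for illustration):

[code]
import random

# Y is a deterministic function of X, so they are as dependent as it gets,
# yet E[X + Y] still comes out as E[X] + E[Y].
N = 1_000_000
xs = [random.random() for _ in range(N)]   # X ~ Uniform(0, 1)
ys = [x * x for x in xs]                   # Y = X^2

mean = lambda v: sum(v) / len(v)
print(mean(xs) + mean(ys))                       # ~ 0.5 + 1/3
print(mean([x + y for x, y in zip(xs, ys)]))     # same value, ~ 0.8333
[/code]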

Not the most black magic shit I've seen, but I thought it was a bit humorous.

> This goddam sequence of numbers
> nature uses it as a rate of growth

The fact that there are two 1's at the start and no 0 really gets to me, but I understand that it must be the case for any growth to continue. I believe this holds a shitload more secrets than we realize.

youtube.com/channel/UCYO_jab_esuFRV4b17AJtAw

Watch the inscribed rectangle one and space filling curves.

Mind blown

o shiiiiiiiiet

One of my favorites:
en.wikipedia.org/wiki/Banach–Tarski_paradox

divide a sphere into four parts
rearrange them
here, two identical spheres!

...

[eqn]1^3+2^3+3^3+...+n^3=(1+2+3+...+n)^2[/eqn]

>AC
He said mathematics.

...

This is comfy.

Even better: Tupper's self-referential formula. This is cheating, though. Some would say it's ugly math, but still funny.

en.wikipedia.org/wiki/Tupper's_self-referential_formula
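
The inequality from the linked article, if I'm transcribing it right, is

[eqn] \frac{1}{2} < \left\lfloor \mathrm{mod}\left( \left\lfloor \frac{y}{17} \right\rfloor 2^{ -17 \lfloor x \rfloor - \mathrm{mod}( \lfloor y \rfloor , 17 ) } , 2 \right) \right\rfloor [/eqn]

Plotted over the right 106 by 17 window of (x, y) values, it draws a bitmap of itself; the "self-reference" is really just that the enormous constant offset in y encodes the bitmap.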

any undergrad can find an injection from one sphere to two spheres

it really is a very clever choice of subsets that only requires rotations

>what's the most black magic shit you've seen in mathematics?

ahem...

Even without doing the induction, there's something very simple and pleasing in expressing the pattern for myself, writing it down, then later looking it up and seeing the exact same form having been used.

In the graphic, the "halving" on the even components' top layers is an enticing idea which admits of a variation, or elaboration.

Absolutely appalling. Of what importance is this inequality?

Did you even look at the article...

I give to you a fancier version.

-1/12

absolutely none, it's just neat.

Find a book.

41 triple pendulums with very slightly different initial conditions.

How my life has been going, with t in 6-months periods.

I really liked book of proof. I learned a lot.

CA?

chaos is so neat

no

Just reading up on fractals/chaos theory right now.

Is this a cringe thread or am I plastered and on IFLS without knowing ?

And the original result follows from this by taking n=p^N for any prime p. That's neat.

I don't know how to explain it very well, tell me if it works for you please
|||||||||||||||||||||||| |
Keep changing which end you're counting from and keep going until you reach the middle. Even uncount and recount a couple. Do it a few times and include/don't include the one that's off to the side, and try to forget whether you have counted/uncounted that one or not. Okay, here goes: for each one or two you're counting before you switch sides, you're introducing possible mistakes, okay.

Okay, so once you get a discrepancy between what you think you've done and what adds up, count them in a straight line. Then repeat the entire exercise until you're sure you got it wrong when you counted them in a straight line, because the total has varied that much, then count them in a straight line again.

What's even crazier is that now we know that there are in fact infinitely many more "ugly" functions (continuous everywhere, diff nowhere) than nice functions

Veeky Forums is the IFLS subreddit.

en.wikipedia.org/wiki/Borwein_integral

>science demonstrates the fine line between a good hair day and a bad one

And, as expected, although it is manifestly a /geometric/ observation, its demonstration may nevertheless be effected by means of a completely straightforward induction proof, whose only trick is in knowing the simple algebraic representation of the nth triangular number.

We wish to show that

[eqn] \forall n \in \mathbb{N} \; , \;\;\; \sum\limits_{k=1}^{n} k^{3} = \bigg( \sum\limits_{k=1}^{n} k \bigg)^{2} [/eqn]

As 1=1, the basis case is inspected, and the above equation is taken as inductive hypothesis. Adding the cube of n+1 to both sides and recalling the simple algebraic form of the nth triangular number gives the following equalities, where the equality of the red items in particular completes the demonstration.

[eqn] \bigg( \sum\limits_{k=1}^{n} k^{3} \bigg) + (n+1)^{3} = \color{red}{ \sum\limits_{k=1}^{n+1} k^{3} } = \bigg( \frac{n(n+1)}{2} \bigg)^{2} + (n+1)^{3} = \bigg( \frac{(n+1)(n+2)}{2} \bigg)^{2} = \color{red}{ \bigg( \sum\limits_{k=1}^{n+1} k \bigg)^{2} } [/eqn]
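
And if you'd rather let the machine check the algebra, a two-line sanity test (purely illustrative):

[code]
# Check sum of cubes == (sum of first n integers)^2 for small n
for n in range(1, 101):
    assert sum(k ** 3 for k in range(1, n + 1)) == sum(range(1, n + 1)) ** 2
print("identity holds for n = 1..100")
[/code]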

(x% of x)/0.01 is x^2
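
Spelled out, since percent is just division by 100:

[eqn] \frac{ (x / 100) \cdot x }{ 0.01 } = \frac{ x^{2} / 100 }{ 1 / 100 } = x^{2} [/eqn]

e.g. 5% of 5 is 0.25, and 0.25 / 0.01 = 25 = 5².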

1 + 2 + 3 + 4 + 5 + ... all the way to infinity is equal to -1/12.

>what's the most black magic shit you've seen in mathematics?

"Our theories about gravity and the speed of the propagation of light tell us that the universe should be a billion times more massive than it appears to be! Must be Dark Energy!"

>what's the most black magic shit you've seen in mathematics?
en.wikipedia.org/wiki/Long_line_(topology)

This
youtube.com/watch?v=kbKtFN71Lfs&t=7s

can this be related to the space of (0,1) valued functions on the reals?

the universe is an infinite fractal that uses the Fibonacci sequence as a ratio for everything.

chaos is exponential and follows the golden ratio

>5'11 v. 6'0

>the basis case is inspected
Yuck

>mathematics
>gravity
>propagation of light
wat

Cholesky decomposition is pretty nifty desu, as is PCA

Your disgust is unwarranted. The remark, although banal, is also not inelegant, and above all is a clause in an express and completely explicated demonstration.

When I went over it, it felt vaguely familiar. Then I realized that it is the second exercise for the very first section of Andrews' Number Theory (Dover).

Brouwer's fixed point theorem trumps everything else that has been posted in this thread up to this point. "Dude patterns lmao". No, the above is where stuff starts getting weird and the walls start bleeding.

fucking pic related

patricians unite

>FRACTALS INTENSIFY

>Brouwer's fixed point theorem
why? it's just the generalization of the fact that any continuous function from the unit interval [0,1] to itself intersects the identity function f(x) = x at least once.
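
In one dimension that really is just the intermediate value theorem: for continuous [math] f : [0,1] \to [0,1] [/math], let [math] g(x) = f(x) - x [/math]; then [math] g(0) \geq 0 [/math] and [math] g(1) \leq 0 [/math], so [math] g [/math] vanishes somewhere, and that point is fixed.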

...

more like shitpost after shitpost amounts to nothing

...

golden ratio?
I've seen the Feigenbaum constants as more fundamental

en.wikipedia.org/wiki/Feigenbaum_constants
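
For reference, the first constant is the limiting ratio of the parameter gaps between successive period doublings,

[eqn] \delta = \lim\limits_{n \to \infty} \frac{ a_{n} - a_{n-1} }{ a_{n+1} - a_{n} } \approx 4.6692\ldots [/eqn]

and the same number shows up for a huge class of one-hump maps, not just the logistic one, which is why people call it universal.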

You can actually use Brouwer's fixed point theorem to prove Hutchinson's theorem, which is a part of fractal geometry.
It can also be used in game theory to prove Nash's Theorem, so it's pretty cool

R by itself can be used to encode every picture ever made (as a picture is made out of discrete pixels), and it's a much smaller set than P(R^2)

Holy Macaroni

You have a source?

what's your point?

Elliptic curves, obviously.

I KNOW DON'T REMIND ME. TELL ME YOUR SECRETS, DIELECTRIC INERTIA PLANE.

the fixed one

>tfw 5'9"

I think it's a proof of that houses-and-water-lines graph theory problem's unsolvability, by boiling it down to its core nature.

>those red lines on the left
WHERE HAVE I SEEN THOSE BEFORE AND WHY ARE THEY IN THE HIGHER-DIMENSIONAL MANDELBROT SET

I have a question. To me, it seems that the answer is obviously true, but I want to make sure.
Suppose I have two identical chaotic systems. Both start with nearly identical parameters, where the difference in parameters, [math]x[/math], is extremely small, measured both in absolute terms and as a ratio of the two parameters' average. So, if one starts at 100,000 and the other starts at 100,006, you would measure [math]x[/math] both as 6 in absolute terms and as 3/100,003 (because the starting parameters are 100,003, give or take 3).
Then, let the systems run, and see how long it takes for them to deviate from one another by a predetermined amount, say, 1000. Also, graph the amount of deviance as a function of time, for good measure.
Given this information, is it possible to determine things such as "how long you can expect a system to run according to simulations, given the margin of error (deviance)" or "how far a system will deviate in t seconds, assuming we know the starting deviance", etc.?

laughed my ass off at this

it resembles a particular plot of a simple process which quickly goes all qqqwrfhwlkjwlkjfwe; chaotic.

Checking the wiki for chaos theory quickly returns the pic you and I were (almost certainly) both thinking of.

It's the logistic map, plotted as a function of r.
en.wikipedia.org/wiki/Logistic_map
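
If anyone wants to reproduce the picture, it only takes a few lines. A rough sketch (parameter range and iteration counts picked arbitrarily):

[code]
import numpy as np
import matplotlib.pyplot as plt

# Bifurcation diagram of the logistic map x -> r * x * (1 - x), swept over r
rs = np.linspace(2.5, 4.0, 2000)
x = np.full_like(rs, 0.5)
for _ in range(500):              # discard the transient
    x = rs * x * (1 - x)
for _ in range(200):              # then plot where the orbit settles
    x = rs * x * (1 - x)
    plt.plot(rs, x, ",k", alpha=0.25)
plt.xlabel("r")
plt.ylabel("x")
plt.show()
[/code]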

Holy cow, that's exactly what I was thinking of. The resemblance is uncanny. But, why does the bifurcation diagram of the logistic map show up in the Mandelbrot set when it's extrapolated to higher dimensions? What's the fundamental, underlying parallel that ties the two together, besides just both of them being fractals?

I don't have exact answers to your questions but a highly related concept are Lyapynov exponents, which you might enjoy reading about
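
To connect it back to the question: the Lyapunov exponent is exactly the rate you'd fit to that deviation-vs-time graph, since nearby trajectories separate roughly like [math] d_0 e^{\lambda t} [/math]. A crude sketch for the logistic map (all numbers made up for illustration):

[code]
import math

# Track how fast two nearby logistic-map orbits separate; the average of
# log(growth per step) estimates the Lyapunov exponent lambda.
r, d0 = 4.0, 1e-12
x, y = 0.3, 0.3 + d0
growth = []
for _ in range(200):
    x, y = r * x * (1 - x), r * y * (1 - y)
    d = abs(x - y)
    growth.append(math.log(d / d0))
    y = x + (d0 if y >= x else -d0)   # renormalize so the gap stays tiny
lam = sum(growth) / len(growth)
print("estimated Lyapunov exponent:", lam)          # ~ ln 2 ~ 0.693 for r = 4
print("steps to grow from d0 to 1000*d0:", math.log(1000) / lam)
[/code]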

the bifurcation diagram represents the cross-sectional map of a strange attractor. The Mandelbrot set must be, in higher dimensions, an attractor of some kind.

...

The set of sets of R^2 has cardinality at least aleph 1; no fucking shit it contains all possible information. Your example is literally the autistic version of the Library of Babel, which is actually fascinating precisely because it is finite.

check yourself, mate

this is how memes flow. This is how culture diverges. I experimented with it in high school. Start an inside joke. Watch to see the first person who "doesn't get it" and fucks it up. Then count how many people make different variations of the joke. By the time you hit the end of the graph, the original joke is an abstract idea that you can base other jokes around.

"knock knock"

Is this legit coming from the Mandelbrot set alone, or is it some autistic "math art" made by someone who should be thrown from a helicopter?

This is satanic as fuck

Looks like I'm going to jail for CP

When I was a CS undergrad taking a Formal Methods and Automata Theory class, our Professor showed us how to compute the nth Fibonacci number using the formula for the golden ratio. The trick was to use integer math instead of floating point math. And it worked.
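
I don't know if this is exactly the trick the professor used, but one way to run Binet's golden-ratio formula in pure integer arithmetic is to carry [math] \varphi^{n} = \tfrac{ L_n + F_n \sqrt{5} }{2} [/math] around as the integer pair (L_n, F_n). A sketch:

[code]
def fib(n: int) -> int:
    # Multiply numbers of the form (a + b*sqrt5)/2 using only integers:
    # ((a + b*sqrt5)/2) * ((c + d*sqrt5)/2)
    #   = ((a*c + 5*b*d)/2 + ((a*d + b*c)/2)*sqrt5) / 2
    # The halvings are exact because L_n and F_n always share parity.
    def mul(p, q):
        a, b = p
        c, d = q
        return ((a * c + 5 * b * d) // 2, (a * d + b * c) // 2)

    result, base = (2, 0), (1, 1)   # phi^0 and phi^1 as (L, F) pairs
    while n:                        # exponentiation by squaring
        if n & 1:
            result = mul(result, base)
        base = mul(base, base)
        n >>= 1
    return result[1]                # the sqrt5 coefficient is F_n

print([fib(k) for k in range(10)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
[/code]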

I also was similarly impressed with the Gaussian series. Especially since Gauss came up with it as a child.

Mandelbrot is spoopier than life

I made a game of life in my first year of CS work

Also, I have an MSCS, but the toughest CS class I ever took was an undergraduate class on Automated Deduction. And this was one of the text books for that class:
Metalogic: An Introduction to the Metatheory of Standard First Order Logic
amzn.com/0520023560

That stuff is black magic math.