What is entropy?

A measure of the distribution of energy amongst microstates.

[math]S = k_B \log W[/math]
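A minimal numeric sketch of Boltzmann's formula, assuming W equally likely microstates (the function name and constants-handling are mine, not from the thread):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact, by SI definition)

def boltzmann_entropy(W: int) -> float:
    """S = k_B * ln(W) for a system with W equally likely microstates."""
    return K_B * math.log(W)

# One microstate means zero entropy; doubling W adds a fixed k_B*ln(2),
# which is why the logarithm is there: entropies add when state counts multiply.
s1 = boltzmann_entropy(1)
s2 = boltzmann_entropy(2)
```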

It's the amount of information a system contains.

My favorite word. I actually spent my sophomore year of high school trying to remember it; I didn't know the definition well enough to look it up or describe it accurately to anyone. Then, halfway through physics my junior year, it got mentioned, and I wrote it down in so many places so I wouldn't forget it again.

Tl;dr entropy is my favorite word

nice.

Disorder

Messy-ness

Basically the anti-autist

It's the amount of disorder in a system. So, for example, a scrambled egg has higher entropy than an unscrambled one. Furthermore, Gödel's incompleteness theorem implies that entropy can never decrease.

unless you break it, like i'm gonna

Never could really wrap my head around entropy

It seems like some sort of arbitrary explanation scientists came up with, and it's "true", but it's not really saying much.

Ahh yes, chaos can never go down
So is the universe headed toward total chaos with absolutely no order among particles?

Can you explain how Gödel incompleteness implies disorder can never decrease? Genuinely curious, as I didn't quite catch that. I'm pretty poor at math, though.

The pseudo-force that drives ordered systems into disorder, simply because there are tens of orders of magnitude more possible states that a disordered system can occupy.

the expected value of the negative logarithm of the probability mass function of a random variable
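That's the Shannon entropy; a minimal sketch, assuming the pmf is given as a plain list of probabilities (the helper name is mine):

```python
import math

def shannon_entropy(pmf) -> float:
    """H(X) = E[-log2 p(X)] = -sum over x of p(x) * log2(p(x)), in bits."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit per flip.
h_fair = shannon_entropy([0.5, 0.5])
# A biased coin is more predictable, so it carries less entropy per flip.
h_biased = shannon_entropy([0.9, 0.1])
```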

The inevitable return to the uniform.

Statistics

[math]dS = \delta q_{\mathrm{rev}}/T[/math]
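The Clausius form integrates cleanly for a reversible isothermal process; a standard textbook worked example:

[math]\Delta S = \int \frac{\delta q_{\mathrm{rev}}}{T} = \frac{q_{\mathrm{rev}}}{T}[/math]

Melting ice at 273 K absorbs roughly 334 J per gram, so [math]\Delta S \approx 334/273 \approx 1.22[/math] J/(g·K).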

>godel's incompleteness theorem implies that entropy can never decrease
lol?

Just think about his example: can you put back an unscrambled egg? Nope. Once entropy starts (and it's ALWAYS happening, even as we speak, in all systems) it never ends. The only thing you can do is try to lower it, but you'll never eliminate it, because it's a fundamental part of our simulated reality.

>simulated reality
uh-huh

Entropy is not disorder

youtube.com/watch?v=vX_WLrcgikc

The expectation value E[I] of the information
I(x) := -log(p(x))

It's not quite disorder. One way I like to think of it is as a propensity to disorder; entropy grows with the number of thermodynamically accessible microstates of a system.

Let's take the classic example of the ball in a box, from your stat mech 101 class (most schools call it "Statistical Physics", more or less). If one box has three compartments and the other box has four compartments (assuming each compartment can hold the ball), the number of accessible "microstates" of the "system" with four compartments is greater than that of the system with three compartments. The lack of order (for lack of a better term) is the same -- each box has one ball, in one compartment.
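The compartment example can be checked by brute-force counting; a sketch assuming each compartment is one equally likely microstate (the function name is mine):

```python
import math

def entropy_over_kb(compartments: int) -> float:
    """S / k_B = ln(W), where W = number of compartments one ball can occupy."""
    return math.log(compartments)

# Four compartments -> more accessible microstates -> higher entropy,
# even though each box holds exactly one ball in one compartment.
s3 = entropy_over_kb(3)
s4 = entropy_over_kb(4)
```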

how does entropy relate to enthalpy?

lol

>take a big bowl
>take 10 apples and 10 oranges
>throw them all into the bowl
>mix
Are you saying I can't sort the items from the bowl into two distinct piles of different fruit anymore?

>can you put back an unscrambled egg?
yes

this.

The work you must put into a system to correct disorder.

It's some shit that physicists made up because they were butthurt about the laws of thermodynamics being empirical.