If entropy always increases then why do things naturally move to the most stable energy level...

If entropy always increases, then why do things naturally move to the most stable energy level? The most stable energy level does not always mean the highest entropy: for example, atoms of chlorine will eventually react with atoms of hydrogen to form hydrogen chloride, and the hydrogen chloride formed has less entropy than the free hydrogen and chlorine atoms. Another example would be self-organizing units such as crystals. When liquid water loses enough thermal energy through IR emission, the water molecules self-arrange into a hexagonal lattice, thus lowering the entropy. How can this be???

That's not the definition of entropy. You cannot give anyone a physical example of entropy in the world. You are not smart.

Entropy is an increase in information, so things can naturally be high entropy and energy stable. Do you understand what an increase in information means here?

If you isolated a bunch of pure hydrogen and pure chlorine in a space together, what would eventually happen?

You're confusing a bunch of related concepts. Matter will always seek the lowest free-energy state possible at a given set of conditions. That free energy (Gibbs free energy at constant temperature and pressure) takes into account both enthalpy and entropy.
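A rough numeric sketch of that free-energy bookkeeping, using approximate textbook-style values for H2(g) + Cl2(g) -> 2 HCl(g); the numbers below are illustrative assumptions, not measurements:

```python
# Sketch: spontaneity via Gibbs free energy, dG = dH - T*dS.
# Approximate standard values for H2(g) + Cl2(g) -> 2 HCl(g), treated as illustrative.
dH = -184.6e3    # J per mole of reaction (exothermic)
dS = +20.0       # J/(mol*K), entropy change of the reacting system itself
T = 298.15       # K

dG = dH - T * dS
print(f"dG = {dG / 1000:.1f} kJ/mol")   # strongly negative -> reaction is favourable
```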

Also, the increase in entropy happens at the scale of the full system, surroundings included. Take your example of water freezing. The water freezes and forms ice; in doing so, heat is released to the surroundings. You have to consider both the entropy change of the water as it freezes and the entropy change of the surroundings as they gain that heat over the course of freezing.
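A minimal numeric sketch of that bookkeeping, assuming the latent heat of fusion of water (about 6.01 kJ/mol) and a surroundings temperature a bit below 0 °C; both values are illustrative assumptions:

```python
# Entropy bookkeeping for water freezing at 273.15 K with colder surroundings.
dH_fus = 6.01e3     # J/mol, latent heat of fusion of water (approximate)
T_water = 273.15    # K, freezing point
T_surr = 263.15     # K, assumed surroundings at -10 C

dS_water = -dH_fus / T_water   # the water loses entropy as it orders into ice
dS_surr = +dH_fus / T_surr     # the surroundings gain entropy from the released heat
dS_total = dS_water + dS_surr

print(f"dS_water = {dS_water:+.2f} J/(mol*K)")
print(f"dS_surr  = {dS_surr:+.2f} J/(mol*K)")
print(f"dS_total = {dS_total:+.2f} J/(mol*K)  (> 0, so freezing is allowed)")
```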

Look up the definition of ENTROPY. You are confused and you are just making things up.

No, I don't think I'm confused. I think you just came across something new to you.

>I just came across a wikipedia definition so anyone who talks to me about the topic in a non-obvious way that doesn't instantly relate to that definition is wrong and confused

your kind is the cancer of Veeky Forums honestly

Bruh information entropy isn't relevant to a chemical reaction.

If I post the dictionary definition of ENTROPY this thread will die.

>ENTROPY
>ENTROPY
Are you fucking retarded

They are one and the same, just phrased differently.

It's an acronym
>Easy
>Noodles
>Too
>Risky,
>Only
>Penis.
>Yes?

>Are you fucking retarded

No

I think you're misunderstanding what informational entropy is. In reference to physical systems, the "informational" nature of entropy relates to the information needed to specify the exact state of the system, which is just a way of abstracting from the statistical definition of entropy.
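Concretely, that "information needed to specify the exact state" is just the Gibbs/Shannon formula over microstate probabilities. A minimal sketch with made-up distributions, purely to illustrate the abstraction:

```python
import math

def gibbs_entropy(probs, k_B=1.380649e-23):
    """S = -k_B * sum(p * ln p) over microstate probabilities."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

uniform = [0.25] * 4                  # maximum uncertainty about which microstate
peaked = [0.97, 0.01, 0.01, 0.01]     # state is nearly pinned down

print(gibbs_entropy(uniform))   # larger: more information needed to specify the state
print(gibbs_entropy(peaked))    # smaller
```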

Your post:
>Entropy is an increase in information, so things can naturally be high entropy and energy stable.

Doesn't really relate to what OP is talking about. Per OP's example, HCl has lower entropy than the free H and Cl atoms it forms from. But you also need to consider the entropy increase caused by the exothermic nature of the reaction. The heat released to the surroundings creates a greater entropy increase than the decrease seen inside the reaction itself.
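A rough numeric check of that claim for OP's atom-recombination example, assuming approximate standard molar entropies for H(g), Cl(g), and HCl(g) and an H-Cl bond energy of roughly 431 kJ/mol; all values are illustrative:

```python
# Total entropy check for H(g) + Cl(g) -> HCl(g) at room temperature.
dS_rxn = 186.9 - (114.7 + 165.2)   # J/(mol*K): HCl minus free H and Cl atoms (approx.)
dH_rxn = -431e3                    # J/mol: roughly the H-Cl bond energy released as heat
T = 298.15                         # K

dS_surr = -dH_rxn / T              # entropy the surroundings gain from the released heat
dS_total = dS_rxn + dS_surr

print(f"dS_rxn   = {dS_rxn:+.0f} J/(mol*K)   (system loses entropy)")
print(f"dS_surr  = {dS_surr:+.0f} J/(mol*K)  (surroundings gain far more)")
print(f"dS_total = {dS_total:+.0f} J/(mol*K)  (> 0, so the reaction can proceed)")
```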

Heat increases the information of state, and that's not just me saying so. State information is not an abstraction, god damn it.

I have read and understood. Please explain "atomic/chemical entropy". I'm listening.

It's unnecessary. If you want to go down the autist hole you can go full Landauer and relate kT ln 2 amounts of heat to bits of randomness in your system, but there's no reason to do that when the statistical framework exists.
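For scale, a minimal sketch of that Landauer bookkeeping, assuming room temperature (kT ln 2 is the minimum heat released per bit erased):

```python
import math

k_B = 1.380649e-23   # J/K, Boltzmann constant
T = 300.0            # K, assumed room temperature

heat_per_bit = k_B * T * math.log(2)    # Landauer limit: minimum heat to erase one bit
print(f"{heat_per_bit:.3e} J per bit")  # ~2.9e-21 J, utterly negligible per bit
```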

I'm talking about classical, statistical thermodynamic entropy. The information theory side of entropy isn't necessary to discuss physical systems and doesn't play any part in answering OP's question. Not to mention the relationship between information entropy and thermodynamic entropy is still up for debate.

>It's unnecessary...

Awwww, back to Websters

Things end up in stable states because that's the definition of stable. Those are the only states that you'd expect to persist.

In nature, stable states don't have readily accessible energy, so they can't do work.

Note that high entropy resembles uniformity. At the macroscale nothing seems to be moving because the net movement is zero.

In picture: Low entropy.

> I'm talking about classical, statistical thermodynamic entropy.

Are you?

>High entropy resembles uniformity
It resembles non-uniformity

Hexagonal

Hexagonal

I don't know what this is. Is this what non-autism looks like?

youtube.com/watch?v=vX_WLrcgikc

the world is a system that has chaotic behaviour.
entropy is a mere consequence of the aforementioned chaotic behavior

entropy is not something you can bullshit about with stoner science

By lowering the energy, you produce heat. The more heat you produce, the more microscopic configurations you have. A higher number of microscopic configurations means higher entropy.
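A toy sketch of that microstate counting, using an Einstein-solid style model with assumed sizes (dumping in more heat means more energy quanta to spread around, hence more configurations and higher S = k_B ln Omega):

```python
import math

def microstates(q, N):
    """Ways to distribute q energy quanta among N oscillators (Einstein-solid toy model)."""
    return math.comb(q + N - 1, q)

def boltzmann_entropy(omega, k_B=1.380649e-23):
    """S = k_B * ln(Omega)."""
    return k_B * math.log(omega)

N = 100                   # oscillators, an assumed toy size
for q in (10, 50, 100):   # more heat in the solid = more quanta to distribute
    print(q, boltzmann_entropy(microstates(q, N)))
```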

Entropy doesn't cause particles to move to the most stable energy level. But the particles' tendency to move to the most stable energy level is observable as an increase in entropy.
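One statistical way to see that connection is the Boltzmann distribution over energy levels; a small sketch with made-up level spacings and temperatures:

```python
import math

k_B = 1.380649e-23   # J/K

def level_populations(energies, T):
    """Boltzmann weights p_i proportional to exp(-E_i / kT), normalized."""
    weights = [math.exp(-E / (k_B * T)) for E in energies]
    Z = sum(weights)
    return [w / Z for w in weights]

levels = [0.0, 1e-21, 2e-21]    # J, assumed toy level spacings
for T in (10.0, 300.0, 3000.0):
    print(T, [round(p, 3) for p in level_populations(levels, T)])
# At low T nearly everything sits in the lowest (most stable) level;
# at high T the populations spread out over the levels.
```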

I'm so glad I found a sign of intelligence on the internet today.

You made a mistake in your reasoning. To account for the entropy, you have to consider the whole system.

Let's take an example:
>This hydrogen chloride formed has less entropy than the free hydrogen and chlorine atoms

Yes, HCl has less entropy than H and Cl. However, the formation of HCl releases a large amount of energy, which increases the entropy.
More precisely: the formation of HCl releases heat, which increases the movement of the surrounding molecules. In the end, the whole system has more entropy.

I hope I was clear, sorry for my english.

Agreed. The error was in only considering the isolated system (as is typically done with energy), but with entropy you must also include the environment: entropy as a whole (the universe's?) is always increasing. By contrast, the total energy of the universe stays constant.

Those 2 states have the same entropy.

kek

Negative entropy is information my dude