What is a precise definition of entropy?

What is a precise definition of entropy?
What is a precise definition of information?
Why does entropy increase with time?
Why can't we remember the future?

well?

why does Google exist?

>What is a precise definition of entropy?

It's a measure of disorder. The higher the entropy, the more disorder in your system. Suppose you have an ideal crystal that can be described by three basis vectors and one translation vector. You can scale the crystal up as much as you want; in an ideal crystal, all points or "nodes" always keep the same distances to each other. Here entropy is zero. Now go into the real world, where the positions of those nodes vary: you get "chaos", and entropy increases.
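If you want to put a crude number on that, here's a toy sketch (my own, with made-up values, not a real entropy calculation): an "ideal crystal" is just a regular grid of nodes, and "disorder" is imitated by jittering each node a little. The spread of the nearest-neighbour spacings is zero for the ideal grid and grows with the jitter.

[code]
import numpy as np

rng = np.random.default_rng(0)
ideal = np.arange(20, dtype=float)                     # 1-D lattice, spacing 1.0
jittered = ideal + rng.normal(0.0, 0.1, ideal.shape)   # nodes displaced a little

for name, pts in (("ideal", ideal), ("jittered", jittered)):
    gaps = np.diff(np.sort(pts))                       # nearest-neighbour spacings
    print(f"{name:9s} spacing spread = {gaps.std():.3f}")
[/code]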

>What is a precise definition of information?

Depends. CS people will say something different than physicists. Usually information is a coordinate. Where is a point? How does it move? Everything is essentially a problem of geometry.

>Why does entropy increase with time?

Suppose you have a closed system with two different gases, each at a different temperature. According to the zeroth law of thermodynamics, those gases will move toward thermodynamic equilibrium, meaning both must end up at the exact same temperature. In a closed system the reverse can never happen: the particles just get more and more disordered as time passes.
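If you want to see the "more and more disordered" part with actual numbers, here's a toy sketch (my own illustration, a mixing model rather than a temperature simulation): N labelled particles start all on one side of a partitioned box and hop sides at random. The number of ways W to have n of them on the left is the binomial coefficient C(N, n), and ln W drifts up toward its maximum at n ≈ N/2, i.e. toward the most mixed macrostate.

[code]
import math
import random

random.seed(0)
N = 50                      # total particles
n_left = N                  # start fully unmixed: everything on the left

for step in range(1, 2001):
    # pick a random particle; it hops to the other side of the partition
    if random.randrange(N) < n_left:
        n_left -= 1
    else:
        n_left += 1
    if step % 400 == 0:
        ln_W = math.log(math.comb(N, n_left))
        print(f"step {step:4d}  left = {n_left:2d}  ln W = {ln_W:5.2f}")
[/code]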

>Why can't we remember the future?

?

>?
what's not to understand?
physical laws are symmetric w.r.t. time, yet we can only remember the past, not the future.

>disorder

It's a popsci analogy. Disorder is subjective.

Even Wikipedia has a non-popsci definition.

>It's a popsci analogy. Disorder is subjective.

What? No it isn't. If we go back to the crystal, disorder occurs when your translation vector becomes a function T(x,y,z,t), i.e. when it isn't constant. Even Wikipedia says it IS disorder. I'm no chemist and my knowledge of this comes from one mat sci semester, but mathematically it is easy to explain. I will admit, though, that I don't yet know how to quantify disorder.

Time really is not a trivial concept. It starts with our very notion of time, which is defined circularly by the motion of the bodies around us.

>Time really is no trivial concept
you're a weirdo, go away

A predestined future doesn't necessarily mean you must already be aware of it. Memories are written in the present for use in the future, when recalling the past; to say we're conscious of the future would be like saying your book is finished the moment you write the first letter.

All of the shit you said makes no sense, and is probably the worst explanation I've seen

You're smart and clearly looked into it, so total props, but you're wrong on time and too simplistic on entropy. OP's questions set it up well, in that entropy is best understood as a measure of information within a system.

Just calculate disorder as the reciprocal of order.

Can you explain what is wrong with my notion of time? In the past it was defined by certain intervals in the motion of celestial objects and fractions of those intervals (hours, minutes, seconds, etc.). Now we count how many times some atom oscillates. That's what I mean. Doing this is not an explanation of time.

lost it at "why can't we remember the future?"
Thanks OP, you just made my hour

it's a legitimate question, you rancid fucktard. protip: physicists don't actually know why there is this apparent asymmetry in our day-to-day experience when the physical laws are perfectly symmetrical (that is: reverse the movie of a bowl of cereal shattering and the result is still a solution of the equations that describe our universe; the state of the assembled bowl is just unlikely, so the question arises why we started out with such an unlikely state).
I believe the current working hypothesis is that, for some magical (unknown) reason, the universe was in a state of very, very low entropy in the past.

But from what I understand, there's nothing in the laws of physics that would prevent a creature from remembering the future.

Something like the log of the number of ways to get this configuration, or the log of the number of possible configurations, or so.
Can't remember
Whatever.

>what is entropy
Depends: are we referring to entropy in terms of quantum mechanics, information theory, or thermodynamics? A good overall definition, though, is the number of microscopic configurations of a system that correspond to a given macroscopic state.

>what is information.
Again, depends who you ask. I myself like the definition of 'a quantity represented by a certain sequence of things'. But again, it depends on the context.
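For what it's worth, the standard CS-style formalization is Shannon's: a symbol with probability p carries log2(1/p) bits, and the entropy of a source is the average of that over its symbols. Quick sketch (whether this is what's meant by 'a quantity represented by a sequence of things' is another matter):

[code]
import math
from collections import Counter

def shannon_entropy(seq):
    """Average bits per symbol of the empirical distribution of seq."""
    counts = Counter(seq)
    total = len(seq)
    return sum((c / total) * math.log2(total / c) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))   # 0.0 bits -- no surprise, no information
print(shannon_entropy("abababab"))   # 1.0 bit per symbol
print(shannon_entropy("abcdabcd"))   # 2.0 bits per symbol
[/code]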

>why does entropy increase over time
It doesn't always increase over time. Just in isolated systems.

>why can't we remember the future
I'm no expert in human physiology, but it could be something to do with our brain: if it isn't capable of remembering the future, then we simply can't. Just an idea.

>Entropy
shit wants to spread out and different kinds of shit want to be mixed.

basically the universe is trying to diffuse itself.

>What is a precise definition of entropy?
The entropy S of a macrostate is defined by [math]S=k \ln (\Omega)[/math],
where [math]\Omega[/math] is the number of microstates that belong to that macrostate.

>Why does entropy increase with time?
It's by far more probable than the option of decreasing with time.
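Quick numerical check of both answers with a toy system (my own example, values assumed): N two-state particles, macrostate = "n of them in the up state", so [math]\Omega(n) = \binom{N}{n}[/math] and [math]S = k \ln(\Omega)[/math]. The balanced macrostate has vastly more microstates than a lopsided one, which is the "by far more probable" part.

[code]
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K
N = 100                     # toy system: 100 two-state particles

for n in (50, 10):          # a balanced and a lopsided macrostate
    omega = math.comb(N, n)             # number of microstates
    S = k_B * math.log(omega)           # S = k ln(Omega)
    print(f"n = {n:3d}  Omega = {omega:.3e}  S = {S:.3e} J/K")

# how much likelier is the balanced macrostate than the lopsided one?
print(f"ratio = {math.comb(N, 50) / math.comb(N, 10):.1e}")   # ~ 5.8e15
[/code]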

I remember the future.

The future isn't what it used to be

Entropy decreases all the time. Put a hot bar of steel into a cup of cold water. The entropy of the water increases, but the entropy of the steel decreases. Look, I broke the second law of thermodynamics... except that I didn't. The total entropy of the system (that is, both the water and the steel) increases. We need to make sure we're talking about the total entropy of an isolated system.
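Rough numbers to back that up (all values assumed just for illustration; with roughly constant heat capacities, each part changes by dS = m·c·ln(T_final/T_initial)):

[code]
import math

# assumed values, just for illustration
m_s, c_s, T_s = 0.5, 490.0, 600.0     # steel:  kg, J/(kg K), initial K
m_w, c_w, T_w = 1.0, 4186.0, 290.0    # water:  kg, J/(kg K), initial K

# common final temperature from the energy balance
T_f = (m_s * c_s * T_s + m_w * c_w * T_w) / (m_s * c_s + m_w * c_w)

dS_steel = m_s * c_s * math.log(T_f / T_s)   # negative: the steel cools down
dS_water = m_w * c_w * math.log(T_f / T_w)   # positive: the water heats up

print(f"T_final  = {T_f:.1f} K")
print(f"dS_steel = {dS_steel:+.1f} J/K, dS_water = {dS_water:+.1f} J/K")
print(f"dS_total = {dS_steel + dS_water:+.1f} J/K  (> 0, second law holds)")
[/code]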

Wow I bet you feel smart.

Sometimes

>Why cant we remember the future?
Mr. Nobody