You can set off a stick of dynamite. But you can’t put it back together again. The amount of disorder in the universe always increases.
Well, technically you could take all of the gases etc. produced in the explosion, refine them, process them, and after a lot of work end up with a stick of dynamite again. But by introducing a lot of order to that random swirl of gases, you will introduce even more disorder in the rest of the universe: e.g. you’re burning stuff to generate electricity to power the machines, you’ve likely used pure chemicals as part of your reactions and now you have a load of mixed chemical waste, etc. The amount of disorder in the universe always increases.
The “disorder” in this example is technically called “entropy”.
Entropy is NOT disorder. It’s often called that because entropy LEADS to disorder, but they are not the same thing. Entropy is simply a statistic that arises because energy is far more likely to spread out than to concentrate into one specific state.
For example, imagine you have 100 cups and a pitcher full of water, and you put drops of that water into the cups at random. What are the chances that *all* of the water ends up in 1 or 2 cups? It’s definitely possible, but very unlikely; it’s far more likely that the water gets spread out between the cups evenly (relatively speaking, of course, some cups will have more water than others). Each specific way of placing the individual drops is called a microstate, and each overall pattern you could observe, such as “all the water in one cup” or “the water spread roughly evenly”, is called a macrostate. Some macrostates are far more likely than others simply because many more microstates produce them: there are vastly more ways to spread the water out than to land every drop in one cup.
Entropy is essentially just a way to measure how likely a state is. A state that can come about in more ways is said to have high entropy. And that’s why entropy is said to always increase in a closed system: the system will always tend to evolve toward a state that’s more likely.
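The cup analogy is easy to check by simulation. Here’s a minimal Python sketch (the cup and drop counts are arbitrary choices, not from the original comment) that repeatedly drops water into random cups and counts how often it all lands in just one or two of them:

```python
import random

def distribute_drops(n_cups=100, n_drops=1000):
    """Drop each unit of water into a cup chosen uniformly at random."""
    cups = [0] * n_cups
    for _ in range(n_drops):
        cups[random.randrange(n_cups)] += 1
    return cups

random.seed(0)
trials = 10_000
concentrated = 0  # trials where ALL the water ended up in at most 2 cups
for _ in range(trials):
    nonempty = sum(1 for c in distribute_drops() if c > 0)
    if nonempty <= 2:
        concentrated += 1

print(f"all water in <= 2 cups: {concentrated} out of {trials} trials")
```

You can run this as many times as you like; the concentrated outcome essentially never shows up, because almost every microstate belongs to the “spread out” macrostate.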
Now, in physics, when people talk about entropy they’re usually talking about energy. So instead of the water used in the previous analogy, it’s energy that gets distributed across the system, and instead of cups, it’s atoms, molecules, and other particles and waves that the energy gets distributed among. One thing to remember is that there’s nothing in physics that says entropy HAS to increase; it’s just that entropy is extremely (and I can’t stress the extremely enough) likely to increase.
Edit: One example my professor gave that really resonated with me during the discussion of entropy is that there is absolutely nothing in physics stopping all the air in the room you’re in from suddenly moving to one side and suffocating everyone on the other side. The only thing that keeps the air from doing that is probability. It’s just incredibly unlikely that the trillions of air molecules, whose velocities are more or less random, would all randomly start moving in the same direction toward one side of the room; it’s much more likely that they stay spread out relatively evenly. That is entropy.
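You can put a rough number on how unlikely that is. If each molecule is (to a first approximation) equally likely to be in either half of the room, the chance that all N of them sit in one half at the same instant is about (1/2)^N. A quick Python check:

```python
import math

# Probability that all N molecules happen to be in the left half at once
for n in (10, 100, 1000):
    print(f"N = {n:>4}: p = {0.5 ** n:.3e}")

# A real room holds roughly 1e25 molecules; the float would underflow,
# so compute the base-10 exponent instead:
n_room = 1e25
exponent = n_room * math.log10(2)
print(f"N = 1e25: p = 10^(-{exponent:.3g})")
```

Already at a thousand molecules the probability is around 10^-301; for a real room the exponent itself is on the order of 10^24. “Never happens” is an understatement.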
The most intuitive way to think of entropy is to consider it as a measure of how disordered a system is.
For example, consider a fresh pack of cards. It’s in one order and one order only, and it’s very easy to tell when the deck is no longer in that order. The fresh pack has very low entropy, because there’s only the one arrangement it can be in.
Now, if you shuffle the deck, so that the cards are completely randomized, you’ve raised the entropy of the deck. You can rearrange the individual cards very freely without damaging your ability to say “Yes, that’s a shuffled pack of cards.”
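One standard way to make the card example quantitative (this uses the textbook trick of taking entropy as the log of the number of possible arrangements, which the comment above doesn’t spell out) is just to count orderings:

```python
import math

arrangements = math.factorial(52)   # distinct orderings of a 52-card deck
print(f"52! ~ {arrangements:.3e}")  # about 8.066e+67

# Entropy as log2(number of states): a fully shuffled deck could be any
# of the 52! orderings...
print(f"shuffled deck: {math.log2(arrangements):.1f} bits")
# ...while a fresh deck is exactly one specific ordering: log2(1) = 0.
print(f"fresh deck: {math.log2(1):.1f} bits")
```

The fresh pack sits at zero because there is only one arrangement consistent with “factory order”; shuffling opens up all 52! of them.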
Another way to think of entropy is the ability to pull useful work out of a system. For example, you need a temperature difference to do any work with a heat engine; if there’s no gradient, nothing’s going to want to move from point A to point B. You need *low entropy*, a condition of order and being able to say “This is different than that,” in order to perform work.
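The heat-engine point can be made concrete with the textbook Carnot limit, which bounds how much of the heat flowing through an engine can become work (the specific temperatures below are made-up illustrations):

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum fraction of heat convertible to work (temperatures in kelvin)."""
    return 1 - t_cold / t_hot

# A big temperature difference (a low-entropy situation) means lots of
# extractable work...
print(f"{carnot_efficiency(600, 300):.0%}")  # 50%
# ...while no difference at all (everything evened out) means no work:
print(f"{carnot_efficiency(300, 300):.0%}")  # 0%
```

Once the gradient is gone, the efficiency hits zero: nothing “wants” to move from A to B anymore, exactly as described above.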
One way to think about entropy is from a statistical point of view. This basically says that if there are more ways of arranging things which result in the same outcome, the entropy of that outcome is higher. For example, say you have 10 balls in a row: 5 identical black and 5 identical white. There are exactly 2 ways to arrange them in an alternating pattern, and also only 2 ways to have all 5 of each color touching. But let’s say I only want the 5 black balls grouped, with the white balls wherever. There are exactly 6 ways to arrange that (the block of 5 black balls can start at any of 6 positions), therefore the state of “5 black balls touching” has higher entropy than “balls alternating”. And so on.
Another example is a system of two dice. When you roll them, you can achieve a combined result of 7 in six different ways, while for results of 2 and 12 there is only one way each.
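Both of these counting examples are small enough to verify by brute force. A Python sketch that enumerates every case:

```python
from itertools import combinations, product
from collections import Counter

# Dice: count how many of the 36 equally likely rolls give each total
ways = Counter(a + b for a, b in product(range(1, 7), repeat=2))
print(ways[7], ways[2], ways[12])  # 6 ways to roll a 7; 1 way each for 2 and 12

# Balls: enumerate every distinct row of 5 black (B) and 5 white (W) balls
rows = []
for black_positions in combinations(range(10), 5):
    row = ["W"] * 10
    for i in black_positions:
        row[i] = "B"
    rows.append("".join(row))

alternating = [r for r in rows if all(r[i] != r[i + 1] for i in range(9))]
grouped = [r for r in rows if "BBBBB" in r]  # all 5 black balls touching
print(len(rows), len(alternating), len(grouped))  # 252 rows, 2 alternating, 6 grouped
```

Out of the 252 distinct rows, only 2 alternate while 6 have the blacks grouped, so “blacks grouped” is the higher-entropy macrostate, and a total of 7 on the dice is six times as likely as a 2.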
In the physical world, imagine a perfect crystal, with every atom exactly where it should be. There is only 1 way of doing that, therefore it’s the lowest-entropy arrangement. Nature does not like low entropy, so in the real world this is hard or impossible to achieve, depending on the scale. There are always some defects in the structure, which introduce some entropy, and that extra entropy in turn lowers the overall free energy of the system. What I am trying to describe here is more or less a thermodynamic potential, called the Gibbs free energy if you want more reading.
All the information in the universe becomes less orderly over time: carbon reacts, atoms decay, etc. In the early universe it was all the same type of matter, and now it’s complex. It would be very difficult, if not impossible, to put it back together the way it was.