I’m not sure how to answer exactly, so here’s a brain dump of my understanding of entropy
Entropy is basically the tendency of energy to equalize into a lower energy state, converting a portion of the difference into heat, some of which escapes into the universe. It’s a statistical thing - a general tendency that becomes predictable given the huge number of particles involved in anything near the human scale. Like compressed air - every atom is moving at some speed in a random direction (we model this with temperature)
If you pop a balloon, the air rushes out because there’s more stuff to bounce off in the area of the balloon being popped, and less in the less dense surroundings. So, on average, the air bounces outwards, and the pressure (another model describing density + kinetic energy) equalizes.
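That “on average, the air bounces outwards” intuition is easy to see in a toy simulation. Here’s a minimal sketch in Python (an Ehrenfest-urn-style model, not real gas dynamics - the particle count and step count are made up for illustration): particles hop randomly between “inside” and “outside”, and purely because the crowded side gets picked more often, the counts drift toward an even split:

```python
import random

# Toy "popped balloon": N particles all start inside (side A).
# Each step, pick one particle uniformly at random and move it to
# the other side. The crowded side gets picked more often, so the
# net flow is outward until the two sides are roughly equal.
random.seed(42)

N = 1000
in_balloon = N  # all particles start inside

for step in range(20_000):
    # The picked particle is inside with probability in_balloon / N
    if random.random() < in_balloon / N:
        in_balloon -= 1  # it hops out
    else:
        in_balloon += 1  # it hops back in

print(in_balloon)  # hovers near N/2 - the "pressure" has equalized
```

No particle knows anything about pressure - the equalization falls out of the counting alone, which is the statistical point.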
Now, if you have liquid air that is being actively cooled, air molecules bouncing in are going to transfer energy into the liquid, and be captured. The liquid is also going to heat up a bit, and the hotter it is, the more molecules are going to fly out. We usually model this with temperature + pressure, but gravity plays a big role too.
So normally, entropy likes to average things out and prefers randomness - it applies to all sorts of things, like how chemical potential energy tends to get released as kinetic energy over time.
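The “prefers randomness” part is really just counting. A quick sketch, assuming a toy box of 100 particles that can each sit in the left or right half: the even split corresponds to astronomically more microscopic arrangements than a lopsided one, which is why it’s what you actually observe:

```python
from math import comb

# The number of distinct arrangements (microstates) with k of N
# particles on the left side is the binomial coefficient C(N, k).
N = 100
ways_all_left = comb(N, N)       # every particle on the left: exactly 1 way
ways_even     = comb(N, N // 2)  # 50/50 split: on the order of 1e29 ways

print(ways_all_left)
print(ways_even)  # the even split is ~1e29 times more likely
```

Nothing “pulls” the system toward the middle - there are simply vastly more ways to be there.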
But then look at the Earth - we have pressure waves in the air constantly. We say that’s because energy is being added to the system via heat from the sun, and it can even create systems that turn those pressure waves into vortices that can make ice in a hot place
And then you look at stars - this finally clicked for me after I sat in on a physics class explaining pulsating variable stars. They pulse outward as pressure pushes the gas apart, then pulse back in as gravity pulls it together.
Then we can look even further - stars pour out energy through fusion and scatter themselves far and wide, seeding the next generation of stars. We thought that was just the initial energy of the big bang converting to lower energy states, but then you have dark matter and dark energy, which we invented to explain gaps in the models… Now we think maybe the laws of physics are less universal than we thought, or maybe higher dimensions are interacting with the universe
Entropy is just another model - things generally transition to lower energy states and convert their energy to heat… But there are endless cycles that do the opposite. Entropy is a pretty compelling tendency in a closed system, but closed systems don’t actually exist - it could be that there are larger and larger cycles that oscillate between local entropy and the generation of local regions of higher energy
Entropy doesn’t disappear if we can nail physics down at the subatomic level - it’s just the statistical behavior of huge numbers of particles. It might disappear if we go the other direction - what if every black hole spawns a new universe? Can you go down the rabbit hole infinitely, creating smaller and smaller energy differentials through new universes? Maybe if we get deeper into quantum mechanics we’ll find that infrared energy spontaneously transitions into hydrogen, which forms into new stars, keeping the cycle going forever
Entropy is a very useful model, though. Maybe it disappears over large enough scales, but it most certainly exists on a local level - complex, dynamic things break down into simpler things, and energy input temporarily reverses this process, but in doing so a portion is converted to heat, and a differential is required to turn heat into something more complex, like electricity or chemical energy
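That last point - that you need a differential to extract useful work from heat - is the Carnot limit. A minimal sketch of the standard formula (efficiency = 1 − T_cold / T_hot, temperatures in kelvin; the example temperatures are made up):

```python
# Carnot limit: the fraction of a heat flow that can become work
# depends on the temperature *differential*, not the amount of heat.
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    return 1.0 - t_cold_k / t_hot_k

# A big differential can extract a lot; a tiny one, almost nothing.
print(carnot_efficiency(600.0, 300.0))  # 0.5
print(carnot_efficiency(310.0, 300.0))  # ≈ 0.032
```

Heat at a uniform temperature is useless for work no matter how much of it there is - which is exactly why “everything the same temperature” is the high-entropy end state.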
So practically, I’d say the answer to your question is no, entropy is a very useful model regardless of what more we might learn, but in a truer sense who knows? We don’t understand physics nearly as well as we think.
I think there’s a new wave of physics that will break a lot of our assumptions over the coming decades - we’re finding more and more gaps in our models, which is a very exciting thing
Natanael@slrpnk.net 1 year ago
The trick is that the more closely you model things, the more energy you need to expend to compute the model, and a computer cannot perfectly model itself - so you still increase unmodeled, unknown entropy somewhere, even if you have one carefully controlled closed system