Wednesday, June 06, 2007

Entropy Production and the Best of All Worlds

[I'll get back to my usual schtick soon enough. If you're interested, I tried to keep it accessible.]

Chemical engineering has historically been a phenomenological science, meaning that it's based more on observed phenomena than it is built bottom-up from the sub-microscopic nuts and bolts of the universe. If you were to, oh I don't know, go to graduate school in the field, you'd find a good third of the curriculum devoted to Transport Phenomena, which describe the rate at which thermodynamic variables--momentum, thermal energy, chemical potential (and charge too, though it's not usually lumped in there)--dissipate along gradients. Generalized equations of change can be set up describing how these quantities are conserved, how a system will organize itself under given boundary and initial conditions, and how fast it will evolve. Classic and indispensable expressions fall out of these basic equations of change: your Maxwell's, your Navier-Stokes, your Fick's. External forces are allowed, the variables are all coupled (temperature differences can cause mass flow, electrical currents cause heat, etc.), and things get hairy in a hurry for all but the most basic systems. It's like they say about the weather: you can't even define the conditions and variables well enough to think about solving it.
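To make that a little more concrete, the equations of change all share one skeleton: accumulation plus convection equals diffusion plus generation. In standard textbook notation (a generic sketch, not any particular system from above):

```latex
\underbrace{\frac{\partial \psi}{\partial t}}_{\text{accumulation}}
+ \underbrace{\nabla \cdot (\psi \mathbf{v})}_{\text{convection}}
= \underbrace{-\,\nabla \cdot \mathbf{J}_{\psi}}_{\text{diffusion}}
+ \underbrace{s_{\psi}}_{\text{generation}}
```

where $\psi$ is the density of whatever is being conserved and $\mathbf{J}_{\psi}$ is its diffusive flux. The named laws are the constitutive closures for that flux: Fick's law $\mathbf{J} = -D\,\nabla c$ for mass, Fourier's law $\mathbf{q} = -k\,\nabla T$ for heat, Newton's law of viscosity for momentum. Coupling shows up when the flux of one quantity picks up terms in another quantity's gradient.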

Thermodynamics, another third of the core curriculum, usually discusses equilibrium, the end state of things. Thermo tells you the boundaries: what you can get if you squeeze this or bias that, how much energy is in that gradient, how much of it you can actually use. Thermo is the finger-pointing nanny laying down the rules. Transport phenomena are the clever children figuring out how to get away with shit inside of them. The equations of change have the first two laws of thermodynamics built in, but the "how" of them comes from recognizing which terms are important in a given situation, how various coefficients weight one dissipation mechanism over another.

You can balance any state variable really, and classical thermodynamics tells us that entropy is one such state variable. You can derive a rate of entropy production from these balance equations too. And if you start to look beyond phenomenology, entropy probably has the most generic definition of all of them. Statistical thermodynamics quantitatively links entropy to the number of available states that a thing can occupy. And the thing can be anything really: an atom, a person, a datum...once you get into informational theories of entropy, you've deviated far from being an engineer. Zoom into the molecular world of statistical thermo, and it becomes apparent how molecular ensembles can dictate macroscopic properties; that can be useful in explaining and predicting phenomenological effects. Small systems such as cell structures or micromechanical devices or surfaces or local chemical environments can be thought to occupy some middle ground. [And really, things have occupied middle grounds for hundreds of years--the microscopic aspects of chemical and electrical processes have always been essential to their understanding and application.]
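That statistical link is usually written in Boltzmann's form, with the Gibbs generalization that the information-theory folks run with:

```latex
S = k_B \ln \Omega,
\qquad
S = -\,k_B \sum_i p_i \ln p_i
```

where $\Omega$ is the number of accessible microstates and $p_i$ is the probability of microstate $i$. If every microstate is equally likely ($p_i = 1/\Omega$), the second expression collapses to the first. Drop the $k_B$ and switch to base-2 logarithms and you've got Shannon's information entropy, which is where the datum sneaks in alongside the atom.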

The second law of thermodynamics states that the entropy of a closed system increases toward a maximum. When you start talking about the rate of entropy production, how fast it increases, you've entered the field of non-equilibrium thermodynamics. For steady systems near equilibrium, non-equilibrium thermo suggests various relationships that become useful. At the statistical mechanical level, entropy production must be considered probabilistic, and some weird stuff indeed arises when entropy decreases for individual events in the ensemble.
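For what it's worth, those near-equilibrium relationships take a standard form: the entropy production rate is a sum of flux-force products, the fluxes are taken to be linear in the forces, and Onsager's reciprocal relations tie the cross-coefficients together. Schematically:

```latex
\sigma = \sum_i J_i X_i \;\ge\; 0,
\qquad
J_i = \sum_k L_{ik} X_k,
\qquad
L_{ik} = L_{ki}
```

where the $J_i$ are fluxes (heat, mass, charge), the $X_i$ are their conjugate forces (the gradients), and the $L_{ik}$ are the coupling coefficients. The weird probabilistic stuff is captured by fluctuation theorems, which roughly say

```latex
\frac{P(\Sigma_t = +A)}{P(\Sigma_t = -A)} = e^{A}
```

for the dimensionless entropy $\Sigma_t$ produced over a time $t$: trajectories that consume entropy do occur, but they're exponentially rare, and they wash out entirely in large systems over long times.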

Take that magic camera now and zoom back out, way out...farther...OK. Just like phenomenology has a habit of washing out microscopic events, enormous systems-of-systems can be approachable if you wash out some of the details of the usual phenomenology. Interestingly enough, non-equilibrium thermo appears to become useful again at this distant level. I've read some articles recently on the hypothesis of maximum entropy production for complex systems such as weather systems, ecosystems, or economies. The hypothesis appears to be completely baseless, but it has nonetheless been successful in describing some complex phenomena, and suggests that the recurrence of certain structures in complex systems is because they dissipate the great big energy gradients most efficiently. I hate the examples that these authors have used: phenomenological models describe waves and vortices and so on well enough, but to suggest that they can be interpreted as optimum dissipative structures is, well, interesting.

The question arises: is human evolution (genetic or social, take your pick) merely the most effective path that has so far developed to dissipate that huge solar energy gradient? You can see the danger of drawing out the cranks here, but just because speculation is silly and intellectually dangerous doesn't mean that it can't be fun.

People aren't at equilibrium (certainly not when they're alive), and neither is society. With the constant application of solar energy, we maintain a roughly steady-state non-equilibrium condition. You can irresponsibly ask thermodynamics the same questions that philosophers have struggled with for millennia. Is human nature a development that facilitates entropy production (with occasional fluctuations toward entropy absorption that wash out in the time average), and does human organization facilitate it? Is the nature of our ensemble developing over time, and do we waste heat better if we are brutal or if we are constructive? If we're happy or sad? Can it get better than this? Has it ever been, really?

Keifus

Some public access reading, if you just can't get enough:

  • Maximum entropy production vs. Darwinism (science reporting, easy to read)
  • Entropy production and life as we know it (accessible, but I found it tedious)
  • Hey man, quantum mechanics isn't dissipative. Where's the entropy? (big words and a little math, read 1/3 of it and didn't seem bad though.)
  • Fluctuation theory for small systems (from Physics Today)

7 comments:

    Artemesia said...

    Keifus..
    What a great post this is!
    I will be taking notes. Your definitions and connections have opened many different ways of looking at what we call the natural world..ourselves in the midst of it ticking away.

    Especially those of us who live in closed systems (LOL). ..And those who want to preserve closed, insulated systems are the ones who create the suicide bombers..change will (blast)out!

    On NOVA last night, an interesting fact. Comets often pass by, through, near our solar system. If it wasn't for Jupiter, our gassy, jolly giant..we would have been (earth) out of business many times over. Jupiter acts as a gravitational puller in to these comets. It stops them in their tracks with its powerful gravitational field. Amazing..
    A

    Keifus said...

    Hmmm, a closed system like... (I have some probability of feeling bad if I open it.)

    Glad you liked it. I'm not sure how convinced I am that the nonequilibrium arguments are particularly useful for those large systems, but it's definitely an interesting way of looking at things. (It's probably obvious that I'm more grounded in classical thermo approaches.)

    K (I don't have good journal resources at work...it's so wrong.)

    LentenStuffe said...

    Surely you know I'm too thick to understand any of this?

    C.P.Snow, eat your heart out ... the two cultures have met.

    hipparchia said...

    i knew this was gonna turn out to be all my fault.

    gzpfulxz: going to the zoo to pack up some fucking leptons

    Keifus said...

    Hey John, I don't know if I'd be overly impressed. Sometimes this stuff is hard to understand only because the presentation is baffling. And I'm all over the map on that one.

    I've often thought that engineering and science types get a bad rap for their writing skills. On the other hand, as I read more bits of poetry (thanks in part to you) and classics here and there, that humanities world looks farther away than ever.

    K (Fucking leptons! Better them than the bozons.)

    Archaeopteryx said...

    Mixing philosophy and physics makes my head hurt.

    Keifus said...

    Yeah, talk about your unholy alliances.