Friday, March 22, 2024

Entropy

Entropy (pronounced en-truh-pee)

(1) In thermodynamics, the capacity factor for thermal energy that is hidden with respect to temperature; an expression of the dispersal of energy; a measure of how much energy is spread out in a process, or how widely spread out it becomes, at a specific temperature.

(2) In thermodynamics (on a macroscopic scale), a function of thermodynamic variables, such as temperature, pressure, or composition, that is a measure of the energy not available for work during a thermodynamic process (a closed system evolves toward a state of maximum entropy).

(3) In statistical mechanics, a measure of the randomness of the microscopic constituents of a thermodynamic system (symbol: S).  Technically, a statistical measure of the disorder of a closed system, expressed by S = k log P + c, where P is the probability that a particular state of the system exists, k is the Boltzmann constant, and c is another constant.  It is expressed in joules per kelvin.

(4) In data transmission and information theory, a measure of the efficiency of a system (such as a code or a language) in transmitting information; also, a measure of the loss of information in a transmitted signal or message.

(5) In cosmology, a hypothetical tendency for the universe to attain a state of maximum homogeneity in which all matter is at a uniform temperature (heat death).

(6) In political science, a doctrine of inevitable social decline and degeneration; the tendency of a system that is left to itself to descend into chaos (this definition is widely used, literally and figuratively, in many fields).

(7) In modeling theory and applied modeling, a lack of pattern or organization; a state of marked disorder; a measure of the disorder present in a system.
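The statistical-mechanics formula in sense (3) can be illustrated numerically.  The sketch below uses the better-known Boltzmann form S = k ln W (where W is the number of equally probable microstates, a special case of the probabilistic form given above); the function name and the example figures are illustrative only.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact by the 2019 SI definition)

def boltzmann_entropy(microstates: int) -> float:
    """Entropy in J/K of a system with the given number of
    equally probable microstates: S = k * ln(W)."""
    return K_B * math.log(microstates)

# Doubling the number of accessible microstates adds exactly k*ln(2)
# of entropy, regardless of the starting count.
delta = boltzmann_entropy(2_000_000) - boltzmann_entropy(1_000_000)
print(delta)  # ≈ 9.57e-24 J/K
```

Because entropy depends on the logarithm of the microstate count, multiplying the count by any factor adds a fixed amount of entropy, which is why the quantity is additive across independent systems.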
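The information-theory sense (4) is usually quantified as Shannon entropy, the average number of bits needed per symbol of a message.  A minimal sketch (the function name is illustrative):

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average information content in bits per symbol:
    H = -sum(p_i * log2(p_i)) over the symbol frequencies p_i."""
    n = len(message)
    counts = Counter(message)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("aaaa"))  # 0.0 — a constant signal carries no information
print(shannon_entropy("abab"))  # 1.0 — two equally likely symbols: one bit each
print(shannon_entropy("abcd"))  # 2.0 — four equally likely symbols: two bits each
```

A perfectly predictable message has zero entropy; the more uniform the symbol distribution, the higher the entropy and the less the message can be compressed.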

1867: From the German Entropie, coined in 1865 by the German physicist and mathematician Rudolf Clausius (1822–1888) by analogy with Energie (energy), replacing the root of the Ancient Greek ἔργον (érgon) (work) with the Ancient Greek τροπή (tropḗ) (transformation).  The Ancient Greek ἐντροπία (entropía) (a turning towards) is constructed as en (in) + tropḗ (a turning, a transformation), from the primitive Indo-European trep- (to turn).  Clausius had for years been working on his theories before he coined the word Entropie to describe what he had been calling "the transformational content of the body."  The new word encapsulated the second law of thermodynamics as "the entropy of the universe tends toward a maximum," but Clausius thought the concept better illustrated by the mysterious disgregation (a series of equations explaining dissolution at the particle level), another of his coinings which never caught on in the same way.  Entropy & entropology are nouns, entropic is an adjective and entropically is an adverb; the noun plural is entropies.  The synonym entropia is an internationalism rarely used in English.

Entropy describes uncertainty or disorder in a system and, in casual use, refers to degradation or disorder in any situation, or to chaos, disorganization, or randomness in general.  In a technical sense, it is the gradual dissipation of usable energy in the universe and is an important part of several theories which postulate how the universe will end.  The laws of thermodynamics describe the relationships between thermal energy, or heat, and other forms of energy, and how energy affects matter.  The First Law of Thermodynamics states that energy cannot be created or destroyed; the total quantity of energy in the universe stays the same.  The Second Law of Thermodynamics is about the quality of energy.  It states that as energy is transferred or transformed, more and more of it is wasted.  The second law also states that there is a natural tendency of any isolated system to degenerate into a more disordered state; at a microscopic level, if a system is isolated, any natural process in that system progresses in the direction of increasing disorder, or entropy, of the system.  The second law also predicts the end of the universe, implying the universe will end when everything reaches the same temperature.  This is the ultimate level of entropy; if everything is the same temperature, nothing can happen and energy can manifest only as the random motion of atoms and molecules.  With no change possible, time, in any meaningful sense, would then stop.
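The second law's one-way direction can be shown with Clausius's own measure: when heat Q flows from a hot reservoir to a cold one, the total entropy change is Q/T_cold − Q/T_hot, which is always positive when heat flows in the natural direction.  A minimal sketch (the function name and the figures are illustrative):

```python
def entropy_change(q_joules: float, t_hot: float, t_cold: float) -> float:
    """Net entropy change (J/K) of two reservoirs when q_joules of heat
    flows from the one at t_hot to the one at t_cold (kelvin):
    dS = q/T_cold - q/T_hot."""
    return q_joules / t_cold - q_joules / t_hot

# 1000 J flowing from a 400 K reservoir to a 300 K reservoir:
print(entropy_change(1000.0, 400.0, 300.0))  # ≈ +0.833 J/K — entropy rises
```

The result is positive whenever t_hot > t_cold, and shrinks toward zero as the two temperatures converge, which is the arithmetic behind the "heat death" idea: once everything is at one temperature, no further entropy-producing flow is possible.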

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

The term entropology is a portmanteau word (the construct of the blend being entrop(y) + (anthrop)ology) coined in 1955 by the French anthropologist Claude Lévi-Strauss (1908–2009), whose theories and models even today continue to underpin some of the framework of structural anthropology, the debt to him acknowledged by structuralists in many fields; apart from all else, in the social sciences, words like entropology are much admired.  It first appeared in his book Tristes Tropiques (Sad Tropics (1955)), a text itself structurally interesting, being in part travelogue, research paper and memoir, interspersed with philosophical musings on music, literature, history, architecture and sociology; these days it'd be called post-modern.  The essence of entropology is that the transformative path of human cultures (the sometimes separate, sometimes parallel notion of "civilization" seemed not to trouble Lévi-Strauss) is inherently corrosive & disruptive.  It seemed a grim thesis but it must be admitted that by 1955, there was plenty of evidence to support his view.

A probably inaccurate representation of nothing.

The idea of nothing, in a universal sense in which literally nothing (energy, matter, space or time) exists, is difficult to imagine, presumably imaginable only as infinite blackness, although even that would seem to imply the spatial.  That nothingness is perhaps impossible to imagine or visualize doesn't, however, prove it's impossible, but the mere fact that matter, energy and time now exist in space does imply that because, were there ever nothing, it's a challenge to explain how anything could have, from nothing, come into existence.  Despite that, it would be interesting if cosmologists could attempt to describe the mathematics of a model setting out what conditions would have to prevail in order for there truly to be nothing.  That may or may not be possible but might be an interesting basis from which to work for those trying to explain things like dark matter & dark energy, either or both of which also may or may not exist.  Working with the existing universe seems not to be helpful in developing theories about the nature of all this supposedly missing (or invisible) matter and energy; were one, instead of working backwards as it were, to start with nothing and then work out how to add what seems to be missing (while it remains not visible), the result might be interesting.
