
Wednesday, November 20, 2024

Pisteology

Pisteology (pronounced pi-stol-uh-jee)

(1) In theology, the branch dealing with the place and authority of faith.

(2) In philosophy, a theory or science of faith.

Circa 1870s: From the German Pisteologie, the construct being the Ancient Greek πίστις (pístis) (faith) (akin to peíthein (to persuade)) + -eo- + -logie.  The English form is thus understood as píst(is) + -e- + -ology.  The Ancient Greek noun πίστις (pístis) (faith) was from the primitive Indo-European bheydhtis, the construct being πείθω (peíthō) (I persuade) +‎ -τις (-tis); πεῖσῐς (peîsis) was the later formation.  Although in English constructions it’s used as “faith” (in the theological sense), in the original Greek it could impart (1) trust in others, (2) a belief in a higher power, (3) the state of being persuaded of something: belief, confidence, assurance, (4) trust in a commercial sense (credit worthiness), (5) faithfulness, honesty, trustworthiness, fidelity, (6) that which gives assurance: treaty, oath, guarantee, (7) means of persuasion: argument, proof and (8) that which is entrusted.  The suffix -ology was formed from -o- (as an interconsonantal vowel) +‎ -logy.  The origin in English of the -logy suffix lies with loanwords from the Ancient Greek, usually via Latin and French, where the suffix (-λογία) is an integral part of the word loaned (eg astrology from astrologia) since the sixteenth century.  French picked up -logie from the Latin -logia, from the Ancient Greek -λογία (-logía).  Within Greek, the suffix is an -ία (-ía) abstract from λόγος (lógos) (account, explanation, narrative), that in turn a verbal noun from λέγω (légō) (I say, speak, converse, tell a story).  In English the suffix became extraordinarily productive, used notably to form names of sciences or disciplines of study, analogous to the names traditionally borrowed from the Latin (eg astrology from astrologia; geology from geologia) and by the late eighteenth century, the practice (despite the disapproval of the pedants) extended to terms with no connection to Greek or Latin such as those building on French or German bases (eg insectology (1766) after the French insectologie; terminology (1801) after the German Terminologie).  Within a few decades of the intrusion of modern languages, combinations emerged using English terms (eg undergroundology (1820); hatology (1837)).  In this evolution, the development may be thought similar to the latter-day proliferation of “-isms” (fascism; feminism et al).  The alternative spellings are pistology & pistiology.  Pisteology is a noun and pisteological is an adjective; the noun plural is pisteologies.

The early use of pisteology was in the context of theology and it appears in an 1880 essay on the matter of faith by the Congregational minister Alfred Cave (1847–1900).  The Oxford English Dictionary (OED) referred to the word as exclusively theological but in later editions noted it was also used to mean “a theory or science of faith”, reflecting its adoption in academic philosophy, although the embrace must have been tentative because pisteology was (and remains) “rare”, listed as such by those lexicographers who give it a mention.  What is clear is that it seems never to have been cross-cultural, remaining implicitly a thing of Christendom.  In a sense, it’s surprising it hasn’t appeared more, especially in the troubled twentieth century when matters of “faith and doubt” were questioned and explored in a flurry of published works.  Perhaps it was a division of academic responsibility, the devoted studying belief and the scholars the institution, the pragmatic settling for the Vatican’s (unofficial) fudge: “You don’t have to believe it but you must accept it.”

Pondering cross-cultural pisteology: Lindsay Lohan carrying the Holy Qur'an (Koran), Brooklyn, New York, May 2015.

While clearly the universities got involved and the intersection between pisteology and epistemology (the study of knowledge and belief) does seem obvious to the point where the former might be thought a fork of the latter, its roots and concerns remained theological and Christian, exploring how faith functions in religious traditions, doctrines and human understanding of the divine, and many famous thinkers have written works which may be thought pisteological landmarks.  Saint Augustine of Hippo (354–430) wrote so widely it’s probably possible to find something which tracks the path of some direction in Christianity but underlying it all was his famous admission: “I believe in order to understand”, more than a subtle hint that faith is a prerequisite for true comprehension of divine truth.  Saint Thomas Aquinas (1225–1274) lived 800-odd years later and was better acquainted with the philosophers of the Classical age.  Aquinas is sometimes said to have “integrated” Aristotelian philosophy with Christian theology and while this is misleading, he understood the spirit of reasoning from Antiquity was compelling and, in a way that’s influential still, he argued faith and reason complement each other, defining faith as a virtue by which the intellect assents to divine truth under the influence of the will.  A central figure in Reformed theology, John Calvin (1509–1564) explored faith extensively in his Institutes of the Christian Religion, describing faith as a firm and certain knowledge of God's benevolence toward us, founded on the promise of the gospel and revealed by the Holy Spirit.  Martin Luther (1483–1546) probably thought this not so much a fudge as a needless layer, arguing that it was faith alone (rather than a virtuous life of good works) by which one would on judgement day be judged.  Faith then was the cornerstone of salvation in his doctrine of sola fide (faith alone), a rigor which would have pleased Calvin.  The philosopher Søren Kierkegaard (1813–1855) was not a theologian but his writings had an influence on theological thought and, in a nod to Aquinas, he highlighted the paradox of faith and what he called the “leap of faith” as essential to authentic religious life; although he never explicitly discussed the “You don’t have to believe it but you must accept it” school of thought, it does seem implicit in his paradox.

For the bedside table: Karl Barth’s Kirchliche Dogmatik.

Friedrich Schleiermacher (1768–1834) is often styled “the father of modern liberal theology” and to him faith was an experiential relationship with the divine, rooted in a “feeling of absolute dependence”.  More conservative theologians didn’t much object to that notion but they probably thought of him something in the vein of what William Shakespeare (1564–1616) in Julius Caesar (1599) had Caesar say of Cassius: “He thinks too much: such men are dangerous.”  John Henry Newman (1801–1890) was one of those conservatives (albeit something of a convert to the cause who had a strange path to Rome) and he wrote much about the development of doctrine and the role of faith in understanding divine truth but it was the Swiss Protestant theologian Karl Barth (1886-1968) whose Kirchliche Dogmatik (Church Dogmatics), in English translation a fourteen-volume work of some six million words published between 1932 and 1967, that appeared the modern world’s most ambitious attempt to recover the proclamation of the word of God as the place where God's message of salvation meets sinful man: faith as an act of trust and obedience to God's self-revelation.  Barth’s contribution to pisteology was a rejection of natural theology, emphasizing faith as a response to God's revelation in Jesus Christ; it wasn’t exactly Martin Luther without the anti-Semitism but the little monk’s ghost does loom over those fourteen volumes.  Pius XII (1876-1958; pope 1939-1958), a fair judge of such things, thought Barth the most important theologian since Aquinas.

Barth though was a formalist, writing for other theologians who breathed rarefied intellectual air, and he didn’t make pisteology easy or accessible; although Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945) claimed to have read all fourteen volumes while serving the twenty-year sentence (he was lucky to receive) for war crimes and crimes against humanity (he had more time than most to devote to the task), he did acknowledge the conceptual and textual difficulties.  Barth seems not to have done much for Speer’s faith in God but, being Speer, he took from the six million words what suited him and decided he was atoning for his sins: “There is much that I still cannot comprehend, chiefly because of the terminology and the subject.  But I have had a curious experience.  The uncomprehended passages exert a tranquilizing effect.  With Barth's help I feel in balance and actually, in spite of all that's oppressive, as if liberated.”  Speer continued: “I owe to Barth the insight that man’s responsibility is not relieved just because evil is part of his nature. Man is by nature evil and nevertheless responsible.  It seems to me there is a kind of complement to that idea in Plato’s statement that for a man who has committed a wrong ‘there is only one salvation: punishment.’  Plato continues: ‘Therefore it is better for him to suffer this punishment than to escape it; for it sustains man’s inward being.’”

For those who want to explore Christocentric pisteology, Barth’s Kirchliche Dogmatik really isn’t a good place to start because his texts are difficult and that’s not a consequence of the English translation; those who have read the original in German make the same point.  Nor will those tempted by his reputation to try one of his shorter works be likely to find an easier path because his style was always one of dense prose littered with words obscure in meaning to all but those who had spent time in divinity departments.  When writing of German Lutheran theologian Isaak August Dorner (1809–1884) in Protestant Theology in the Nineteenth Century (1946) he wrote: “The assertion of a receptivity in man, the Catholic-type conception of the gratia preveniens which runs alongside this receptivity, the mystical culmination of this pisteology, are all elements of a speculative basic approach which can even be seen here, in Dorner.”  Is it any wonder some might confuse pisteology with piscatology (the study of fishing)?

Sunday, November 17, 2024

Now

Now (pronounced nou)

(1) At the present time or moment (literally a point in time).

(2) Without further delay; immediately; at once; at this time or juncture in some period under consideration or in some course of proceedings described.

(3) As “just now”, a time or moment in the immediate past (historically it existed as the now obsolete “but now” (very recently; not long ago; up to the present)).

(4) Under the present or existing circumstances; as matters stand.

(5) Up-to-the-minute; fashionable, encompassing the latest ideas, fads or fashions (the “now look”, the “now generation” etc).

(6) In law, as “now wife”, the wife at the time a will is written (used to prevent any inheritance from being transferred to a person of a future marriage) (archaic).

(7) In phenomenology, a particular instant in time, as perceived at that instant.

Pre 900: From the Middle English now, nou & nu, from the Old English nu (at the present time, at this moment, immediately), from the Proto-West Germanic nu, from the Proto-Germanic nu, from the primitive Indo-European nu (now) and cognate with the Old Norse nu, the Dutch nu, the German nun, the Old Frisian nu and the Gothic nu.  It was the source also of the Sanskrit and Avestan nu, the Old Persian nuram, the Hittite nuwa, the Greek nu & nun, the Latin nunc, the Old Church Slavonic nyne, the Lithuanian nū and the Old Irish nu-.  The original senses may have been akin to “newly, recently” and it was related to the root of new.  Since Old English it has been often merely emphatic, without any temporal sense (as in the emphatic use of “now then”, though that phrase originally meant “at the present time”, and also (by the early thirteenth century) “at once”.  In the early Middle English it often was written as one word.  The familiar use as a noun (the present time) emerged in the late fourteenth century while the adjective meaning “up to date” is listed by etymologists as a “mid 1960s revival” on the basis the word was used as an adjective with the sense of “current” between the late fourteenth and early nineteenth centuries.  The phrase “now and then” (occasionally; at one time and another) was in use by the mid 1400s, “now or never” having been in use since the early thirteenth century.  “Now” is widely used in idiomatic forms and as a conjunction & interjection.  Now is a noun, adjective & adverb; nowism, nowness & nowist are nouns; the noun plural is nows.

Right here, right now: Acid House remix of Greta Thunberg’s (b 2003) How dare you? speech by Theo Rio.

“Now” is one of the more widely used words in English and is understood to mean “at the present time or moment (literally a point in time)”.  However, it’s often used in a way which means something else: Were one to say “I’ll do it now”, in the narrow technical sense that really means “I’ll do it in the near future”.  Even things which are treated as happening “now” really aren’t, such as seeing something.  Because light travels at a finite speed, it takes time for it to bounce from something to one’s eye so just about anything one sees is an exercise in looking back to the past.  Even when reading something on a screen or page one’s brain is processing something from a nanosecond (about one billionth of a second) earlier.  For most purposes, “now” is but a convincing (and convenient) illusion and even though, in a certain, special sense, everything in the universe is happening at the same time (now), it’s not something that can ever be experienced because of the implications of relativity.  None of this causes many problems in life but among certain physicists and philosophers, there is a dispute about “now” and there are essentially three factions: (1) that “now” happened only once in the history of the known universe and cannot again exist until the universe ends, (2) that only “now” can exist and (3) that “now” cannot ever exist.
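To put numbers on how small these lags are, here is a minimal back-of-the-envelope sketch (in Python; the distances chosen are illustrative assumptions, not measurements):

```python
# Light-travel delay t = d / c; the distances below are illustrative assumptions.
C = 299_792_458  # speed of light in a vacuum, m/s

def light_delay_ns(distance_m: float) -> float:
    """Nanoseconds for light to cover a given distance."""
    return distance_m / C * 1e9

for label, d in [("screen at arm's length", 0.5),
                 ("across a room", 5.0),
                 ("Moon to Earth", 384_400_000.0)]:
    print(f"{label}: {light_delay_ns(d):,.1f} ns")
```

On these assumptions a screen half a metre away is seen roughly 1.7 nanoseconds in the past, which is why “now” works perfectly well as an everyday approximation.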

Does now exist? (2013), oil & acrylic on canvas by Fiona Rae (b 1963) on MutualArt.

The notion that “now” can have happened only once in the history of our universe (and according to the cosmological theorists variously there may be many universes (some which used to exist, some extant and some yet to be created) or our universe may now be in one of its many phases, each of which will start and end with a unique “now”) is tied up with the nature of time, the mechanism upon which “now” depends not merely for definition but also for existence.  That faction deals with what is essentially an intellectual exercise whereas the other two operate where physics and linguistics intersect.  Within the faction which says "now can never exist" there is a sub-faction which holds that to say “now” cannot exist is a bit of a fudge in that it’s not that “now” never happens but only that it can only ever be described as a particular form of “imaginary time”; an address in space-time in the past or future.  The purists however are absolutists and their proposition is tied up in the nature of infinity, something which renders it impossible ever exactly to define “now” because endlessly the decimal point can move so that “now” can only ever be tended towards and never attained.  If pushed, all they will concede is that “now” can be approximated for purposes of description but that’s not good enough: there is no now.

nower than now!: Lindsay Lohan on the cover of i-D magazine No.269, September, 2006.

The “only now can exist” faction find tiresome the proposition that “the moment we identify something as happening now, already it has passed”, making the point that “now” is the constant state of existence and that a mechanism like time is only a thing of administrative convenience.  The faction is most associated with the schools of presentism and phenomenology and argues only the present moment (now) is “real” and that any other fragment of time can only be described, the past existing only in memory and the future only as anticipation or imagination; “now” is the sole verifiable reality.  They are interested especially in what they call “change & becoming”, making the point the very notion of change demands a “now”: events happen and things become in the present; without a “now”, change and causality are unintelligible.  The debate between the factions hinges often on differing interpretations of time: whether fundamentally it is subjective or objective, continuous or discrete, dynamic or static.  Linguistically and practically, “now” remains central to the human experience but whether it corresponds to an independent metaphysical reality remains contested.

Unlike philosophers, cosmologists probably don’t much dwell on the nature of “now” because they have the “Andromeda paradox”, one of the consequences of Albert Einstein’s (1879-1955) theory of special relativity.  What the paradox does is illustrate the way “now” is relative and differs for observers moving at different speeds, the effect increasing as distances increase, such as when the point of reference is the Andromeda galaxy, some 2½ million light years distant from Earth.  Under special relativity, what one observer perceives as “now” on Andromeda, another, moving at a different relative speed, will perceive as occurring in the past or future.  This can happen at any distance but, outside of computer simulations or laboratories, the effect of relative simultaneity is noticeable (even for relatively slow speeds) only at distance.
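The size of the effect can be estimated with the standard relativity-of-simultaneity relation Δt = vd/c².  A minimal sketch (in Python), assuming a walking pace and a highway driving speed together with the approximate 2½ million light-year distance mentioned above:

```python
# Relativity of simultaneity: delta_t = v * d / c**2
# The speeds are assumed for illustration; d approximates the Earth-Andromeda distance.
C = 299_792_458         # speed of light, m/s
LIGHT_YEAR = 9.4607e15  # metres in one light year
d = 2.5e6 * LIGHT_YEAR  # ~2.5 million light years

for v in (1.4, 30.0):   # walking pace; highway driving speed (m/s)
    delta_t_days = v * d / C**2 / 86_400
    print(f"v = {v} m/s -> 'now' on Andromeda shifts by ~{delta_t_days:.0f} days")
```

On these assumptions, merely walking shifts one’s “now” on Andromeda by several days (and driving by months), which is why the paradox is so often used to illustrate that simultaneity is frame-dependent.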

Seated vis-à-vis (literally "face to face"), Lindsay Lohan (b 1986, right) and her sister Aliana (b 1993, left), enjoying a tête-à-tête (literally "head to head"), La Conversation bakery & café, West Hollywood, California, April 2012.  Sadly, La Conversation is now closed.

Among the implications of the Andromeda paradox is that although the sisters would have thought their discussion something in the "here and now", to a cosmologist they are looking at each other as they used to be and hearing what each said some time in the past, every slight movement affecting the extent of this.  Because, in a sense, everything in the universe is happening "at the same time", the pair could have been sitting light years apart and spoken what they spoke "at the same time" but because of the speed at which light and sound travel, it's only within a certain distance that a "practical" shared "now" becomes possible.

Saturday, October 12, 2024

Ekpyrosis

Ekpyrosis (pronounced eck-pyh-row-sys)

(1) In modern cosmology, a speculative theory proposing the known universe originated in the collision of two other three-dimensional universes traveling in a hidden fourth dimension. This scenario does not require a singularity at the moment of the Big Bang.

(2) In the philosophy of the Stoic school in Antiquity, the idea that all existence is cyclical in nature and the universe is the result of a recurring conflagration in which all is destroyed and reborn in the same process.

1590s (in English): From the Ancient Greek ἐκπύρωσις (ekpúrōsis) (conflagration, cyclically recurring conflagration in which the universe is destroyed and reborn according to some factions in Stoic philosophy), the construct being the Ancient Greek ἐκ (ek) (out of; from) + πύρωσις (pyrōsis), from πῦρ (pyr) (fire) + -ōsis (the suffix).  While there’s no direct relationship between the modern “big bang theory” and the Stoics’ notion of periodic cosmic conflagration (the idea the universe is periodically destroyed by fire and then recreated), the conceptual similarity is obvious.  The Stoic philosophy reflected the general Greek (and indeed Roman) view of fire representing both destruction and renewal.  In English, ekpyrosis first appeared in late sixteenth century translations or descriptions of ancient Stoic philosophy, particularly in relation to their cosmological theories and it came to be used either as the Stoics applied it or in some analogous way.  It was one of a number of words which during the Renaissance came to the attention of scholars in the West, a period which saw a revival of interest in ancient Greek and Roman thought, art & architecture and for centuries many of the somewhat idealized descriptions and visions of the epoch were those constructed (sometimes rather imaginatively) during the Renaissance.  The alternative spelling was ecpyrosis.  Ekpyrosis is a noun and ekpyrotic is an adjective; the noun plural is ekpyroses.

In Stoic philosophy, ekpyrosis was described sometimes as a recurring, unitary process (the periodic destruction & rebirth of the universe in a single conflagration) and sometimes as the final stage of one existence (destruction) which was the source of a palingenesis (the subsequent rebirth).  Palingenesis was almost certainly a variant of palingenesia (rebirth; regeneration) with the appending of the suffix -genesis (used to suggest “origin; production”).  Palingenesia was a learned borrowing from the Late Latin palingenesia (rebirth; regeneration), from the Koine Greek παλιγγενεσία (palingenesía) (rebirth), the construct being the Ancient Greek πᾰ́λῐν (pálin) (again, anew, once more), ultimately from the primitive Indo-European kwel (to turn (end-over-end); to revolve around; to dwell; a sojourn) + γένεσις (genesis) (creation; manner of birth; origin, source).  The construct of the suffix was from the primitive Indo-European ǵenh- (to beget; to give birth; to produce) + -ῐ́ᾱ (-íā) (the suffix used to form feminine abstract nouns).

Lindsay Lohan and her lawyer in court, Los Angeles, December, 2011.

In biology, the word was in the nineteenth century adopted to describe “an apparent repetition, during the development of a single embryo, of changes that occurred previously in the evolution of its species” and came directly from the German Palingenesis (the first papers published in Berlin).  In geology & vulcanology, it was used to mean “regeneration of magma by the melting of metamorphic rocks” and came from the Swedish palingenes (which, like the German, came from the Greek).  In the study of history, palingenesis could be used to describe (often rather loosely) the recurrence of historical events in the same order, the implication being that was the natural pattern of history which would emerge if assessed over a sufficiently long time.  When such things used to be part of respectable philosophy, it was used to mean “a spiritual rebirth through the transmigration of the soul”, a notion which exists in some theological traditions and it has an inevitable attraction for the new-age set.

The Death of Seneca (1773), oil on canvas by Jacques-Louis David (1748–1825), Petit Palais, Musée Des Beaux-Arts, De La Ville De Paris, France.  Lucius Annaeus Seneca (Seneca the Younger, (circa 4 BC–65 AD)) was one of the best known of the Roman Stoics and the painting is a classic example of the modern understanding of stoicism, Seneca calmly accepting being compelled to commit suicide, condemned after being implicated in a conspiracy to assassinate Nero (37-68; Roman emperor 54-68).  The consensus among historians seems to be Seneca was likely “aware of but not involved in” the plot (a la a number of the Third Reich's generals & field marshals who preferred to await the outcome of the July 1944 plot to assassinate Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) before committing themselves to the cause).  There are many paintings depicting the death of Seneca, most showing him affecting the same air of “resigned acceptance” to his fate.

The Stoics were a group of philosophers whose school of thought was for centuries among the most influential in Antiquity.  Although the word “stoic” is now most often used to refer to someone indifferent to pleasure or pain and who is able gracefully to handle the vicissitudes of life, that’s as misleading as suggesting the Ancient Epicureans were interested only in feasting.  What Stoicism emphasized was living a virtuous life, humans like any part of the universe being created and governed by Logos and thus it was essential at all times to remain in harmony with the universe.  Interestingly, although the notion of ekpyrosis was one of the distinctive tenets of the school, there was a Stoic faction which thought devoting much thought to such matters was something of a waste of energy and that they should devote themselves to the best way to live, harmony with Logos the key to avoiding suffering.  Their ideas live on in notions like “virtue is its own reward”, virtue ultimately more rewarding than indulgence or worldly goods which are mere transitory vanities.

While the speculative theory of an ekpyrotic universe in modern cosmology and the ancient Stoic idea of ekpyrosis both revolve around a cyclical process of destruction and renewal, they differ significantly in detail and in the phenomena they describe.  Most significantly, in modern cosmology there’s no conception of this having an underlying motivation, something of great matter in Antiquity.  The modern theory is an alternative to what is now the orthodoxy of the Big Bang theory; it contends the universe did not with a “big bang” (originally a term of derision but later adopted by all) begin from a singular point of infinite density but rather emerged from the collision of two large, parallel branes (membranes) in higher-dimensional space.  In the mysterious brane cosmology, the universe is imagined as a three-dimensional “brane” within a higher-dimensional space (which tends to be called the “bulk”).  It’s the great, cataclysmic collision of two branes which triggers each defining event in the endless cycle of cosmic evolution.  In common with the Stoics, the process is described as cyclical and after each collision, the universe undergoes a long period of contraction, followed by another collision that causes a new expansion.  Thus, elements are shared with the “Big Bang” & “Big Crunch” cycles but the critical variations are (1) there’s no conception of a singularity and (2) although this isn’t entirely clear according to some, time never actually has to “begin”, which critics have called a bit of a “fudge” because it avoids the implications of physical laws breaking down (inherent in the Big Bang’s singularity) and assumes cosmic events occur smoothly (in the sense of physics rather than violence) during brane collisions.

Bust of Marcus Aurelius (121–180; Roman emperor 161-180), Musée Saint-Raymond, Toulouse, France.

Something in the vein of the “philosopher kings” many imagine they’d like to live under (until finding the actual experience less pleasant than they’d hoped), Marcus Aurelius was a Stoic philosopher who has always been admired for his brevity of expression, the stoic world-view encapsulated in phrases such as “Waste no more time arguing about what a good man should be.  Be one.”, “The happiness of your life depends upon the quality of your thoughts.” and “Our life is what our thoughts make it.”  Marcus Aurelius was the last emperor of the Pax Romana (Roman peace, 27 BC-180 AD), a golden age of Roman imperial power and prosperity.

To the Stoics of Antiquity, ekpyrosis described the periodic destruction of the universe by a great cosmic fire, followed by its rebirth, fire in the Classical epoch a common symbol both of destruction and creation; the Stoic universe was a deterministic place.  In the metaphysics of the ancients, the notion of fire as the central event was not unreasonable because people for millennia had been watching conflagrations which seemed so destructive yet after which life emerged, endured and flourished; the idea was the conflagration which wrote finis to all was the same primordial fire from which all that was new would be born.  More to the point however, it would be re-born, the Stoics’ idea always that the universe would re-emerge exactly as it had been before.  The notion of eternal recurrence doesn’t actually depend on the new being the same as the old but clearly, the Greeks liked things the way they were and didn’t want anything to change.  That too was deterministic because it was Logos which didn’t want anything to change.  The Stoics knew all that had been, all that is and all that would be were governed by Logos (rational principle or divine reason) and it was this which ensured the balance, order and harmony of the universe, destruction and re-birth just parts of that.  Logos had motivation and that was to maintain the rational, natural order but in modern cosmology there’s no motivation in the laws of physics, stuff just happens by virtue of their operation.

Tuesday, September 17, 2024

Sin-eater

Sin-eater (pronounced sin-ee-ter or sin-ee-tah)

(1) An individual (in the historic texts usually a man) who, by the act of eating a piece of bread laid upon the breast of the corpse (although in many depictions the food is placed on the lid of the coffin (casket)), absorbs the sins of a deceased, enabling them to “enter the kingdom of heaven”.

(2) Figuratively, as a thematic device in literature, a way to represent themes of guilt, atonement, sacrifice and societal exclusion (used variously to explore the moral complexities inherent in assuming the sins (or guilt) of another, the act of mercy and the implications of personal damnation).

Late 1600s (although the cultural practice long pre-dates evidence of the first use of the term):  The construct was sin + eat + -er.  Sin (in the theological sense of “a violation of divine will or religious law; sinfulness, depravity, iniquity; misdeeds”) was from the Middle English sinne, synne, sunne & zen, from the Old English synn (sin), from the Proto-West Germanic sunnju, from the Proto-Germanic sunjō (truth, excuse) and sundī & sundijō (sin), from the primitive Indo-European hs-ónt-ih, from hsónts (being, true, implying a verdict of “truly guilty” against an accusation or charge), from hes- (to be) (which may be compared with the Old English sōþ (true)).  Eat (in the sense of “to ingest; to be ingested”) was from the Middle English eten, from the Old English etan (to eat), from the Proto-West Germanic etan, from the Proto-Germanic etaną (to eat), from the primitive Indo-European hédti, from hed- (to eat).  The -er suffix was from the Middle English -er & -ere, from the Old English -ere, from the Proto-Germanic -ārijaz, thought most likely to have been borrowed from the Latin -ārius where, as a suffix, it was used to form adjectives from nouns or numerals.  In English, the -er suffix, when added to a verb, created an agent noun: the person or thing doing the action indicated by the root verb.  The use in English was reinforced by the synonymous but unrelated Old French -or & -eor (the Anglo-Norman variant -our), from the Latin -ātor & -tor, from the primitive Indo-European -tōr.  When appended to a noun, it created a noun denoting an occupation or describing the person whose occupation is the noun.  Sin-eater is a noun and sin-eating is a verb; the noun plural is sin-eaters.  The term often appears as “sin eater” but (untypically for English) seemingly not as “sineater”.

The first documented evidence of the term “sin-eater” appears in texts dating from the late seventeenth century but cultural anthropologists believe the actual practice to be ancient and variations of the idea are seen in many societies, so the ritual predates the term, the roots apparently in European and British folk traditions, particularly rural England and Wales.  The earliest (authenticated) documented mention of a sin-eater occurs in Remaines of Gentilisme and Judaisme (1686) by English antiquary John Aubrey (1626–1697), in which is described the custom of a person eating “bread and drinking ale” placed on the chest of a deceased person in order that their “many sins” could be eaten, thus allowing the untainted soul to pass to the afterlife, cleansed of “earthly wrongdoings”.  Aubrey wrote of a "sin-eater living along the Rosse road" who regularly would be hired to perform the service, describing him as a “gaunt, ghastly, lean, miserable, poor rascal”.  He mentioned also there was a popular belief that sin-eating would prevent the ghost of the deceased from walking the earth, a useful benefit at a time when it was understood ghosts of tormented souls, unable to find rest, haunted the living.  Whether this aspect of the tradition was widespread or a localism (a noted phenomenon in folklore) isn't known.  Interestingly, in rural England and Wales the practice survived the Enlightenment and became more common (or at least better documented) in the eighteenth & nineteenth centuries.  In the turbulent, troubled Middle East, a macabre variation of the sin-eater has been documented.  There, it's reported that a prisoner sentenced to death can bribe the jailors and secure their freedom, another executed in their place, the paperwork appropriately altered.

Paris Hilton (b 1981, left) and Lindsay Lohan (b 1986, right) discussing their “manifold sins and wickedness” while shopping, Los Angeles, 2004.

The ritual was of interest not only to social anthropologists but also to economic historians because while it was clear sin-eaters did receive payment (either in cash or in-kind (typically food)), there’s much to suggest those so employed were society’s “outcasts”, part of the “underclass” sub-set (beggars, vagrants, vagabonds etc) which in the West was a less formalized thing than something like the Dalits in Hinduism.  In the West the Dalits (better known as the “untouchables”) are often regarded as the “lowest rung” in the caste system but in Hindu theology the point was they were so excluded they were “outside” the system (a tiresome technical distinction often either lost on or ignored by the colonial administrators of the Raj) and relegated to the least desirable occupations.  Being a sin-eater sounds not desirable and theologically that’s right because in absolving the dead of their sins, the sin-eater becomes eternally burdened with the wickedness absorbed.  Presumably, a sin-eater could also (eventually) have their sins “eaten” but because they were from the impoverished strata of society, it was probably unlikely many would be connected to those with the economic resources required to secure such a service.  As a literary device, a sin-eater (often not explicitly named as such) is a character who in some way “takes on” the sins of others and they can be used to represent themes of guilt, atonement, sacrifice and societal exclusion.  In popular culture, the dark concept is quite popular and there, rather than in symbolism, the role usually is explored with the character being explicitly depicted as a “sin-eater”, an example being The Sin Eater (2020) by Megan Campisi (b 1976), a dystopian novel in which a young woman is forced into the role as a punishment.

Nice work if you can get it: The Sin-Eater, Misty Annual 1986.  Misty was a weekly British comic magazine for girls which, unusually, was found also to enjoy a significant male readership.  Published by UK house Fleetway, it existed only between 1978-1980 although Misty Annual appeared until 1986.  The cover always featured the eponymous, raven haired beauty.

There’s the obvious connection with Christianity although aspects of the practice have been identified in cultures where they arose prior to contact with the West.  The novel The Last Sin Eater by born-again US author Francine Rivers (b 1947) was set in a nineteenth century Appalachian community and dealt with sin, guilt & forgiveness, tied to the “atonement of the sins of man” by the crucifixion of Jesus Christ and thematically that was typical of the modern use.  However, the relationship between sin-eating and the Christian ritual of communion is theologically tenuous.  The communion, in which bread symbolizes the body of Christ and wine symbolizes His blood, is actually literal in the Roman Catholic Church under the doctrine of transubstantiation which holds that during the sacrament of the Eucharist (or Holy Communion), the bread and wine offered by the priest to the communicants transform into the body and blood of Christ.  That obviously requires faith to accept because while the appearances of the bread (usually a form of wafer) and wine (ie their taste, texture and outward properties) remain unchanged, their substance (what truly they are at the metaphysical level) is said to transform into the body and blood of Christ.  Once unquestioned by most (at least publicly), the modern theological fudge from the Vatican is the general statement: “You need not believe it but you must accept it”.

Sin-eating and communion both involve the consumption of food and drink in a symbolic manner.  In sin-eating, a sin-eater consumes food placed near or on the corpse symbolically to “absorb” their sins so the soul of the deceased may pass to the afterlife free from guilt while in the Christian Eucharist, the taking of bread and wine is a ritual to commemorate the sacrifice of Jesus who, on the cross at Golgotha, died to atone for the sins of all mankind.  So the central difference is the matter of who bears the sins.  In sin-eating, that’s the sin-eater who personally takes on the spiritual burden in exchange for a small payment, thus becoming spiritually tainted in order that another may spiritually be cleansed.  In other words, the dead may “out-source” the cost of their redemption in exchange for a few pieces of silver.  In the Christian communion, it’s acknowledged Jesus has already borne the sins of humanity through His crucifixion, the ritual an acknowledgment of His sacrificial act which offered salvation and forgiveness of sin to all who believe and take Him into their hearts.  One can see why priests were told to discourage sin-eating by their congregants but historically the church, where necessary, adapted to local customs and it’s likely the practice was in places tolerated.

Thursday, February 8, 2024

Plutonium & Uranium

Plutonium (pronounced ploo-toh-nee-uhm)

A radioactive chemical element that is artificially derived from uranium, plutonium is a highly toxic metallic transuranic element.  It occurs in trace amounts in uranium ores and is produced in a nuclear reactor by neutron bombardment of uranium-238. The most stable and important isotope, plutonium-239, readily undergoes fission and is used as a reactor fuel in nuclear power stations and in nuclear weapons. Symbol: Pu; atomic no: 94; half-life (plutonium 239): 24,360 years; valency: 3, 4, 5, or 6; relative density (alpha modification): 19.84; melting point: 1184°F (640°C); boiling point: 5846°F (3230°C); specific gravity 19.84.  Its longest-lived isotope is Plutonium 244 with a half-life of 77 million years.

1941: The construct was Pluto (the (now dwarf-) planet) + -ium (the element-ending suffix, from the Latin -um (neuter singular morphological suffix) and based on Latin terms for metals such as ferrum (iron)).  The -ium suffix (used most often to form adjectives) was applied as (1) a nominal suffix, (2) a substantivisation of its neuter forms and (3) as an adjectival suffix.  It was associated with the formation of abstract nouns, sometimes denoting offices and groups, a linguistic practice which has long fallen from fashion.  In the New Latin, as the neuter singular morphological suffix, it was the standard suffix to append when forming names for chemical elements.  Plutonium was discovered at the University of California, Berkeley and so named because it follows the recently discovered neptunium in the periodic table and, at the time, Pluto followed Neptune in the Solar System.  The name plutonium earlier had been proposed for barium and was used sometimes in this sense early in the nineteenth century.

Pluto was from the Latin Plūtō, from the Ancient Greek Πλούτων (Ploútōn) (god of the underworld).  In Greek mythology & Roman mythology, Pluto is remembered as the Greco-Roman god of the underworld but the ultimate origin was the Greek Ploutōn (god of wealth), from ploutos (wealth, riches (thought probably used originally in the sense of “overflowing”)), from the primitive Indo-European root pleu- (to flow); the alternative Greek name Hades is also related to wealth because it is from beneath the earth that valuable metals & precious gems lie.  Although some have expressed doubt, the accepted history is it was the then eleven-year-old Ms Venetia Burney (1918–2009) who suggested the name Pluto for the newly discovered (then) planet, aware of the procedure apparently because her uncle had earlier nominated Phobos and Deimos as names for the moons of Mars.  In 2006, the humorless International Astronomical Union (IAU) made its scandalous decision to declare, on highly technical grounds, that Pluto was not a planet but a mere dwarf and this inspired the American Dialect Society to coin the verb "to pluto" meaning "to demote or devalue something".

Uranium (pronounced yoo-rey-nee-uhm)

A white, lustrous, radioactive, metallic element, it has compounds used in photography and in coloring glass, the 235 isotope used in atomic and hydrogen bombs and as nuclear fuel in fission reactors.  A radioactive silvery-white metallic element of the actinide series, it occurs in several minerals including pitchblende, carnotite, and autunite.  Symbol: U; atomic no: 92; atomic wt: 238.0289; half-life of most stable isotope (uranium 238): 4.51 × 10⁹ years; valency: 2-6; relative density: 18.95 (approx.); melting point: 2075°F (1135°C); boiling point: 7473°F (4134°C); specific gravity 18.95.
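Since the half-lives quoted in the two definitions above span from thousands to billions of years, the differences are easiest to see with the decay law N(t) = N₀ × 2^(−t/T).  A minimal sketch (in Python) using the figures quoted above; the 100,000-year horizon is an arbitrary assumption:

```python
# Exponential decay: fraction remaining = 0.5 ** (t / half_life)
# Half-lives are those quoted above; the elapsed time is an arbitrary assumption.
def fraction_remaining(t_years: float, half_life_years: float) -> float:
    return 0.5 ** (t_years / half_life_years)

for isotope, half_life in [("plutonium-239", 24_360.0),
                           ("plutonium-244", 77e6),
                           ("uranium-238", 4.51e9)]:
    pct = fraction_remaining(100_000, half_life) * 100
    print(f"{isotope}: {pct:.4f}% remaining after 100,000 years")
```

On these figures, after 100,000 years barely 6% of a sample of plutonium-239 would remain while uranium-238, its half-life measured in billions of years, would be essentially undiminished.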

1789: The construct was Uranus (the planet) + -ium.  The element was named (using the conventions of Modern Latin) because the discovery of the planet had recently been announced.  Uranus was from the Latin Ūranus, from the Ancient Greek Οὐρανός (Ouranós), from οὐρανός (ouranós) (sky, heaven).

Uranus Fudge Factory, 14400 State Hwy Z, St Robert, Missouri 65584, USA.

Nuclear Weapons

Of the first three atomic bombs built in 1945, two used plutonium as fissile material while one used uranium.  Two of the many problems faced in the project were (1) production of uranium of the required purity was slow but a bomb of this type was (relatively) simple to produce and (2) plutonium was more abundant but the engineering to create such a bomb was intricate, the results uncertain.  Two designs were thus concurrently developed: a (relatively) simple gun-type device and a more complex implosion-type.  Trinity, code-name for the world’s first detonation of a nuclear device (New Mexico, July 1945), was one of the latter, an implosion-type plutonium bomb.  It was chosen because this was a genuine test, there being no certainty it would work whereas the gun-type uranium device, ultimately dropped on Hiroshima a month later, was never tested because the scientists and engineers had such confidence in its design.  After the war, it was assumed the somewhat inefficient gun-type mechanism wouldn’t again be used but technical problems saw production temporarily resumed, these stop-gap A-Bombs remaining in service until 1951.

Models of short and medium-range ballistic missiles at DPRK Annual Flower Show, Pyongyang, April 2013.

Lindsay Lohan in mushroom cloud T-shirt.

It’s no longer certain the uranium-based bomb used against Hiroshima in August 1945 remains a genuine one-off.  It’s certain that in the sixty-odd years since Trinity, every nuclear weapon except the Hiroshima device was plutonium-based but, beginning in 2006, the DPRK (the Democratic People's Republic of Korea (North Korea)) conducted six nuclear tests and, despite advances in monitoring and detection techniques, it’s not clear what material was used although the consensus is all were fission (A-Bombs) and not fusion (H-Bombs) devices.  The tests, by historic standards, were low-yield, suggesting uranium, but this could be misleading because even a failed test can produce a nuclear blast called a fizzle (when a detonation fails grossly to meet its expected yield).  The DPRK's programme will have had the odd fizzle but then so has every nation at some stage of the process and by historic standards it must be judged a success.  It was hampered by sanctions and international opposition (Beijing and Moscow as unwilling as the Western powers to help the hermit kingdom join the nuclear club) but achieved the necessary technology transfer by swapping ballistic missile blueprints with Pakistan, which had detonated its first fission device in 1998 but lacked a robust delivery system to counter the "nuclear threat" from India, which had tested as early as 1974.  That transaction was illustrative of one of the two concerns the West harbours about the DPRK bomb: (1) some sort of accident (and that covers everything from an unplanned detonation in some unfortunate place to a missile launch which malfunctions and hits a populated area) and (2) nuclear proliferation, which happens because the technology is used by Pyongyang in the barter economy as a trade for something desirable but not available because of sanctions or other trade restrictions.

Thursday, May 4, 2023

Valhalla

Valhalla (pronounced val-hal-uh or vahl-hah-luh)

(1) In Norse mythology, a dwelling in Asgard, the Norse heaven, the hall of Odin, reserved to receive the souls of those who fell in battle and others who died heroic deaths.

(2) In casual use, by extension from the classic meaning, an abode of the gods or afterlife in general.

1696: From the New Latin Valhalla, from the Old Norse Valhöll, the construct being val(r) (the slain in battle (and cognate with the Old English wæl)) + höll (hall).  The heavenly hall in which Odin receives the souls of heroes slain in battle appears often in Norse mythology and the word Valhalla was introduced into English in Archdeacon William Nicolson’s (1655–1727) English Historical Library (1696).  Valr (those slain in battle) was from the Proto-Germanic walaz (source also of the Old English wæl (slaughter, bodies of the slain)), from the Old High German wal (battlefield, slaughter), from the primitive Indo-European root wele (to strike, wound (source also of the Avestan vareta- (seized, prisoner), the Classical Latin veles (ghosts of the dead), the Old Irish fuil (blood) & the Welsh gwel (wound))). Höll (hall) is from the primitive Indo-European root kel- (to cover, conceal, save).  Nicolson’s work was long known only to scholars and it wasn’t until the word was re-introduced in the eighteenth century by antiquaries that there was any revival of interest but it was the work of Richard Wagner (1813–1883) in the next century that popularised the Norse myths and, in some circles, made Valhalla a cult.  The familiar figurative sense has been used since 1845.  Valhalla was also spelled Valhall, Walhalla & Walhall; the plural is Valhallas (and not always with the initial capital).

Hermann Burghart's (1834-1901) design of Valhalla and the rainbow bridge for the staging of Das Rheingold, Bayreuth, 1878.

Valhalla is the great hall where the god Odin houses the dead whom he deems worthy of dwelling with him.  In the Old Norse poem Grímnismál (The Song of the Hooded One), the architecture of Valhalla is described as honouring military tradition, the roof of the “gold-bright” Valhalla made from the shields of fallen warriors with their spears its rafters.  Around the many feasting tables are chairs made from breastplates and the gates are guarded by wolves, eagles circling above but there are different depictions and there's no one view of where Valhalla was.  In some Old Norse literature, it’s said to be located in Asgard, the gods’ celestial fortress yet other texts suggest it was underground, one of the many places of the underworld.

The dead who reside in Valhalla, the einherjar (those who fight alone, literally "army of one"), live on as warriors, fighting among each other and enjoying vivid adventures during the day.  Yet every evening, their wounds are healed and, restored to full health, they feast on roasted wild boar (Saehrimnir (from the Old Norse Sæhrímnir, of unknown origin)) and a mead from the udder of the goat Heidrun (from the Old Norse Heiðrun, of unknown origin), all the while waited on by the same beautiful Valkyries who circled the battlefields on which they were slain.  But the einherjar are doomed because Odin has recruited only the bravest soldiers for he wants them for his army in his struggle against the wolf Fenrir during Ragnarok, a battle which Odin and the einherjar are fated to lose.

Robert Lepage's (b 1957) design of Valhalla for the staging of Das Rheingold, Met Opera, New York, 2010.

Like the mythology of Greek and Roman antiquity, it’s possible some of what was passed down during the Middle Ages is just one variation of the original myth(s) and it’s only in the poetry of Icelandic historian Snorri Sturluson (1179–1241) that there’s a statement of the path of the fallen to Valhalla.  Snorri’s Prose Edda (circa 1220), a four-volume work drawn from many sources, remains the most complete and extensive collection of the Norse mythology known still to exist but the author was also a lawyer and politician and scholars have noted he wrote long after the old Norse paganism had been replaced by Christianity; there’s the suspicion this may have been an influence in the way he synthesized strands from earlier traditions with Christian teaching.  Snorri said those who fell on the field of battle ascend to Valhalla, while those who die a less heroic death are consigned to Hel, the underworld.  That does seem unfair (and probably bad public policy) and elsewhere in the Edda, he’s not above allowing the odd fudge, just as Roman Catholic theologians would invent limbo, their own medieval conjecture to tidy up the margins of God’s mysterious ways.  Snorri makes no attempt to justify his (actually quite blatant) contradictions and it’s thought what he wanted to achieve was a kind of lineal alignment between the pagan ways and the Christian, Valhalla and Hel the same diametric opposite as Heaven and Hell in Christian eschatology.  However, as many surviving fragments from earlier texts attest, the tidy, systematized paganism described by Snorri was not entirely that which had been practiced.

Saturday, March 5, 2022

Urning & Urningin

Urning & Urningin (pronounced ern-ing & ern-ings)

A male homosexual person (obsolete, and when used should be in the historic context of the original meaning, a technically differentiated sense of homosexuals as a “third sex” rather than a variation of the spectrum within the (then) existing two).  The equivalent feminine form was urningin.

1864: From the German Urning (a male homosexual constructed as a third sex (Uranian)), the related form being Urningtum (homosexuality), referring to Aphrodite (Ūrania), coined by the German writer Karl Heinrich Ulrichs (1825–1895) in 1864.  By the early twentieth century, except among some writers in German, the word in this sense had largely been supplanted by homosexual.  The link to Aphrodite lies in Plato’s Symposium (circa 385–370 BC), where the goddess Aphrodite, in her heavenly aspect (Ūrania), is described as inspiring a noble form of affection between older and younger men.  In Greek and Roman mythology, the heavenly aspect of Aphrodite, the Greek goddess of beauty and love, Ūrania (whose Roman counterpart was Venus) is contrasted with the earthly aspect (Aphrodite Pandēmos).  Ūrania is also the muse of astronomy.  Originally used by astrologers and astronomers, uranian is now rare, used only poetically.  It was from the Latin Ūrania (the muse of astronomy in Greek mythology) + -an (the suffix forming agent nouns).  Ūrania was from the Ancient Greek Οὐρανία (Ouraníā) (muse of astronomy), from οὐράνιος (ouránios) (of or relating to the sky, celestial, heavenly) (from οὐρανός (ouranós) (the sky; heaven, home of the gods; the universe)) and probably ultimately from the primitive Indo-European hwers- (rain) + -ιος (-ios), the suffix forming adjectives meaning “pertaining to”.  Uranical is equally rare.

When writing now of homosexuality, uranism should be described as a particular historical construct.  The suggestion in 1864 was that the Urning (male) and the Urningin (female) homosexuals should be regarded as a third sex, not on any spectrum within the then-accepted binary division of gender.  Despite that, it was an interesting anticipation of the later notion(s) of gender fluidity in that it encompassed the idea of something feminine inherently within the male body and vice versa.

Psychopathia Sexualis

In English, urning seems to have become widely discussed in the medical profession after it appeared in Charles Chaddock's (1861-1936) 1892 translation of the impressively titled Psychopathia Sexualis: eine Klinisch-Forensische Studie (Sexual Psychopathy: A Clinical-Forensic Study, also known as Psychopathia Sexualis, with Especial Reference to the Antipathetic Sexual Instinct: A Medico-forensic Study), published in 1886, a book by an Austro-German psychiatrist with a name of similarly imposing length, Richard Fridolin Joseph Freiherr Krafft von Festenberg auf Frohnberg, genannt von Ebing (1840–1902), work and author respectively cited usually as the more manageable Psychopathia Sexualis by Richard Freiherr von Krafft-Ebing.  In his translation, US neurologist Dr Chaddock set a couple of landmarks in English, one being apparently the first instance in print of the word “bisexual” being used in the sense of humans being sexually attracted to both women and men.  Prior to that, "bisexual" was used either to refer to hermaphroditic plants (ie those with both male and female reproductive structures), or to mixed-sex schools (ie co-ed(ucational)) or other institutions, an instance of how meaning-shifts in language can make difficult the reading of historic texts.

Psychopathia Sexualis wasn’t the first publication to explore the topic but the scale and breadth of approach to sexual pathology makes it one of the seminal works in the field.  Although covering a wide range of paraphilias, it was notable for a then quite novel focus on male homosexuality (hence the "antipathetic instinct" in the subtitle) and introduced the newly coined terms "sadism" & "masochism".  Very much a book of its time, von Krafft-Ebing's writings reflected the views of the medical mainstream, distilling Karl Ulrichs' (1825–1895) Urning theory with Bénédict Morel's (1809–1873) theory of degeneration (a handy, all-purpose model for frustrated psychiatrists, degeneration theory held there were psychological conditions which were genetic and could not be cured by a psychiatrist; it could be used to explain any psychological condition).

Nor was there any unanimity of opinion within the profession but the book was influential in psychiatry for decades and it wasn’t until 1973, in preparation for the publication in 1974 of a seventh printing of the second edition of the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM) (DSM-II (1968)), that homosexuality ceased to be listed as a category of disorder although the revision was less than activists had hoped, the diagnosis instead becoming a "sexual orientation disturbance".  In the DSM-III (1980), it was again re-classified but it wasn’t until that volume was revised (DSM-III-R) in 1987 that homosexuality ceased to be a treatable condition, a position which in the West would not for some years everywhere be reflected in legislation.

Aphrodite (1887), oil on canvas by Robert Fowler (1853–1926).

In Europe, one stream of the dissent against prevailing orthodoxy is traced to the mid-nineteenth century writings of Karl Heinrich Ulrichs, a proto gay rights activist trained in law, theology and history.  Using the nom-de-plume Numa Numantius, during the 1860s he issued a number of political pamphlets asserting something with strands of the modern view: that some men were born with the spirit of a woman trapped in their bodies, these men constituting a third sex which, in 1864, he named urnings and that those who might now be called lesbians were urningin, a man’s spirit trapped in the body of a woman.  His theories gained little public support but he wasn’t entirely isolated.  In 1869, Hungarian journalist Károli Mária Kertbeny (1824-1882) coined the terms “homosexual” and “homosexuality” in a political treatise against the Prussian penal code which criminalized the behavior among men, arguing the condition was inborn and unchangeable, one of many normal variations in the human condition.

Richard von Krafft-Ebing reviewed the literature and synthesised.  Describing homosexuality as a “degenerative” disorder, he adopted Kertbeny’s terminology but not his notion of the normal; in Psychopathia Sexualis he viewed unconventional sexual behaviors through the lens of nineteenth century Darwinian theory: non-procreative sexual behaviors, masturbation included, were forms of psychopathology, and his most intriguing mix of ancient and modern was that in being born with a homosexual predisposition ("born like this" in the 21C vernacular), the individual was the victim of a congenital disease.  His view of homosexuality as a psychiatric disorder was influential but even those who disagreed cemented the linguistic legacy, the term “homosexual” quickly being adopted as the standard term in the medical lexicon.  After the publication of Psychopathia Sexualis, views on the matter coalesced but prevailing opinion shifted little.  In opposition to von Krafft-Ebing, the German psychiatrist Magnus Hirschfeld (1868–1935) emerged as a neo-Ulrichsian, publishing and lecturing in support of a normative view of homosexuality, and underground movements existed in many cities, some tolerated.

Stable Diffusion's AI generation of Lindsay Lohan as Aphrodite, rising from the foam.

It was the founder of psychoanalysis, the Austrian neurologist Sigmund Freud (1856–1939), who offered the theory which would capture the popular imagination.  Disagreeing with both Hirschfeld’s theories of normal variation and von Krafft-Ebing’s of pathology, Freud believed all were born with bisexual tendencies and that manifestations of homosexuality could therefore be a normal phase of heterosexual development, an innate bisexuality allowing no possibility of the separate “third sex” constructed by Hirschfeld.  Nor could the “degenerative condition” described by von Krafft-Ebing be maintained because, inter alia, it was “found in people whose efficiency is unimpaired, and who are indeed distinguished by specially high intellectual development and ethical culture”.  That may amuse some now but, within Freudian theory, the internal logic is perfect, manifestations of adult homosexual behavior being caused by “arrested” psychosexual development, a theory of immaturity.

Yet, despite the interest aroused, after Freud’s death in 1939 most psychoanalysts came to view homosexuality as pathological and, in the massively expanded universities of the post-war years, research in the field exploded, sexology evolving from a professional niche to a well-funded discipline, the academic work increasingly augmented by popular publications aimed at the general reader as well as the profession.  By the mid-twentieth century, the intellectual centre of psychiatry had shifted from Europe to the United States and it was there that what would quickly become the most influential publication in the field, the APA's DSM, was published.  When the first edition (DSM-I) was released in 1952, homosexuality was classified as a “sociopathic personality disturbance” and this was the orthodoxy until the DSM-II (1968) when the term changed to “sexual deviation”, a nuance probably better understood by the profession than the patients, although "sexual deviant" became a popular phrase (even a cheerful self-descriptor among certain consenting adults) and one applied to many activities.  So pervasive was it that the phrase was used even in legislation passed in the mid-1980s by a National Party government in the Australian state of Queensland, a place which for gay men didn't at the time live up to its name.  Just as in the US there was an identifiably distinct political culture south of the Mason-Dixon Line, so things were different north of the New South Wales (NSW) border.

Aphrodite Urania: A cast of the Trono Ludovisi, displayed in the Pushkin State Museum of Fine Arts, Moscow, Russia.

Discovered in 1887 in Rome's Villa Ludovisi, the Trono Ludovisi (Ludovisi Throne) is a carving in Parian marble in the severe style of the early classical period and dates from 490–450 BC.  The still-used description as a “throne” is a relic of the belief of nineteenth century archaeologists that the three-sided relief was from the seat of a statue of a colossal god but the contemporary view is it was probably a fender in front of an altar.  There is also controversy about its creation, most experts taking the view it was carved by a Greek master sculptor who worked in Italy, perhaps one of the Neo-Attic school, while the alternative theory suggests it may be of Ionian origin.  What all agree on is that it seems to depict the birth of the goddess Aphrodite, born from the foam of the sea on the beaches of Cyprus, the pebbles & sand visible beneath the feet of her attendants.  On the side panels, other women sit playing pipes and burning incense, befitting the arrival of a goddess.  What clouds that interpretation is that representations of the birth of Aphrodite are almost unknown in Classical art and even the depiction of semi-nude female figures was, in the era, unprecedented in large-scale Greek sculpture.  The Trono Ludovisi is now on permanent display in the National Museum, Palazzo Altemps, Rome.

Mutual hatred.

William F Buckley (1925–2008, left) & Gore Vidal (1925–2012, right) in one of their infamous ten debates during the 1968 US presidential campaign, August 1968.  Their exchanges, with phrases like "you queer" and "crypto-Nazi", captured the spirit of the acrimony but, structurally, the debates were influential in the way the television networks would package politics as entertainment, the legacy visible today and not just on Fox News.

One implication of sexologists becoming more numerous and active was a change to the very nature of research.  Whereas psychiatrists and other clinicians drew conclusions from a skewed sample of patients seeking treatment for homosexuality or other difficulties, sexologists undertook field studies for which large numbers of subjects were recruited from the general population, most of whom had never presented for psychiatric treatment.  The most famous of the reports published both generated headlines and became best-sellers.  The Kinsey Reports (on men (1948) and women (1953)), with sample sizes in the thousands, found homosexuality to be more common in the general population than the psychiatrists had claimed, although the still oft-quoted 10% “statistic” is now discredited and thought a significant over-estimate.  Ford and Beach, looking at both diverse cultures and animal behavior, confirmed Kinsey’s view that homosexuality was more common than psychiatry historically had maintained and that it was found regularly in nature.  In a number of smaller surveys (of which US psychologist Evelyn Hooker’s (1907–1996) 1956 paper The Adjustment of the Male Overt Homosexual was representative in concept if not detail), no evidence was found to suggest gay men were any more prone to severe psychological disturbances than anyone else, although some did concede the perception they were "highly strung" and "dramatic" was supported by observational studies, but that may have been influenced by depictions in popular culture and thus perhaps even "learned or imitative".  Still, even by the twenty-first century and in liberal circles, it wasn't always something one wished to be defined by.  When in 2009 the author Gore Vidal (in the language of an earlier age, "a confessed homosexual") was referred to at a gathering by a well-meaning society lady as "one of the great homosexuals of our time", his response was to ask: "Will someone please get this cunt out of my sight?"

The wealth of research, coupled with an increasingly strident gay activism and generational change within the APA, induced a shift, the awareness now being that the real psychological damage might be the stigma caused by the “homosexuality” diagnosis rather than the dynamics of the predilection itself.  Nevertheless, in 1973 when the APA met to discuss the matter, planning both for a revised DSM-II and the new DSM-III, it was pondered whether “homosexuality” should be included in the APA nomenclature.  Implicit in this was the very question of what could be said to constitute a mental disorder, so it was a matter of importance beyond the immediate issue.  Not wishing fundamentally to change the parameters of diagnosis, the APA came up with a masterful fudge, issuing a statement saying it had “reviewed the characteristics of the various mental disorders and concluded that, with the exception of homosexuality and perhaps some of the other 'sexual deviations', they all regularly caused subjective distress or were associated with generalized impairment in social effectiveness or functioning”.  Happy with the elegance of the loophole, the APA’s Board of Trustees (BoT) voted to remove homosexuality from the DSM.

Less happy were many clinicians, who insisted on a vote of the whole membership.  The APA agreed and the decision to remove was upheld by a 58% majority of some 10,000 voting members although, technically, the question on which they voted was not whether homosexuality should remain a diagnosis but whether to support or oppose the BoT’s decision and, by extension, the scientific process created to make the determination.  It seemed a fine distinction and, anyway, the BoT’s decision did not immediately end psychiatry’s pathologizing of some presentations of homosexuality.  Instead, a revision to the DSM-II text contained a new diagnosis: "Sexual Orientation Disturbance" (SOD).  SOD (one does wonder if the condition was so-named as some sort of in-joke among the DSM's editors) defined homosexuality as an illness if an individual with same-sex attractions found it distressing and wished to be "cured", an important difference which changed the emphasis from condition to consequence.  That of course implied a future for what came to be known as sexual conversion therapy, which had consequences of its own and, presumably, meant anyone unhappy with being heterosexual could seek treatment in an attempt to become homosexual; the literature is silent on that so presumably there was no demand.

SOD was replaced in the DSM-III (1980) by a new category called “Ego-Dystonic Homosexuality” (EDH) but it was increasingly obvious both SOD and EDH were political fudges to fix an immediate problem and it was not sustainable to maintain diagnostic criteria under which any identity disturbance could be considered a psychiatric disorder.  The generational shift had happened and EDH was deleted from the revised DSM-III-R in 1987.  Officially, the DSM now regarded homosexuality as a normal variant of the human condition, essentially what was thought in 1973 but couldn’t then be said.  It didn’t mean the end of debate but it did mean those individuals and institutions still determined to discriminate could no longer cite the DSM in any claimed medical or scientific rationale.