
Wednesday, May 14, 2025

Psychache

Psychache (pronounced sahyk-eyk)

Psychological pain, especially when it becomes unbearable, producing suicidal thoughts.

1993: The construct was psych(e)- + ache.  Psychache was coined by US clinical psychologist Dr Edwin Shneidman (1918-2009) and first appeared in his book Suicide as Psychache: A Clinical Approach to Self-Destructive Behavior (1993).  The prefix psych- was an alternative form of psycho-.  Psycho was from the Ancient Greek ψυχο- (psūkho-), a combining form of ψυχή (psukhḗ) (soul).  It was used with words relating to the soul, the mind, or to psychology.  Ache was from the Middle English verb aken & noun ache, from the Old English verb acan (from the Proto-West Germanic akan, from the Proto-Germanic akaną (to ache)) and the noun æċe (from the Proto-West Germanic aki, from the Proto-Germanic akiz), both from the primitive Indo-European h₂eg- (sin, crime).  It was cognate with the Saterland Frisian eeke & ääke (to ache, fester), the Low German aken, achen & äken (to hurt, ache), the German Low German Eek (inflammation), the North Frisian akelig & æklig (terrible, miserable, sharp, intense), the West Frisian aaklik (nasty, horrible, dismal, dreary) and the Dutch akelig (nasty, horrible).  Historically the verb was spelled ake and the noun ache, but the spellings became aligned after Dr Johnson (Samuel Johnson (1709-1784)) published A Dictionary of the English Language (1755), the lexicographer mistakenly assuming both were from the Ancient Greek ἄχος (ákhos) (pain) due to the similarity in form and meaning of the two words.  As a noun, ache meant “a continuous, dull pain” (as opposed to a sharp, sudden, or episodic pain) while the verb was used to mean (1) to have or suffer a continuous, dull pain, (2) to feel great sympathy or pity and (3) to yearn or long for someone or something.  Psychache is a noun.

Psychache is a theoretical construct used by clinical suicidologists and differs from psychomachia (conflict of the soul).  Psychomachia was from the Late Latin psȳchomachia, the title of a poem of a thousand-odd lines (circa 400) by Roman Christian poet Prudentius (Aurelius Prudentius Clemens; 348-circa 412), the construct being the Ancient Greek psukhē (spirit) + makhē (battle).  The fifth century poem Psychomachia (translated usually as “Battle of Spirits” or “Soul War”) explored a theme familiar in Christianity: the eternal battle between virtue & vice (onto which can be mapped “right & wrong”, “good & evil” etc) and culminated in the forces of Christendom vanquishing pagan idolatry to the cheers of a thousand Christian martyrs.  An elegant telling of an allegory familiar in early Christian literature and art, Prudentius made clear the battle was one which happened in the soul of all people and thus one which all needed to wage, the outcome determined by whether the good or evil in them proved stronger.  The poem’s characters include Faith, Hope, Industry, Sobriety, Chastity, Humility & Patience among the good and Pride, Wrath, Paganism, Avarice, Discord, Lust & Indulgence in the ranks of the evil but scholars of literature caution that although the personifications all are women, in Latin, words for abstract concepts take the feminine grammatical gender and there’s nothing to suggest the poet intended us to read this as a tale of bolshie women slugging it out.  Of interest too is the appearance of the number seven, so familiar in the literature and art of Antiquity and the Medieval period as well as the Biblical texts, but although Prudentius has seven virtues defeat seven vices, the characters don’t exactly align with either the canonical seven deadly sins or the three theological and four cardinal virtues.  In modern use, the linguistic similarity between psychache and psychomachia has made the latter attractive to those seduced by the (not always Germanic) tradition of the “romance of suicide”.

A pioneer in the field of suicidology, Dr Shneidman’s publication record was indicative of his specialization.

Dr Edwin Shneidman (1918-2009) was a clinical psychologist who practiced as a thanatologist (a practitioner in the field of thanatology, the scientific study of death and the practices associated with it, including the study of the needs of the terminally ill and their families), the construct of thanatology being thanato- (from the Ancient Greek θάνατος (thánatos) (death)) + -logy.  The suffix -ology was formed from -o- (as an interconsonantal vowel) + -logy.  The origin in English of the -logy suffix lies with loanwords from the Ancient Greek, usually via Latin and French, where the suffix (-λογία) is an integral part of the word loaned (eg astrology from astrologia), a pattern established since the sixteenth century.  French picked up -logie from the Latin -logia, from the Ancient Greek -λογία (-logía).  Within Greek, the suffix is an -ία (-ía) abstract from λόγος (lógos) (account, explanation, narrative), that in turn a verbal noun from λέγω (légō) (I say, speak, converse, tell a story).  In English the suffix became extraordinarily productive, used notably to form names of sciences or disciplines of study, analogous to the names traditionally borrowed from the Latin (eg astrology from astrologia; geology from geologia) and by the late eighteenth century, the practice (despite the disapproval of the pedants) extended to terms with no connection to Greek or Latin such as those building on French or German bases (eg insectology (1766) after the French insectologie; terminology (1801) after the German Terminologie).  Within a few decades of the intrusion of modern languages, combinations emerged using English terms (eg undergroundology (1820); hatology (1837)).  In this evolution, the development may be thought similar to the latter-day proliferation of “-isms” (fascism; feminism et al).

Death and the College Student: A Collection of Brief Essays on Death and Suicide by Harvard Youth (1973) by Dr Edwin Shneidman.  Dr Shneidman wrote many papers about the prevalence of suicide among college-age males, a cross-cultural phenomenon.

Dr Shneidman was one of the seminal figures in the discipline of suicidology, in 1968 founding the AAS (American Association of Suicidology) and the principal US journal for suicide studies: Suicide and Life-Threatening Behavior.  The abbreviation AAS is in this context used mostly within the discipline because (1) it is a specialized field and (2) there are literally dozens of uses of “AAS”.  In Suicide as Psychache: A Clinical Approach to Self-Destructive Behavior (1993) he defined psychache as “intense psychological pain—encompassing hurt, anguish, and mental torment”, identifying it as the primary motivation behind suicide, his theory being that when psychological pain becomes unbearable, individuals may perceive suicide as their only escape from torment.

Although since Suicide as Psychache: A Clinical Approach to Self-Destructive Behavior appeared in 1993 there have been four editions of the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM), “psychache” has never appeared in the DSM.  That may seem an anomaly given much in the DSM revolves around psychological disturbances but the reason is technical.  What the DSM does is list and codify diagnosable mental disorders (depression, schizophrenia, bipolar disorder et al), classifying symptoms and behaviors into standardized categories for diagnosis and treatment planning.  By contrast, psychache is not a clinical diagnosis; it is a theoretical construct in suicidology which is used to explain the subjective experience of psychological pain that can lead to patients taking their own lives.  It thus describes an emotional state rather than a psychiatric disorder.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

Despite that, mental health clinicians do actively use the principles of psychache, notably in suicide risk assessment and prevention, and models have been developed including a number of “psychache scales”, self-reporting tools used to generate a metric measuring the intensity of psychological pain (categorized with headings such as shame, guilt, despair et al).  The approaches differ in detail but most follow Dr Shneidman’s terminology in that the critical threshold is the point at which the patient’s pain becomes unbearable or inescapable and the objective is either to increase tolerance for distress or reframe troublesome thoughts.  Ultimately, the purpose of the tools is to improve suicide risk assessments and reduce suicide rates.
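As a minimal sketch of how such a self-report scale might be scored (the items, the 5-point Likert format and the cut-off below are illustrative assumptions, not the wording or norms of any published instrument):

```python
# Minimal sketch of scoring a self-report "psychache scale".
# The items, the 5-point Likert format and the cut-off are illustrative
# assumptions, not the wording or norms of any published instrument.

ITEMS = [
    "My psychological pain feels unbearable",   # hypothetical item
    "I feel ashamed of who I am",               # hypothetical item
    "I feel guilt I cannot escape",             # hypothetical item
    "I feel despair about the future",          # hypothetical item
]

def score(responses: list[int], cutoff: int = 14) -> dict:
    """Sum Likert responses (1 = not at all ... 5 = extremely) and
    flag whether the total crosses the illustrative threshold."""
    if len(responses) != len(ITEMS):
        raise ValueError("one response per item required")
    if any(not 1 <= r <= 5 for r in responses):
        raise ValueError("responses must be on the 1-5 Likert range")
    total = sum(responses)
    return {"total": total, "flag_for_review": total >= cutoff}

print(score([4, 3, 5, 4]))  # {'total': 16, 'flag_for_review': True}
```

In practice such a total would be one input to a clinical judgment, never a diagnosis in itself.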

DSM-5 (2013).

Interestingly, Suicidal Behavior Disorder (SBD) was introduced in Section III of the DSM-5 (2013) under “Conditions for Further Study”.  At that point, SBD chiefly was characterized by a self-initiated sequence of behaviors believed at the time of initiation to cause one’s own death and occurring in the last 24 months.  That of course sounds exact but the diagnostic criteria in the DSM are written like that; the purpose of inclusion in the fifth edition was to create a framework within which empirical studies related to SBD systematically could be reviewed so primary research themes and promising directions for future research could be identified.  Duly, over the following decade that framework was explored but the conclusion was reached there seemed to be little clinical utility in SBD as a device for predicting future suicide and that more research was needed to understand measurement of the diagnosis and its distinctiveness from related disorders and other self-harming behaviors.  The phrase “more research is required” must be one of the most frequently heard among researchers.

In the usual manner in which the APA allowed the DSM to evolve, what the DSM-5’s tentative inclusion of SBD did was attempt to capture suicidality as a diagnosis rather than a clinical feature requiring attention.  SBD was characterized by a suicide attempt within the last 24 months (Criterion A) and that was defined as “a self-initiated sequence of behaviors by an individual who, at the time of initiation, expected that the set of actions would lead to his or her own death”.  That sounds uncontroversial but what was significant was the act could not meet the criteria for non-suicidal self-injury (ie self-injury with the intention to relieve negative feelings or a cognitive state in order to achieve a positive mood state) (Criterion B) and cannot be applied to suicidal ideation or preparatory acts (Criterion C).  Were the attempt to have occurred during a state of delirium or confusion or solely for political or religious objectives, then SBD is ruled out (Criteria D & E).  SBD (current) is given when the suicide attempt occurred within the last 12 months and SBD (in early remission) when it has been 12-24 months since the last attempt.  It must be remembered that while a patient’s behavior(s) may overlap across a number of the DSM’s diagnoses, the APA’s committees have, for didactic purposes, always preferred to “silo” the categories.

DSM-5-TR (2022).

When in 2022 the “text revision” of the DSM-5 (DSM-5-TR) was released, SBD was removed as a condition for further study in Section III and moved to “Other Conditions That May Be a Focus of Clinical Attention” in Section II.  The conditions listed in this section are intended to draw the attention of clinicians to the presence and breadth of additional issues routinely encountered in clinical practice and provide a procedure for their systematic documentation.  According to the APA’s editorial committee, the rationale for the demotion of SBD in the DSM-5-TR was based on concerns the proposed disorder did not meet the criteria for a mental disorder but instead constituted a behavior with diverse causes and while that distinction may escape most of us, within the internal logic of the history of the DSM, it’s wholly consistent.  At this time, despite many lobbying for the adoption of a diagnostic entity for suicidal behavior, the APA’s committees seem still more inclined to conceptualize suicidality as a symptom rather than a disorder and despite discussion in the field of suicidology about whether suicide and related concepts like psychache should be treated as stand-alone mental health issues, that’s a leap which will have to wait, at least until a DSM-6 is published.

How to and how not to: Informatie over Zorgvuldige Levensbeëindiging (Information about the Careful Ending of Life, 2008) by Stichting Wetenschappelijk Onderzoek naar Zorgvuldige Zelfdoding (The Foundation for Scientific Research into Careful Suicide) (left) and How Not to Kill Yourself: A Phenomenology of Suicide (2023) by Clancy Martin (right).

Informatie over Zorgvuldige Levensbeëindiging (Information about the Careful Ending of Life, 2008) was published by a group of Dutch physicians & researchers; it contained detailed advice on methods of suicide available to the general public, the Foundation for Scientific Research into Careful Suicide arguing “a requirement exists within society for responsible information about an independent and dignified ending of life.”  It could be ordered only from the foundation’s website and had the advantage that whatever might be one’s opinion on the matter, it was at least written by physicians and scientists and thus more reliable than some of the “suicide guides” which are sometimes found on-line.  At the time, research by the foundation had found that despite legislation in the Netherlands which permits doctors (acting within specific legal limits) to assist patients to commit suicide, there were apparently several thousand cases each year of what it termed “autoeuthanasia” in which no medical staff directly were involved.  Most of these cases involved elderly or chronically ill patients who refused food and fluids and it was estimated these deaths happened at about twice the rate of those carried out under the euthanasia laws.  Since then the Dutch laws have been extended to include those who have no serious physical disease and are not suffering great pain; there are people who simply no longer wish to live, something like the tragic figure in Blue Öyster Cult’s (Don't Fear) The Reaper (1976) © Donald Roeser (b 1947):

Came the last night of sadness
And it was clear she couldn't go on
Then the door was open and the wind appeared
The candles blew then disappeared
The curtains flew then he appeared
Saying don't be afraid

There is a diverse literature on various aspects of suicide (tips and techniques, theological & philosophical interpretations, cross-cultural attitudes, history of its treatment in church & secular law etc) and some works are quite personal, written variously by those who later would kill themselves or those who contemplated or attempted to take their own lives.  In How Not to Kill Yourself: A Phenomenology of Suicide (2023) by Canadian philosopher Clancy Martin (b 1967), it was revealed the most recent of his ten suicide attempts was “…in his basement with a dog leash”, the consequences of which he concealed from his wife, family, co-workers, and students, slipping back into his daily life with a hoarse voice, a raw neck and a series of vague explanations.

BKA (the Bundeskriminalamt, the Federal Criminal Police Office of the FRG (Federal Republic of Germany, the old West Germany)) mug shots of the Red Army Faction's Ulrike Meinhof (left) and Gudrun Ensslin (right).

The song (Don't Fear) The Reaper also made mention of William Shakespeare's (1564–1616) Romeo and Juliet (1597) and in taking her own life (using her dead lover’s dagger) because she doesn’t want to go on living without him, Juliet joined the pantheon of figures who have made the tragedy of suicide seem, to some, romantic.  Politically too, suicide can grant the sort of status dying of old age doesn’t confer, the deaths of left-wing terrorists Ulrike Meinhof (1934–1976) and Gudrun Ensslin (1940–1977) of the West German Red Army Faction (the RAF, better known as the “Baader-Meinhof gang”) both recorded as “suicide in custody” although the circumstances were murky.  In an indication of the way moral relativities aligned during the high Cold War, the French intellectuals Jean-Paul Sartre (1905–1980) and Simone de Beauvoir (1908–1986) compared their deaths to the worst crimes of the Nazis but sympathy for violence committed for an “approved” cause was not the exclusive preserve of the left.  In July 1964, in his speech accepting the Republican nomination for that year’s US presidential election, proto-MAGA Barry Goldwater (1909–1998) concluded by saying: “I would remind you that extremism in the defense of liberty is no vice!  And let me remind you also that moderation in the pursuit of justice is no virtue!”  The audience response to that was rapturous although a few months later the country mostly didn’t share the enthusiasm, Lyndon Johnson (LBJ, 1908–1973; US president 1963-1969) winning the presidency in one of the greatest landslides in US electoral history.  Given the choice between crooked old Lyndon and crazy old Barry, Americans preferred the crook.

Nor was it just politicians and intellectuals who found appeal in politics being taken to its logical “other means” conclusion, the Canadian singer-songwriter Leonard Cohen (1934-2016) during the last years of the Cold War writing First We Take Manhattan (1986), the lyrics of which were open to interpretation but clarified in 1988 by the author who explained: “I think it means exactly what it says.  It is a terrorist song.  I think it's a response to terrorism.  There's something about terrorism that I've always admired.  The fact that there are no alibis or no compromises.  That position is always very attractive.”  Even in 1988 it was a controversial comment because by then not many outside of undergraduate anarchist societies were still romanticizing terrorists but in fairness to the singer the coda isn’t as often published: “I don't like it when it's manifested on the physical plane – I don't really enjoy the terrorist activities – but Psychic Terrorism.”

First We Take Manhattan (1986) by Leonard Cohen

They sentenced me to twenty years of boredom
For tryin' to change the system from within
I'm coming now, I'm coming to reward them
First we take Manhattan, then we take Berlin
 
I'm guided by a signal in the heavens
I'm guided by this birthmark on my skin
I'm guided by the beauty of our weapons
First we take Manhattan, then we take Berlin
 
I'd really like to live beside you, baby
I love your body and your spirit and your clothes
But you see that line there moving through the station?
I told you, I told you, told you, I was one of those
 
Ah you loved me as a loser, but now you're worried that I just might win
You know the way to stop me, but you don't have the discipline
How many nights I prayed for this, to let my work begin
First we take Manhattan, then we take Berlin
 
I don't like your fashion business, mister
And I don't like these drugs that keep you thin
I don't like what happened to my sister
First we take Manhattan, then we take Berlin
 
I'd really like to live beside you, baby
I love your body and your spirit and your clothes
But you see that line there moving through the station?
I told you, I told you, told you, I was one of those



First We Take Manhattan performed by Jennifer Warnes (b 1947), from the album Famous Blue Raincoat (1986).

Whatever they achieved in life, it was the manner of their deaths which lent a lingering allure to German-American ecofeminist activist Petra Kelly (1947–1992) & the doomed American poet Sylvia Plath (1932-1963) and the lure goes back millennia, the Roman poet Ovid (Publius Ovidius Naso; 43 BC–17 AD) in his Metamorphoses telling an ancient Babylonian tale in which Pyramus, in dark despair, killed himself after finding what he took to be proof his young love Thisbe was dead, she in turn taking her own life on discovering him lifeless.  Over the centuries it’s been a recurrent trope but the most novel take was the symbolic, mystical death in Richard Wagner's (1813–1883) Tristan und Isolde (1865).  Mortally wounded in a duel before the final act, Tristan longs to see Isolde one last time but just as she arrives at his side, he dies in her arms.  Overwhelmed by love and grief, Isolde sings the famous Liebestod (Love-Death) and dies, the transcendent aria interpreted as the swansong which carries her to join Tristan in mystical union in the afterlife.  This, lawyers would call a “constructive suicide”.

Austrian soprano Helga Dernesch (b 1939) in 1972 performing the Liebestod aria from Wagner’s Tristan und Isolde with the Berlin Philharmonic under Herbert von Karajan (1908–1989).

While she didn’t possess the sheer power of the greatest of the Scandinavian sopranos who in the mid-twentieth century defined the role, Dernesch brought passion and intensity to her roles and while, on that night in 1972, the lushness of what Karajan summoned from the strings was perhaps a little much, her Liebestod was spine-tingling and by then, Karajan had been forgiven for everything.  Intriguingly, although Tristan und Isolde is regarded as one of the great monuments to love, in 1854 Wagner had written to the Hungarian composer Franz Liszt (1811–1886) telling him:

As I have never in life felt the real bliss of love, I must erect a monument to the most beautiful of all my dreams, in which, from beginning to end, that love shall be thoroughly satiated.  I have in my head ‘Tristan and Isolde’, the simplest but most full-blooded musical conception; with the ‘black flag’ which floats at the end of it I shall cover myself to die.

It’s not known whether Liszt reflected on this apparent compositional self-medication for psychache after in 1870 learning from his morning newspaper his daughter Cosima (1837-1930) was to be married to Wagner (then 24 years her senior) but because she’d been for some seven years conducting an adulterous affair with the German, the news may not have been unexpected.  He was aware Cosima’s daughter (Isolde Beidler (1865–1919)) had been fathered not by her then husband (the German conductor Hans von Bülow (1830–1894)) but by Wagner and her second marriage proved happier than the first so there was that.

Sunday, November 17, 2024

Now

Now (pronounced nou)

(1) At the present time or moment (literally a point in time).

(2) Without further delay; immediately; at once; at this time or juncture in some period under consideration or in some course of proceedings described.

(3) As “just now”, a time or moment in the immediate past (historically it existed as the now obsolete “but now” (very recently; not long ago; up to the present)).

(4) Under the present or existing circumstances; as matters stand.

(5) Up-to-the-minute; fashionable, encompassing the latest ideas, fads or fashions (the “now look”, the “now generation” etc).

(6) In law, as “now wife”, the wife at the time a will is written (used to prevent any inheritance from being transferred to a person of a future marriage) (archaic).

(7) In phenomenology, a particular instant in time, as perceived at that instant.

Pre 900: From the Middle English now, nou & nu, from the Old English nū (at the present time, at this moment, immediately), from the Proto-West Germanic nū, from the Proto-Germanic nu, from the primitive Indo-European nu (now) and cognate with the Old Norse nu, the Dutch nu, the German nun, the Old Frisian nu and the Gothic nu.  It was the source also of the Sanskrit and Avestan nu, the Old Persian nuram, the Hittite nuwa, the Greek nu & nun, the Latin nunc, the Old Church Slavonic nyne, the Lithuanian nù and the Old Irish nu-.  The original senses may have been akin to “newly, recently” and it was related to the root of new.  Since Old English it has been often merely emphatic, without any temporal sense (as in the emphatic use of “now then”, though that phrase originally meant “at the present time” and also (by the early thirteenth century) “at once”).  In the early Middle English it often was written as one word.  The familiar use as a noun (the present time) emerged in the late fourteenth century while the adjective meaning “up to date” is listed by etymologists as a “mid 1960s revival” on the basis the word was used as an adjective with the sense of “current” between the late fourteenth and early nineteenth centuries.  The phrase “now and then” (occasionally; at one time and another) was in use by the mid 1400s, “now or never” having been in use since the early thirteenth century.  “Now” is widely used in idiomatic forms and as a conjunction & interjection.  Now is a noun, adjective & adverb; nowism, nowness & nowist are nouns; the noun plural is nows.

Right here, right now: Acid House remix of Greta Thunberg’s (b 2003) How dare you? speech by Theo Rio.

“Now” is one of the more widely used words in English and is understood to mean “at the present time or moment (literally a point in time)”.  However, it’s often used in a way which means something else: were one to say “I’ll do it now”, in the narrow technical sense that really means “I’ll do it in the near future”.  Even things which are treated as happening “now” really aren’t, such as seeing something.  Because light travels at a finite speed, it takes time for it to bounce from something to one’s eye so just about anything one sees is an exercise in looking back to the past.  Even when reading something on a screen or page, one’s brain is processing something from a nanosecond (about one billionth of a second) earlier.  For most purposes, “now” is but a convincing (and convenient) illusion and even though, in a certain, special sense, everything in the universe is happening at the same time (now), it’s not something that can ever be experienced because of the implications of relativity.  None of this causes many problems in life but among certain physicists and philosophers, there is a dispute about “now” and there are essentially three factions: (1) that “now” happened only once in the history of the known universe and cannot again exist until the universe ends, (2) that only “now” can exist and (3) that “now” cannot ever exist.
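The nanosecond claim is easily checked with a back-of-the-envelope sketch (the half-metre reading distance is an assumed, illustrative figure):

```python
# Time for light to reach the eye from a page or screen at an assumed
# reading distance of 0.5 m; the distance is illustrative.

C = 299_792_458        # speed of light in m/s
distance_m = 0.5       # assumed reading distance

delay_s = distance_m / C
print(f"{delay_s * 1e9:.2f} ns")  # ≈ 1.67 ns, i.e. about a nanosecond
```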

Does now exist? (2013), oil & acrylic on canvas by Fiona Rae (b 1963) on MutualArt.

The notion that “now” can have happened only once in the history of our universe (and according to the cosmological theorists variously there may be many universes (some which used to exist, some extant and some yet to be created) or our universe may now be in one of its many phases, each of which will start and end with a unique “now”) is tied up with the nature of time, the mechanism upon which “now” depends not merely for definition but also for existence.  That faction deals with what is essentially an intellectual exercise whereas the other two operate where physics and linguistics intersect.  Within the faction which says “now can never exist” there is a sub-faction which holds that to say “now” cannot exist is a bit of a fudge in that it’s not that “now” never happens but only that it can only ever be described as a particular form of “imaginary time”; an address in space-time in the past or future.  The purists however are absolutists and their proposition is tied up in the nature of infinity, something which renders it impossible ever exactly to define “now” because endlessly the decimal point can move so that “now” can only ever be tended towards and never attained.  If pushed, all they will concede is that “now” can be approximated for purposes of description but that’s not good enough: there is no now.

nower than now!: Lindsay Lohan on the cover of i-D magazine No. 269, September 2006.

The “only now can exist” faction find tiresome the proposition that “the moment we identify something as happening now, already it has passed”, making the point that “now” is the constant state of existence and that a mechanism like time exists only as a thing of administrative convenience.  The faction is most associated with the schools of presentism and phenomenology and argues only the present moment (now) is “real” and that any other fragment of time can only be described, the past existing only in memory and the future only as anticipation or imagination; “now” is the sole verifiable reality.  They are interested especially in what they call “change & becoming”, making the point the very notion of change demands a “now”: events happen and things become in the present; without a “now”, change and causality are unintelligible.  The debate between the factions hinges often on differing interpretations of time: whether fundamentally it is subjective or objective, continuous or discrete, dynamic or static.  Linguistically and practically, “now” remains central to the human experience but whether it corresponds to an independent metaphysical reality remains contested.

Unlike philosophers, cosmologists probably don’t much dwell on the nature of “now” because they have the “Andromeda paradox”, one of the consequences of Albert Einstein’s (1879-1955) theory of special relativity.  What the paradox does is illustrate the way “now” is relative and differs for observers moving at different speeds, the effect increasing as distances increase, such as when the point of reference is the Andromeda galaxy, some 2½ million light years distant from Earth.  Under special relativity, what one observer perceives as “now” on Andromeda, another, moving at a different relative speed, will perceive as occurring in the past or future.  This can happen at any distance but, outside of computer simulations or laboratories, the effect of relative simultaneity is noticeable (even for relatively slow speeds) only at distance.
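The effect can be put in numbers: under the Lorentz transformation, two observers passing each other at relative speed v disagree about which distant events are simultaneous by Δt = vd/c².  A minimal sketch (walking pace and the distance to Andromeda are rounded, illustrative values):

```python
# Andromeda paradox: simultaneity offset dt = v * d / c**2 between two
# observers in relative motion.  Values are rounded and illustrative.

C = 299_792_458                # speed of light, m/s
LIGHT_YEAR = 9.4607e15         # metres in one light year
d = 2.5e6 * LIGHT_YEAR         # ~2.5 million light years to Andromeda
v = 1.4                        # brisk walking pace, m/s

dt = v * d / C**2              # simultaneity offset, seconds
print(f"{dt:,.0f} s ≈ {dt / 86_400:.1f} days")
# ≈ 368,000 s, roughly four days: merely walking toward or away from
# Andromeda shifts which events there count as "now".
```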

Seated vis-a-vis (literally “face to face”), Lindsay Lohan (b 1986, right) and her sister Aliana (b 1993, left), enjoying a tête-à-tête (literally “head to head”), La Conversation bakery & café, West Hollywood, California, April 2012.  Sadly, La Conversation is now closed.

Among the implications of the Andromeda paradox is that although the sisters would have thought their discussion something in the “here and now”, to a cosmologist they are looking at each other as they used to be and hearing what each said some time in the past, every slight movement affecting the extent of this.  Because, in a sense, everything in the universe is happening “at the same time”, the pair could have been sitting light years apart and spoken what they spoke “at the same time” but because of the speed at which light and sound travel, it’s only within a certain distance that a “practical” shared “now” becomes possible.

Thursday, September 19, 2024

Evil

Evil (pronounced ee-vuhl)

(1) Morally wrong or bad; immoral; wicked; morally corrupt.

(2) Harmful; injurious (now rare).

(3) Marked or accompanied by misfortune (now rare; mostly historic).

(4) Having harmful qualities; not good; worthless or deleterious (obsolete).

Pre 900: From the Middle English evel, ivel & uvel (evil), from the Old English yfel (bad, vicious, ill, wicked), from the Proto-Germanic ubilaz.  Related were the Saterland Frisian eeuwel, the Dutch euvel, the Low German övel & the German übel; it was cognate with the Gothic ubils, the Old High German ubil, the German übel, the Middle Dutch evel and the Irish variation abdal (excessive).  The root has long been thought the primitive Indo-European hupélos (a diminutive of hwep- (treat badly)) which produced also the Hittite huwappi (to mistreat, harass) and huwappa (evil, badness) but an alternative view is a descent from upélos (evil, literally “going over or beyond (acceptable limits)”), from the primitive Indo-European upo, up & eup (down, up, over).  Evil is a noun & adjective (some do treat it as a verb), evilness is a noun and evilly an adverb; the noun plural is evils.

Evil (the word) arrived early in English and endured.  In Old English and all the early Teutonic languages except the Scandinavian, it quickly became the most comprehensive adjectival expression of disapproval, dislike or disparagement.  Evil was the word Anglo-Saxons used to convey some sense of the bad, cruel, unskillful, defective, harm, crime, misfortune or disease.  The meaning with which we’re most familiar, “extreme moral wickedness”, existed in Old English but did not assume predominance until the eighteenth century.  The Latin phrase oculus malus was known in Old English as eage yfel and survives in Modern English as “evil eye”.  Evilchild is attested as an English surname from the thirteenth century and Australian-born Air Chief Marshal Sir Douglas Evill (1892-1971) was head of the Royal Air Force (RAF) delegation to Washington during World War II (1939-1945).  Despite its utility, there’s probably no word in English with as many words in the same vein without any being actually synonymous.  Consider: destructive, hateful, vile, malicious, vicious, heinous, ugly, bad, nefarious, villainous, corrupt, malefic, malevolent, hideous, wicked, harm, pain, catastrophe, calamity, ill, sinful, iniquitous, depraved, base, iniquity & unrighteousness; all tend in the direction yet none quite matches the darkness of evil although malefic probably comes close.

Hannah Arendt and the banality of evil

The word evil served English unambiguously and well for centuries and most, secular and spiritual, knew that some people are just evil.  It was in the later twentieth century, with the sudden proliferation of psychologists, interior decorators, sociologists, criminologists, social workers and basket weavers, that an industry developed exploring alternative explanations and causations for what had long been encapsulated in the word evil.  The output was uneven but among the best remembered, certainly for its most evocative phrase, was the work of German-American philosopher and political theorist Hannah Arendt (1906–1975).  Arendt’s concern, given the scale of the Holocaust, was: “Can one do evil without being evil?”

Whether the leading Nazis were unusually (or even uniquely) evil or merely individuals who, through a combination of circumstances, came to do awful things has been a question which has for decades interested psychiatrists, political scientists and historians.  Arendt attended the 1961 trial of Adolf Eichmann (1906-1962), the bureaucrat responsible for transportation of millions of Jews and others to the death camps built to allow the Nazis to commit the industrial-scale mass-murder of the final solution.  Arendt thought Eichmann ordinary and bland, “neither perverted nor sadistic” but instead “terrifyingly normal”, acting only as a diligent civil servant interested in career advancement, his evil deeds done apparently without ever an evil thought in his mind.  Her work was published as Eichmann in Jerusalem: A Report on the Banality of Evil (1963).  The work attracted controversy and perhaps that memorable phrase didn’t help.  It captured the popular imagination and even academic critics seemed seduced.  Arendt’s point, inter alia, was that nothing in Eichmann’s life or character suggested that, absent the Nazis and the notion of normality they constructed, he’d ever have murdered even one person.  The view has its flaws in that there’s much documentation from the era to prove many Nazis, including Eichmann, knew what they were doing was a monstrous crime, so a discussion of whether Eichmann was immoral or amoral and whether one implies evil while the other does not, after Auschwitz, seems a sterile argument.

Evil is where it’s found.

Hannah Arendt's relationship with Martin Heidegger (1889–1976) began when she was a nineteen-year-old student of philosophy and he her professor, married and aged thirty-six.  Influential still in his contributions to phenomenology and existentialism, he will forever be controversial because of his brief flirtation with the Nazis, joining the party and taking an academic appointment under Nazi favor.  He resigned from the post within a year and distanced himself from the party but, despite expressing regrets in private, never publicly repented.  His affair with the Jewish Arendt is perhaps unremarkable because it pre-dated the Third Reich but what has always attracted interest is that their friendship lasted the rest of their lives, documented in their own words in a collection of their correspondence (Letters: 1925-1975, Hannah Arendt & Martin Heidegger (2003), Ursula Ludz (Editor), Andrew Shields (Translator)).  Cited sometimes as proof that feelings can transcend politics (as if ever there was doubt), the half-century of letters tracks the course of a relationship which began as one of lovers and evolved first into friendship and then intellectual congress.  For those who wish to explore contradiction and complexity in human affairs, it's a scintillating read.  Arendt died in 1975, Heidegger surviving her by some six months.

New York Post, November 1999.

In 1999, Rupert Murdoch’s (b 1931) tabloid the New York Post ran one of their on-line polls, providing a list of the usual suspects and asking readers to rank the evil to the most evil, so as to determine “The 25 most evil people of the last millennium” and, predictably, Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) was rated the worst.  The poll received 19,184 responses which revealed some “recency bias” (a cognitive bias that favors recent events over historic ones) in that some US mass-murderers were rated worse than some with more blood on their hands but most commented on was the stellar performance of the two “write-ins”: Bill Clinton (b 1946; US president 1993-2001) & his loyal wife, crooked Hillary Clinton (b 1947; US secretary of state 2009-2013), the POTUS coming second and the FLOTUS an impressive sixth, the Post's readers rating both more evil than Saddam Hussein (1937–2006; president of Iraq 1979-2003), Vlad the Impaler (Vlad Dracula or Prince Vlad III of Wallachia (circa 1430-circa 1477); thrice Voivode of Wallachia 1448-circa 1477) or Ivan the Terrible (Ivan IV Vasilyevich (1530–1584); Grand Prince of Moscow and all Russia 1533-1584 & Tsar of all Russia 1547-1584).  Still, by a small margin (8.67% of the vote against 8.47%), Mr Murdoch's readers rated Hitler more evil than Bill Clinton so there was that.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

While fun and presumably an indication of something, on-line polls should not be compared with the opinion polls run by reputable universities or polling organizations, their attraction for editors looking for click-bait being they’re essentially free and provide a result, sometimes within a day, unlike conventional polls which can cost thousands or even millions depending on the sample size and duration of research.  The central problem with on-line polls is that responders are self-selected rather than coming from a cohort determined by a statistical method, the modern techniques developed in the wake of the disastrously inaccurate results of a poll “predicting” national voting intentions in the 1936 presidential election.  The 1936 catchment had been skewed towards the upper-income quartile by being restricted to those who answered domestic telephone connections, the devices then rarely installed in lower-income households.  A similar phenomenon of bias is evident in the differing on-line responses to the familiar question: “Who won the presidential debate?”, the divergent results revealing more about the demographic profiles of the audiences of CBS, MSNBC, CNN, ABC & FoxNews than the on-stage dynamics.

Especially among academics in the social sciences, there are many who object to the frequent, almost casual, use of “evil”, applied to figures as diverse as serial killers and those who use the “wrong” pronoun.  Rightly or not, academics can find “complexity” in what appears simple to most and don’t like “evil” because of the simple moral absolutism it implies, the suggestion certain actions or individuals are inherently or objectively wrong.  Academics call this “an over-simplification of complex ethical situations” and they prefer the nuances of moral relativism, which holds that moral judgments can depend on cultural, situational, or personal contexts.  The structuralist-behaviorists (a field still more inhabited than a first glance may suggest) avoid the word because it so lends itself to being a “label” and the argument is that labeling individuals as “evil” can be both an act of dehumanizing and something which reinforces a behavioral predilection, thereby justifying punitive treatment rather than attempting rehabilitation.  Politically, it’s argued, the “evil” label permits authorities to ignore or even deny allegedly causative factors of behavior such as poverty, mental illness, discrimination or prior trauma.  Despite the intellectual scepticism, the word “evil” does seem to exert a pull and its implications are such there's really no substitute if one is trying to say certain things.  In À la recherche du temps perdu (In Search of Lost Time (1913–1927)), Marcel Proust (1871-1922) left the oft-quoted passage: “Perhaps she would not have considered evil to be so rare, so extraordinary, so estranging a state, to which it was so restful to emigrate, had she been able to discern in herself, as in everyone, that indifference to the sufferings one causes, an indifference which, whatever other names one may give it, is the terrible and permanent form of cruelty.”

There are also the associative traditions of the word, the linkages to religion and the supernatural an important part of the West’s cultural and literary inheritance but not one universally treated as “intellectually respectable”.  Nihilists of course usually ignore the notion of evil and to the post-modernists it was just another of those “lazy” words which ascribed values of right & wrong which they knew were something wholly subjective, evil as context-dependent as anything else.  Interestingly, in the language of the polarized world of US politics, while the notional “right” (conservatives, MAGA, some of what’s left of the Republican Party) tends to label the notional “left” (liberals, progressives, the radical factions of the Democratic Party) as evil, the left seems to depict their enemies (they’re no longer “opponents”) less as “evil” and more as “stupid”.

The POTUS & the pontiff: Francis & Donald Trump (aka the lesser of two evils), the Vatican, May 2017.

Between the pontificates of Pius XI (1857–1939; pope 1922-1939) and Francis (b 1936; pope since 2013), all that seems to have changed in the Holy See’s world view is that civilization has moved from being threatened by communism, homosexuality and Freemasonry to being menaced by Islam, homosexuality and Freemasonry.  It therefore piqued the interest of journalists accompanying the pope on his recent 12-day journey across Southeast Asia when they were told by a Vatican press secretary his Holiness would, during the scheduled press conference, discuss the upcoming US presidential election: duly, the scribes assembled in their places on the papal plane.  The pope didn’t explicitly tell people for whom they should vote nor even make his preference obvious as Taylor Swift (b 1989) would in her endorsement mobilizing the childless cat lady vote but he did speak in an oracular way, critiquing both Kamala Harris (b 1964; US vice president since 2021) and Donald Trump (b 1946; US president 2017-2021) as “against life”, urging Catholic voters to choose the “lesser of two evils.”  That would have been a good prelude had he gone further but there he stopped: “One must choose the lesser of two evils.  Who is the lesser of two evils?  That lady or that gentleman?  I don’t know.”

Socks (1989-2009; FCOTUS (First Cat of the United States 1993-2001)) was Chelsea Clinton's (b 1980; FDOTUS (First Daughter of the United States 1993-2001)) cat.  Cartoon by Pat Oliphant, 1996.

The lesser of two evils: Australian-born US political cartoonist Pat Oliphant’s (b 1935) take on the campaign tactics of Bill Clinton (b 1946; US president 1993-2001) who was the Democratic Party nominee in the 1996 US presidential election against Republican Bob Dole (1923–2021).  President Clinton won by a wide margin which would have been more handsome still had there not been a third-party candidate.  Oliphant’s cartoons are now held in the collection of the Library of Congress.  It’s not unusual for the task presented to voters in US presidential elections to be reduced to finding “the lesser of two evils”.  In 1964 when the Democrats nominated Lyndon Johnson (LBJ, 1908–1973; US president 1963-1969) to run against the Republicans' Barry Goldwater (1909–1998), the conclusion of many was it was either “a crook or a kook”.  On the day, the lesser of the two evils proved to be crooked old Lyndon who won in a landslide over crazy old Barry.

Francis has some history of criticizing Mr Trump’s handling of immigration but the tone of his language has tended to suggest he’s more disturbed by politicians who support the provision of abortion services although he did make clear he sees both issues in stark moral terms: “To send migrants away, to leave them wherever you want, to leave them… it’s something terrible, there is evil there.  To send away a child from the womb of the mother is an assassination, because there is life.  We must speak about these things clearly.”  Francis has in the past labelled abortion a “plague” and a “crime” akin to “mafia” behavior, although he did resist suggestions the US bishops should deny Holy Communion to “pro-choice” politicians (which would have included Joe Biden (b 1942; US president 2021-2025)), conscious no doubt that accusations of being an “agent of foreign interference” in the US electoral process would be of no benefit.  Despite that, he didn’t seek to prevent the bishops calling abortion “our preeminent priority” in Forming Consciences for Faithful Citizenship, the 2024 edition of their quadrennial document on voting.  Some 20% of the US electorate describe themselves as Catholics, their vote in 2020 splitting 52/47% Biden/Trump but that was during the Roe v Wade (1973) era and abortion wasn’t quite the issue it's since become; surveys suggest a majority of the faith believe it should be available with only around 10% absolutist right-to-lifers.  Analysts concluded Francis regards Mr Trump as less evil than Ms Harris and will be pleased if his flock votes accordingly; while he refrained from being explicit, he did conclude: “Not voting is ugly.  It is not good.  You must vote.”

Wednesday, June 12, 2024

Reduction

Reduction (pronounced ri-duhk-shuhn)

(1) The act of reducing or the state of being reduced.

(2) The amount by which something is reduced or diminished.

(3) The form (result) produced by reducing; a copy on a smaller scale.

(4) In cell biology, as meiosis, especially the first meiotic cell division in which the chromosome number is reduced by half.

(5) In chemistry, the process or result of reducing (a reaction in which electrons are gained and valence is reduced; often by the removal of oxygen or the addition of hydrogen).

(6) In film production when using physical film stock (celluloid and such), the process of making a print of a narrower gauge from a print of a wider gauge (historically from 35 to 16 mm).

(7) In music, a simplified form, typically an arrangement for a smaller number of parts, such as an orchestral score arranged for a solo instrument.

(8) In computability theory, a transformation of one problem into another problem, such as mapping reduction or polynomial reduction.

(9) In philosophy (notably in phenomenology), a process intended to reveal the objects of consciousness as pure phenomena.

(10) In metalworking, the ratio of a material's change in thickness compared to its thickness prior to forging and/or rolling.

(11) In engineering, (usually as “reduction gear”), a means of energy transmission in which the original speed is reduced to whatever is suitable for the intended application.

(12) In surgery, a procedure to restore a fracture or dislocation to the correct alignment, usually with a closed approach but sometimes with an open approach.

(13) In mathematics, the process of converting a fraction into its decimal form or the rewriting of an expression into a simpler form.

(14) In cooking, the process of rapidly boiling a sauce to concentrate it.

(15) During the colonial period, a village or settlement of Indians in South America established and governed by Spanish Jesuit missionaries.

1475–1485: From the Middle English reduccion, from the Middle French reduction, from the Latin reductiōnem & reductiōn- (stem of reductiō (a “bringing back”)), the construct being reduct(us) (past participle of redūcere (to lead back)) + -iōn- (the noun suffix).  The construct in English was thus reduc(e) + -ion.  Reduce was from the Middle English reducen, from the Old French reduire, from the Latin redūcō (reduce), the construct being re- (back) + dūcō (lead).  The –ion suffix was from the Middle English -ioun, from the Old French -ion, from the Latin -iō (genitive -iōnis).  It was appended to a perfect passive participle to form a noun of action or process, or the result of an action or process.  Reduction, reductivism, reductionistic & reductionism are nouns, reductionist is a noun & adjective, reductional & reductive are adjectives; the noun plural is reductions.  Forms like anti-reduction, non-reduction, over-reduction, pre-reduction, post-reduction, pro-reduction & self-reduction have been created as required.

Actor Ariel Winter (b 1998), before (left) and after (right) mammaplasty (breast reduction).  Never satisfactorily has it been explained why this procedure seems to be lawful in all jurisdictions.

In philosophy & science, reductionism is an approach used to explain complex phenomena by reducing them to their simpler, more fundamental components.  It posits that understanding the parts of a system and their interactions can provide a complete explanation of the system as a whole, an approach which is functional and valuable in some cases and to varying degrees inadequate in others.  The three generally recognized classes of reductionism are (1) Ontological Reductionism, the idea that reality is composed of a small number of basic entities or substances, best illustrated in biology where life processes are explained by reducing things to the molecular level, (2) Methodological Reductionism, an approach which advocates studying systems by breaking them into their constituent parts, much used in psychology where it might involve studying human behavior by examining neurological processes, and (3) Theory Reductionism, which involves explaining a theory or phenomenon in one field by the principles of another, more fundamental field, as when chemistry is reduced to physics or chemical properties are explained by the operation of quantum mechanics.  Reductionism has been an invaluable component in many of the advances achieved in science in the last two-hundred-odd years and some of the processes and mechanics of reductionism have actually been made possible by some of those advances.  The criticism of an over-reliance on reductionism in certain fields is that its very utility can lead to the importance of higher-level structures and interactions being overlooked; there is much which can’t fully be explained by the individual parts or even their interaction.  The diametric opposite of reductionism is holism which emphasizes the importance of whole systems and their properties that emerge from the interactions between parts.  In philosophy, reductionism is the position which holds a system of any level of complexity is nothing but the sum of its parts and an account of it can thus be reduced to accounts of individual constituents.  It’s very much a theoretical model to be used as appropriate rather than an absolutist doctrine but it does hold that phenomena can be explained completely in terms of relations between other more fundamental phenomena: epiphenomena.  A reductionist is either (1) an advocate of reductionism or (2) one who practices reductionism.

Reductionism: Lindsay Lohan during her “thin phase”.

The adjective reductive has a special meaning in Scots law pertaining to reduction of a decree or other legal device (ie something rescissory in its effect); dating from the sixteenth century, it’s now rarely invoked.  In the sense of “causing the physical reduction or diminution of something”, it’s been in use since the seventeenth century in fields including chemistry, metallurgy, biology & economics, always to convey the idea of reducing a substance, object or some abstract quantum to a lesser, simplified or less elaborated form.  At that time, it came to be used also to mean “that which can be derived from, or referred back to, something else” and although archaic by the early 1800s, its existence in historic texts can be misleading.  It wasn’t until after World War II (1939-1945) that reductive emerged as a derogatory term, used to suggest an argument, issue or explanation has been “reduced” to a level of such simplicity that so much has been lost as to rob things of meaning.  The phrase “reductio ad absurdum” (reduction to the absurd) is an un-adapted borrowing from the Latin reductiō ad absurdum and began in mathematics & logic, where it was a useful tool in deriving proofs.  In wider use, it has come to be used of a method of disproving a statement by assuming the statement is true and, with that assumption, arriving at a blatant contradiction; the synonyms are apagoge & “proof by contradiction”.
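The textbook instance (a standard classroom example, not drawn from any author cited here) is the proof that √2 is irrational: assume the opposite and a contradiction follows.

```latex
\begin{aligned}
&\text{Assume } \sqrt{2} = p/q \text{ with } p, q \text{ integers sharing no common factor.} \\
&\Rightarrow\ p^2 = 2q^2 \ \Rightarrow\ p \text{ is even, say } p = 2k \\
&\Rightarrow\ 4k^2 = 2q^2 \ \Rightarrow\ q^2 = 2k^2 \ \Rightarrow\ q \text{ is even too} \\
&\Rightarrow\ p \text{ and } q \text{ share the factor } 2, \text{ a contradiction; hence } \sqrt{2} \text{ is irrational.}
\end{aligned}
```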

Single-family houses (D-Zug) built in 1922 on the principle of architectural reductionism by Heinrich Tessenow in collaboration with Austrian architect Franz Schuster (1892–1972), Moritzburger Weg 19-39 (the former Pillnitzer Weg), Gartenstadt Hellerau, Dresden, Germany.

As a noun, a reductivist is one who advocates or adheres to the principles of reductionism or reductivism.  In art & architecture (and some aspects of engineering) this can be synonymous with the label “a minimalist” (one who practices minimalism).  As an adjective, reductivist (the comparative “more reductivist”, the superlative “most reductivist”) means (1) tending to reduce to a minimum or to simplify in an extreme way and (2) belonging to the reductivism movement in art or music.  The notion of “extreme simplification” (a reduction to a minimum; the use of the fewest essentials) has always appealed to some and appalled others attracted to intricacy and complexity.  The German architect Professor Heinrich Tessenow (1876-1950) summed it up in the phrase for which he’s remembered more than his buildings: “The simplest form is not always the best, but the best is always simple”, one of those epigrams which may not reveal a universal truth but is probably a useful thing to remind students of this and that lest they be seduced by the process and lose sight of the goal.  Tessenow was expanding on the principle of Occam's Razor (the reductionist philosophic position attributed to English Franciscan friar & theologian William of Ockham (circa 1288–1347)), written usually as Entia non sunt multiplicanda praeter necessitatem (literally “entities must not be multiplied beyond necessity”), which translates best as “the simplest solution is usually the best”.

Reductio in extrema

1960 Lotus Elite Series 1 (left) and at the Le Mans 24 Hour endurance classic, June 1959 (right), Lotus Elite #41 leading Ferrari 250TR #14.  The Ferrari retired (DNF) after overheating, the Elite finishing eighth overall, winning the 1.5 litre GT class.

Weighing a mere 500-odd kg (1100 lb), the early versions of the exquisite Lotus Elite (1957-1963) enchanted most who drove it but the extent of the reductionism compromised the structural integrity and things sometimes broke when used under everyday conditions, which of course include potholed roads.  Introduced late in 1961, the Series 2 Elite greatly improved this but some residual fragility was inherent to the design.  On the smooth surfaces of racing circuits however, it enjoyed an illustrious career, notable especially for success in long-distance events at the Nürburgring and Le Mans.  The combination of light weight and advanced aerodynamics meant the surprisingly powerful engine (a lightweight and robust unit which began life powering the water pumps of fire engines!) delivered outstanding performance, frugal fuel consumption and low tyre wear.  As well as claiming five class trophies in the Le Mans 24 hour race, the Elite twice won the mysterious Indice de performance (an index of thermal efficiency), a curious piece of mathematics actually intended to ensure, regardless of other results, a French car would always win something.

Colin Chapman (1928–1982), who in 1952 founded Lotus Cars, applied reductionism even to the Tessenow mantra in his design philosophy: “Simplify, then add lightness.”  Whether at the drawing board, on the factory floor or on the racetrack, Chapman seldom deviated from his rule and while it lent his cars sparkling performance and delightful characteristics, more than one of the early models displayed an infamous fragility.  Chapman’s death from a heart attack was, cynics observed, a good career move, given the likely legal consequences of his involvement with John DeLorean (1925–2005) and the curious financial arrangements made with OPM (other people's money) during the strange episode which was the tale of the DMC DeLorean gullwing coupé.

1929 Mercedes-Benz SSKL blueprint (recreation, left) and the SSKL “streamliner”, AVUS, Berlin, May 1932 (right).

The Mercedes-Benz SSKL was one of the last of the road cars which could win top-line grand prix races.  An evolution of the earlier S, SS and SSK, the SSKL (Super Sports Kurz (short) Leicht (light)) was notable for the extensive drilling of its chassis frame, to the point where it was compared to Swiss cheese, the weight reduced with no loss of strength.  The SSK had enjoyed success in competition but even in its heyday it was in some ways antiquated and, although powerful, very heavy, thus the expedient of the chassis-drilling, intended to keep it competitive for another season.  Lighter (which didn't solve but at least to a degree ameliorated the high tyre wear) and easier to handle than the SSK (although the higher speed brought its own problems, notably in braking), the SSKL enjoyed a long Indian summer and even on tighter circuits, where its bulk meant it could be out-manoeuvred, sometimes it still prevailed by virtue of sheer power.  By 1932 however the engine’s potential had been reached and no more metal could be removed from the structure without dangerously compromising its integrity; in engineering (and other fields), there is a point at which further reduction becomes at best counter-productive and often dangerous.  The solution was an early exercise in aerodynamics (“streamlining” then the fashionable term), an aluminium skin prepared for the 1932 race held on Berlin’s AVUS (Automobil-Verkehrs- und Übungsstraße (automobile traffic and practice road)).  The reduction in air-resistance permitted the thing to touch 255 km/h (158 mph), some 20 km/h (12 mph) more than a standard SSKL, an increase the engineers calculated would otherwise have demanded another (unobtainable) 120 horsepower.  The extra speed was most useful at the unique AVUS which comprised two straights (each almost six miles (ten kilometres) in length) linked by two hairpin curves, one a dramatic banked turn.  The SSKL was the last of the breed, the factory’s subsequent Grand Prix machines all specialized racing cars.
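The engineers’ arithmetic can be sketched with a back-of-envelope calculation (assumed baseline figures; the factory’s own data isn’t quoted here): at racing speeds aerodynamic drag dominates, and the power needed to overcome it grows roughly with the cube of speed.

```python
# A back-of-envelope sketch (not the Mercedes engineers' own calculation):
# power to overcome aerodynamic drag scales roughly with the cube of speed.
# The baseline figures below are assumptions for illustration only.

def power_for_speed(p0_hp: float, v0_kmh: float, v1_kmh: float) -> float:
    """Power needed at v1, given p0 was just sufficient for v0 (cube law)."""
    return p0_hp * (v1_kmh / v0_kmh) ** 3

BASELINE_HP = 300.0    # assumed output of the unstreamlined SSKL
V_STANDARD = 235.0     # assumed top speed of the standard car, km/h
V_STREAMLINED = 255.0  # speed reached by the streamlined car, km/h

needed = power_for_speed(BASELINE_HP, V_STANDARD, V_STREAMLINED)
print(f"{V_STREAMLINED:.0f} km/h needs about {needed:.0f} hp "
      f"({needed - BASELINE_HP:.0f} hp more than the baseline)")
```

With these assumed numbers the simple cube law suggests 80-odd extra horsepower; the factory’s figure of 120 presumably rests on its own drag and driveline data but either way the message is the same: that much extra power was unobtainable in 1932 and reducing drag was the cheaper path.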

Reduction gears: Known casually as “speed reducers”, reduction gears are widely used in just about every type of motor and many other mechanical devices.  What they do is allow the energy of a rotating shaft to be transferred to another shaft running at a reduced speed, achieved usually by the use of gears (cogs) of different diameters; in an ideal pair, as the speed falls the torque rises by the same ratio.
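A minimal sketch of the arithmetic (the tooth counts are hypothetical): for an ideal, lossless meshed pair, output speed falls and output torque rises by the same ratio, so the transmitted power is unchanged.

```python
# Ideal (lossless) reduction gear: speed divided and torque multiplied
# by the same ratio of tooth counts, so power is conserved.

def reduce_drive(input_rpm: float, input_torque_nm: float,
                 driving_teeth: int, driven_teeth: int) -> tuple[float, float]:
    """Return (output_rpm, output_torque_nm) for an ideal gear pair."""
    ratio = driven_teeth / driving_teeth  # e.g. 60/20 gives a 3:1 reduction
    return input_rpm / ratio, input_torque_nm * ratio

rpm_out, torque_out = reduce_drive(3000, 10.0, driving_teeth=20, driven_teeth=60)
print(f"3:1 reduction: {rpm_out:.0f} rpm, {torque_out:.0f} Nm")  # 1000 rpm, 30 Nm
```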

In chemistry, a reduction is the process or result of reducing (a reaction in which electrons are gained and valence is reduced, often by the removal of oxygen or the addition of hydrogen); as an example, if an iron atom (valence +3) gains an electron, the valence decreases to +2.  Linguistically, it’s obviously counterintuitive to imagine a “reduced” atom is one which gains rather than loses electrons but the term in this context dates from the early days of modern chemistry, when reduction (and its counterpart, “oxidation”) were created to describe reactions in which one substance lost an oxygen atom and the other substance gained it.  In a reaction such as that between two molecules of hydrogen (2H2) and one of oxygen (O2) combining to produce two molecules of water (2H2O), the hydrogen atoms have gained oxygen and were said to have become “oxidized”, while the oxygen molecules, having surrendered their atoms to the hydrogens, were said to have been “reduced”.  Chemically however, in the process of gaining an oxygen atom, the hydrogen atoms have had to give up their electrons and share them with the oxygen atoms, while the oxygen atoms have gained electrons, thus the seeming paradox that the “reduced” oxygen has in fact gained something, namely electrons.
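Written as half-reactions (the standard textbook decomposition, added here for illustration), the electron bookkeeping becomes explicit:

```latex
% Half-reactions for 2H2 + O2 -> 2H2O: oxidation is the loss of
% electrons, reduction the gain, and the electrons must balance.
\begin{align*}
  \text{oxidation:} \quad & 2\,\mathrm{H_2} \longrightarrow 4\,\mathrm{H^{+}} + 4e^{-}\\
  \text{reduction:} \quad & \mathrm{O_2} + 4e^{-} \longrightarrow 2\,\mathrm{O^{2-}}\\
  \text{overall:}   \quad & 2\,\mathrm{H_2} + \mathrm{O_2} \longrightarrow 2\,\mathrm{H_2O}
\end{align*}
```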

Secretary of Defense, the younger (left) and the elder (right): Donald Rumsfeld with Gerald Ford (1913–2006; US president 1974-1977) and George W Bush (George XLIII, b 1946; US president 2001-2009).

Donald Rumsfeld (1932–2021; US Secretary of Defense 1975-1977 & 2001-2006) may or may not have been evil but his mind could sparkle and his marvellously reductionist principles can be helpful.  His classification of knowledge was often derided but it remains a useful framework (sketched as a grid in the code below):

(1) Known unknowns.
(2) Known knowns.
(3) Unknown unknowns.
(4) (most intriguingly) Unknown knowns.
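Viewed as a data structure, the four categories are just the cells of a two-by-two grid: whether a piece of knowledge is possessed, crossed with whether we are aware of our position.  A toy rendering (my framing of the matrix, not Rumsfeld's own):

```python
# Rumsfeld's taxonomy as a 2x2 grid: (possessed, aware) -> label.

LABELS = {
    (True, True):   "known known",      # we know it, and know that we know
    (True, False):  "unknown known",    # we know it, but don't know we know
    (False, True):  "known unknown",    # we know there is a gap
    (False, False): "unknown unknown",  # we don't even suspect the gap
}

def classify(possessed: bool, aware: bool) -> str:
    """Map (possession, awareness) to Rumsfeld's label."""
    return LABELS[(possessed, aware)]

# The German navy of 1939 and the modern submarine (discussed below):
print(classify(possessed=True, aware=False))  # -> unknown known
```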

An expert reductionist, he reminded us also there are only three possible answers to any question and, while there's a cultural reluctance to say “don’t know”, sometimes it is best:

(1) I know and I’m going to tell you.
(2) I know and I’m not going to tell you.
(3) Don’t know.

While (1) known unknowns, (2) known knowns and (3) unknown unknowns are self-explanatory, examples of (4) unknown knowns are of interest and a classic one was the first “modern” submarine, developed by the Germans during the last months of World War II (1939-1945).

German Type XXI Elektroboot (1945).

The course of World War II could have been very different had OKM (Oberkommando der Marine, the Kriegsmarine's (German Navy) high command) followed the advice of the commander of the submarine arm and made available a fleet of 300 boats, rather than building a surface fleet which wasn’t large enough to be a strategic threat but was of sufficient size to absorb resources which, if devoted to submarines, could have been militarily effective.  With a fleet of 300, it would have been possible permanently to maintain around 100 at sea but at the outbreak of hostilities, only 57 active boats were on the navy’s list, not all of which were suitable for operations on the high seas, so in the early days of the conflict it was rare for the Germans to have more than a dozen committed to battle in the Atlantic.  Production never reached the levels necessary for the numbers to achieve critical mass but even so, in the first two or three years of the war the losses sustained by the British were considerable and the “U-boat menace” was such a threat that much attention was devoted to counter-measures; by 1943 the Allies could consider the Battle of the Atlantic won.  The Germans’ other mistake was not building a true submarine, capable of operating underwater (and therefore undetected) for days at a time.

It was only in 1945, when Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945) and Grand Admiral Karl Dönitz (1891–1980; head of the German Navy 1943-1945, German head of state 1945) were assessing the “revolutionary” new design, that they concluded there was no reason why such craft couldn’t have been built in the late 1930s because the engineering capacity and technology existed even then (although the industrial and labor resources did not).  It was a classic case of what Rumsfeld would later call an “unknown known”: the Germans in 1939 knew how to build a modern submarine but didn’t “know that they knew”.  Military analysts have concluded that by 1945, such was the strength of the forces arrayed against Nazi Germany, not even the new boats deployed in numbers could have been enough to turn the tide of war.  Had the German navy in 1939-1940 had available even 100 such submarines however (a third of the 300 OKM calculated was the critical strategic size, given that at any point only a third would be at sea, the others either in transit or docked), the battle in the Atlantic would have been much more difficult for the British.

Mr Rumsfeld however neglected to mention another class of knowledge: the “lost known”, examples of which have from time to time appeared and there may be more still to be discovered.  The best known were associated with the knowledge lost after the fall of the Western Roman Empire in the fifth century, when Europe entered the early medieval period, once popularly known as the “Dark Ages”.  The lost knowns included aspects of optics such as lens grinding and the orthodoxy long was that the knowledge was not “re-discovered” or “re-invented” until perfected in Italy during the late thirteenth century, although it’s now understood that in the Islamic world lenses continued to be ground during the late Medieval period and it’s suspected it was from Arabic texts the information reached Europe.

What really was a lost known was how the Romans of Antiquity made their long-lasting opus caementicium (concrete) so famously “sticky” and resistant to the effects of salt water.  Unlike modern concrete, made using Portland cement & fresh water, Roman concrete was a mix of volcanic ash & lime, mixed with seawater, the latter ingredient inducing a chemical reaction which created a substance stronger and more durable.  When combined with volcanic rocks, it formed a crystalline mineral structure called aluminous tobermorite (Al-tobermorite) which lent the mix great strength and resistance to cracking.  After the fall of Rome, the knowledge was lost and even when a tablet was discovered listing the mix ratios, the caementicium couldn’t be replicated because the recipe spoke only of “water” and not “sea water”, presumably because that was common knowledge.  It was only modern techniques of analysis which allowed the “lost known” again to become a “known known”.