
Saturday, January 10, 2026

Aphantasia

Aphantasia (pronounced ay-fan-tay-zhuh)

The inability voluntarily to recall or form mental images.

2015: The word (not the diagnosis) was coined by UK neurologist Dr Adam Zeman (b 1957), neuropsychologist Dr Michaela Dewar (b 1976) and Italian neurologist Sergio Della Sala (b 1955), first appearing in the paper Lives without imagery.  The construct was a- (from the Ancient Greek ἀ- (a-), used as a privative prefix meaning “not”, “without” or “lacking”) + phantasía (from the Greek φαντασία (“appearance”, “imagination”, “mental image” or “power of imagination”), from φαίνω (phaínō) (“to show”, “to make visible” or “to bring to light”)).  Literally, aphantasia can be analysed as meaning “an absence of imagination” or “an absence of mental imagery” and in modern medicine it’s defined as “the inability voluntarily to recall or form mental images”.  Even in Antiquity, there was some meaning shift in phantasía, Plato (circa 427-348 BC) using the word to refer generally to representations and appearances whereas Aristotle (384-322 BC) added a technical layer, his sense being a faculty mediating between perception (aisthēsis) and thought (noēsis).  It’s the Aristotelian adaptation (the mind’s capacity to form internal representations) which flavoured the use in modern neurology.  Aphantasia is a noun and aphantasic is a noun & adjective; the noun plural is aphantasics.

Scuola di Atene (The School of Athens, circa 1511), fresco by Raphael (Raffaello Sanzio da Urbino, 1483–1520), Apostolic Palace, Vatican Museums, Vatican City, Rome.  Plato and Aristotle are the figures featured in the centre.

In popular use, the word “aphantasia” can be misunderstood because of the paths taken in English by “phantasy”, “fantasy” and “phantasm”, all derived from the Ancient Greek φαντασία (phantasía) meaning “appearance, mental image, imagination”.  In English, this root was picked up via Latin and French but the multiple forms each evolved along distinct semantic trajectories.  The fourteenth-century phantasm came to mean “apparition, ghost, illusion” so was used of “something deceptive or unreal”, the connotation being “the supernatural; spectral”.  This appears to be the origin of the association of “phantas-” with unreality or hallucination rather than normal cognition.  In the fifteenth & sixteenth centuries, the spellings phantasy & fantasy were for a time interchangeable although divergence came with phantasy used in its technical senses of “mental imagery”; “faculty of imagination”; “internal representation”, this a nod to Aristotle’s phantasía.  Fantasy is the familiar modern form, used to suggest “a fictional invention; daydream; escapism; wish-fulfilment”, the connotation being “imaginative constructions (in fiction); imaginative excess (in the sense of “unreality” or the “dissociative”); indulgence (as in “speculative or wishful thoughts”)”.

While the word “aphantasia” didn’t exist until 2015, in the editions of the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM) published between 1952 (DSM-I) and 2013 (DSM-5), there was no discussion (or even mention) of a condition anything like “an inability voluntarily to recall or form mental images”.  That’s because despite being “a mental condition” induced by something happening (or not happening) in the brain, the phenomenon has never been classified as “a mental disorder”.  Instead it’s a cognitive trait or variation in the human condition and technically is a spectrum condition, the “pure” aphantasic being one end of the spectrum, the hyperphantasic (highly vivid, lifelike mental imagery, sometimes called a “photorealistic mind's eye”) the other.  That would of course imply the comparative adjective would be “more aphantasic” and the superlative “most aphantasic” but neither is a standard form.

The rationale for the “omission” lies in the DSM’s inclusion criteria, which require some evidence of clinically significant distress or functional impairment attributable to a condition.  Aphantasia, in isolation, does not reliably meet this threshold in that many individuals have for decades functioned entirely “normally” without being aware they’re aphantasic, while others presumably died of old age in similar ignorance.  That does of course raise the intriguing prospect the mental health of some patients may have been adversely affected by the syndrome only upon a clinician informing them of their status, thus making them realize what they were missing.  This, the latest edition of the DSM (DSM-5-TR (2022)) does not discuss.  The DSM does discuss imagery and perceptual phenomena in the context of other conditions (PTSD (post-traumatic stress disorder), psychotic disorders, dissociative disorders etc), but these references are to abnormal experiences, not the lifelong absence of imagery.  To the DSM’s editors, aphantasia remains a recognized phenomenon, not a diagnosis.

Given that aphantasia concerns aspects of (1) cognition, (2) inner experience and (3) mental representation, it wouldn’t seem unreasonable to expect the condition now described as aphantasia would have appeared in the DSM, even if only in passing or in a footnote.  However, in the seventy years between 1952 and 2022, across the successive editions, there is no mention, even in DSM-5-TR (2022), the first volume released since the word was coined in 2015.  That apparently curious omission is explained by the DSM never having been a general taxonomy of mental phenomena.  Instead, it’s (an ever-shifting) codification of the classification of mental disorders, defined by (1) clinically significant distress and/or (2) functional impairment and/or (3) a predictable course, prognosis and treatment relevance.  As a general principle the mere existence of an aphantasic state meets none of these criteria.

Crooked Hillary Clinton in orange Nina McLemore pantsuit, 2010.  If in response to the prompt "Imagine a truthful person" one sees an image of crooked Hillary Clinton (b 1947; US secretary of state 2009-2013), that is obviously wrong but it is not an instance of aphantasia because the image imagined need not be correct, it needs just to exist.

The early editions (DSM-I (1952) & DSM-II (1968)) were heavily slanted to the psychoanalytic, focusing on psychoses, neuroses and personality disorders with no systematic treatment of cognition as a modular function; the matter of mental imagery (even as abstract thought separated from an imagined image), let alone its absence, is wholly ignored.  Intriguingly, given what was to come in the field, there was no discussion of cognitive phenomenology beyond gross disturbances (ie delusions & hallucinations).  Even with the publication of the DSM-III (1980) & DSM-III-R (1987), advances in scanning and surgical techniques, cognitive psychology and neuroscience seem to have made little contribution to what the DSM’s editorial board decided to include and although DSM-III introduced operationalized diagnostic criteria (as a part of a more “medicalised” and descriptive psychiatry), the entries still were dominated by a focus on dysfunctions impairing performance, the argument presumably that it was possible (indeed, probably typical) for those with the condition to lead full, happy lives; the absence of imagery ability was thus not considered a diagnostically relevant variable.  Even in sections on (1) amnestic disorders (a class of memory loss in which patients have difficulty forming new memories (anterograde) or recalling past ones (retrograde), not caused by dementia or delirium but a consequence of brain injury, stroke, substance abuse, infections or trauma, with treatment focusing on the underlying cause and rehabilitation), (2) organic mental syndromes or (3) neuro-cognitive disturbance, there was no reference to voluntary imagery loss as a phenomenon in its own right.

Although substantial advances in cognitive neuroscience meant that by the 1990s neuropsychological deficits were better recognised, both the DSM-IV (1994) and DSM-IV-TR (2000) continued to be restricted to syndromes with behavioural or functional consequences.  In a way that was understandable because the DSM still was seen by the editors as a manual for working clinicians who were most concerned with helping those afflicted by conditions with clinical salience; the DSM has never wandered far into subjects which might be matters of interesting academic research and mental imagery continued to be mentioned only indirectly, hallucinations (percepts without stimuli) and memory deficits (encoding and retrieval) both discussed only in terms of their effect on a patient, not as phenomena in their own right.  The first edition for the new century was DSM-5 (2013) and what was discernible was that discussions of major and mild neuro-cognitive disorders were included, reflecting the publication’s enhanced alignment with neurology but even then, imagery ability was not assessed or scaled: not possessing the power of imagery was not listed as a symptom, specifier, or associated feature.  So there has never in the DSM been a category for benign cognitive variation and that is a product of a deliberate editorial stance rather than an omission, many known phenomena not being psychiatrised unless in some way “troublesome”.

The term “aphantasia” was coined to describe individuals who lack voluntary visual mental imagery, often discovered incidentally and not necessarily associated with brain injury or psychological distress.  In 2015 the word was novel but the condition had been documented for more than a century, Sir Francis Galton (1822–1911) in a paper published in 1880 describing what would come to be called aphantasia.  That work was a statistical study on mental imagery which doubtless was academically solid but Sir Francis’s reputation later suffered because he was one of the leading lights in what was in Victorian times (1837-1901) the respectable discipline of eugenics.  Eugenics rightly became discredited so Sir Francis was to some extent retrospectively “cancelled” (something like the Stalinist concept of “un-personing”) and these days his seminal contribution to the study of behavioural genetics is acknowledged only grudgingly.

Galton in 1880 noted a wide variation in “visual imagination” (ie it was understood as a “spectrum condition”) and in the same era, in psychology publications the preferred term seems to have been “imageless thought”.  In neurology (and trauma medicine generally) there were many reports of patients losing the power of imagery after a brain injury but no agreed name was ever applied because the interest was more in the injury.  Unawareness that some people simply lack the faculty must have been widespread among the general population because, as Galton wrote: “To my astonishment, I found that the great majority of the men of science to whom I first applied, protested that mental imagery was unknown to them, and they looked on me as fanciful and fantastic in supposing that the words “mental imagery” really expressed what I believed everybody supposed them to mean. They had no more notion of its true nature than a colour-blind man who has not discerned his defect has of the nature of colour.”

His paper must have stimulated interest because one psychologist reported some subjects possessing what he called a “typographic visual type” imagination in which ideas (which most would visualize as an image of some sort) would manifest as “printed text”.  That was intriguing because, in the same way a computer in some respects doesn’t distinguish between an image file (jpeg, TIFF, webp, avif etc) which is a picture of (1) someone and one which is a picture of (2) their name in printed form, it would seem to imply at least some who are somewhere on the aphantasia spectrum retain the ability to visualize printed text, just not the object referenced.  Professor Zeman says he first became aware of the condition in 2005 when a patient reported having lost the ability to visualize following minor surgery and after the case was documented in the medical literature in the usual way in 2010, it provoked a number of responses in which multiple people informed Zeman they had never in their lifetime been able to visualize objects.  This was the origin of Zeman and his collaborators coining “congenital aphantasia”, describing individuals who had never enjoyed the ability to generate voluntary mental images.  Because it was something which came to general attention in the age of social media, great interest was triggered in the phenomenon and a number of “on-line tests” were posted, the best-known of which was the request for readers to “imagine a red apple” and rate their “mind's eye” depiction of it on a scale from 1 (photorealistic visualisation) through to 5 (no visualisation at all).  For many, this was variously (1) their first realization they were aphantasic or (2) an appreciation that one’s own ability or inability to visualise objects was not universal.
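The logic of such a test is simple enough that, purely as an illustration (the band labels and the one-to-five mapping below are assumptions for the sketch, not a validated clinical instrument), it can be reduced to a few lines of Python:

# Illustrative sketch only: maps a 1-5 self-rating from the "red apple" style
# test onto descriptive bands of the phantasia spectrum; the band labels are
# assumptions for demonstration, not a clinical scale.
def classify_imagery(rating: int) -> str:
    bands = {
        1: "hyperphantasia-like (photorealistic visualisation)",
        2: "vivid imagery",
        3: "moderate imagery",
        4: "dim or vague imagery",
        5: "aphantasia-like (no visualisation at all)",
    }
    if rating not in bands:
        raise ValueError("rating must be an integer from 1 to 5")
    return bands[rating]

print(classify_imagery(5))  # aphantasia-like (no visualisation at all)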

How visualization can manifest: Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.  If an aphantasic person doesn't know about aphantasia and doesn't know other people can imagine images, their life is probably little different from anyone else's; it's just that their mind has adapted to handle concepts in another way.

Top right: “Normal” visualization (thought to be possessed by most of the population) refers to the ability to imagine something like a photograph of what’s being imagined.  This too is a spectrum condition in that some will be able to imagine an accurate “picture”, something like an HD (high definition) photograph, while others will “see” something less detailed, sketchy or even wholly inaccurate.  However, even if when asked to visualize “an apple” one instead “sees a banana”, that is not an indication of aphantasia, a condition which describes only an absence of an image.  Getting it that wrong is an indication of something amiss but it’s not aphantasia.

Bottom left: “Seeing” text in response to being prompted to visualize something was the sort of result which, in the wake of Galton’s 1880 paper, came as such a surprise.  It means the brain understands the concept of what is being described; it just can’t be imagined as an image.  This is one manifestation of aphantasia but it’s not related to the “everything is text” school of post-modernism.  Jacques Derrida’s (1930-2004) fragment “Il n'y a pas de hors-texte” (literally “there is no outside-text”) is one of the frequently misunderstood phrases from the murky field of deconstruction but it has nothing to do with aphantasia (although dedicated post-modernists probably could prove a relationship).

Bottom right: The absence of any image (understood as a “blankness” which does not necessarily imply “whiteness” or “blackness”, although this is the simple way to illustrate the concept), whether of text or of something to some degree photorealistic, is classic aphantasia.  The absence does not mean the subject doesn’t understand the relevant object or concept; it means only that their mental processing does not involve imagery and for as long as humans have existed, many must have functioned in this way, their brains adapted to the imaginative range available to them.  What this must have meant was many became aware of what they were missing only when the publicity about the condition appeared on the internet, an interesting example of “diagnostic determinism”.

WebMD's classic Aphantasia test.

The eyes are an outgrowth of the brain and WebMD explains aphantasia is caused by the brain’s visual cortex (the part of the brain that processes visual information from the eyes) “working differently than expected”, noting the often-quoted estimate of it affecting 2-4% of the population may be understated because many may be unaware they are “afflicted”.  It’s a condition worthy of more study because aphantasics handle the characteristic by processing information differently from those who rely on visual images.  There may be a genetic element in aphantasia and there’s interest too among those researching “Long Covid” because the symptom of “brain fog” can manifest much as does aphantasia.

Aphantasia may have something to do with consciousness because aphantasics can have dreams (including nightmares) which can to varying degrees be visually rich.  There’s no obvious explanation for this but whereas aphantasia is the inability voluntarily to generate visual mental imagery while awake, dreaming is an involuntary perceptual experience generated during sleep; both are mediated by neural mechanisms which clearly are not identical but presumably must overlap.  The conclusions from research at this stage remain tentative but the current neuro-cognitive interpretation suggests voluntary (conscious) imagery relies on top-down activation of the visual association cortex while (unconscious) dream imagery relies more on bottom-up and internally driven activation during REM (rapid eye movement) sleep.  What that would seem to imply is that in aphantasia, the former pathway is impaired (or at least inaccessible), while the latter may remain intact (or accessible).

The University of Queensland’s illustration of the phantasia spectrum.

The opposite syndrome is hyperphantasia (having extremely vivid, detailed, and lifelike mental imagery) which can be a wonderful asset but can also be a curse, rather as hyperthymesia (known also as HSAM (Highly Superior Autobiographical Memory) and colloquially as “total recall”) can be disturbing.  Although it seems not to exist in the sense of “remembering everything, second-by-second”, there are certainly those who have an extraordinary recall of “events” in their life and this can have adverse consequences for mental health because one of the mind’s “defensive mechanisms” is forgetting or at least suppressing memories which are unwanted.  Like aphantasia & hyperphantasia, hyperthymesia is not listed by the DSM as a mental disorder; it is considered a rare cognitive trait or neurological phenomenon although like the imagery conditions it can have adverse consequences and these include disturbing “flashbacks”, increased rumination and increased rates of anxiety or obsessive tendencies.

Wednesday, May 14, 2025

Psychache

Psychache (pronounced sahyk-eyk)

Psychological pain, especially when it becomes unbearable, producing suicidal thoughts.

1993: The construct was psych(e)- + ache.  Psychache was coined by US clinical psychologist Dr Edwin Shneidman (1918-2009) and first appeared in his book Suicide as Psychache: A Clinical Approach to Self-Destructive Behavior (1993).  The prefix psych- was an alternative form of psycho-.  Psycho was from the Ancient Greek ψυχο- (psūkho-), a combining form of ψυχή (psukhḗ) (soul).  It was used with words relating to the soul, the mind, or to psychology.  Ache was from the Middle English verb aken & noun ache, from the Old English verb acan (from the Proto-West Germanic akan, from the Proto-Germanic akaną (to ache)) and the noun æċe (from the Proto-West Germanic aki, from the Proto-Germanic akiz), both from the primitive Indo-European h₂eg- (sin, crime).  It was cognate with the Saterland Frisian eeke & ääke (to ache, fester), the Low German aken, achen & äken (to hurt, ache), the German Low German Eek (inflammation), the North Frisian akelig & æklig (terrible, miserable, sharp, intense), the West Frisian aaklik (nasty, horrible, dismal, dreary) and the Dutch akelig (nasty, horrible).  Historically the verb was spelled ake, and the noun ache but the spellings became aligned after Dr Johnson (Samuel Johnson (1709-1784)) published A Dictionary of the English Language (1755), the lexicographer mistakenly assuming it was from the Ancient Greek ἄχος (ákhos) (pain) due to the similarity in form and meaning of the two words.  As a noun, ache meant “a continuous, dull pain” (as opposed to a sharp, sudden, or episodic pain) while the verb was used to mean (1) to have or suffer a continuous, dull pain, (2) to feel great sympathy or pity and (3) to yearn or long for someone or something.  Psychache is a noun.

Psychache is a theoretical construct used by clinical suicidologists and differs from psychomachia (conflict of the soul).  Psychomachia was from the Late Latin psȳchomachia, the title of a poem of a thousand-odd lines (circa 400) by the Roman Christian poet Prudentius (Aurelius Prudentius Clemens; 348-circa 412), the construct being the Ancient Greek psukhē (spirit) + makhē (battle).  The fifth century poem Psychomachia (translated usually as “Battle of Spirits” or “Soul War”) explored a theme familiar in Christianity: the eternal battle between virtue & vice (onto which can be mapped “right & wrong”, “good & evil” etc) and culminated in the forces of Christendom vanquishing pagan idolatry to the cheers of a thousand Christian martyrs.  An elegant telling of an allegory familiar in early Christian literature and art, Prudentius made clear the battle was one which happened in the soul of all people and thus one which all needed to wage, the outcome determined by whether the good or evil in them proved stronger.  The poem’s characters include Faith, Hope, Industry, Sobriety, Chastity, Humility & Patience among the good and Pride, Wrath, Paganism, Avarice, Discord, Lust & Indulgence in the ranks of the evil but scholars of literature caution that although the personifications all are women, in Latin, words for abstract concepts use the feminine grammatical gender and there’s nothing to suggest the poet intended us to read this as a tale of bolshie women slugging it out.  Of interest too is the appearance of the number seven, so familiar in the literature and art of Antiquity and the Medieval period as well as the Biblical texts but although Prudentius has seven virtues defeat seven vices, the characters don’t exactly align with either the canonical seven deadly sins or the three theological and four cardinal virtues.  In modern use, the linguistic similarity between psychache and psychomachia has made the latter attractive to those seduced by the (not always Germanic) tradition of the “romance of suicide”.

A pioneer in the field of suicidology, Dr Shneidman’s publication record was indicative of his specialization.

Dr Edwin Shneidman (1918-2009) was a clinical psychologist who practiced as a thanatologist (a practitioner in the field of thanatology, the scientific study of death and the practices associated with it, including the study of the needs of the terminally ill and their families); the construct of thanatology being thanato- (from the Ancient Greek θάνατος (thánatos) (death)) + -logy.  The suffix -ology was formed from -o- (as an interconsonantal vowel) + -logy.  The origin in English of the -logy suffix lies with loanwords from the Ancient Greek, usually via Latin and French, where the suffix (-λογία) is an integral part of the word loaned (eg astrology from astrologia) since the sixteenth century.  French picked up -logie from the Latin -logia, from the Ancient Greek -λογία (-logía).  Within Greek, the suffix is an -ία (-ía) abstract from λόγος (lógos) (account, explanation, narrative), and that a verbal noun from λέγω (légō) (I say, speak, converse, tell a story).  In English the suffix became extraordinarily productive, used notably to form names of sciences or disciplines of study, analogous to the names traditionally borrowed from the Latin (eg astrology from astrologia; geology from geologia) and by the late eighteenth century, the practice (despite the disapproval of the pedants) extended to terms with no connection to Greek or Latin such as those building on French or German bases (eg insectology (1766) after the French insectologie; terminology (1801) after the German Terminologie).  Within a few decades of the intrusion of modern languages, combinations emerged using English terms (eg undergroundology (1820); hatology (1837)).  In this evolution, the development may be thought similar to the latter-day proliferation of “-isms” (fascism; feminism etc).

Death and the College Student: A Collection of Brief Essays on Death and Suicide by Harvard Youth (1973) by Dr Edwin Shneidman.  Dr Shneidman wrote many papers about the prevalence of suicide among college-age males, a cross-cultural phenomenon.

Dr Shneidman was one of the seminal figures in the discipline of suicidology, in 1968 founding the AAS (American Association of Suicidology) and the principal US journal for suicide studies: Suicide and Life-Threatening Behavior.  The abbreviation AAS is in this context used mostly within the discipline because (1) it is a specialized field and (2) there are literally dozens of uses of “AAS”.  In Suicide as Psychache: A Clinical Approach to Self-Destructive Behavior (1993) he defined psychache as “intense psychological pain—encompassing hurt, anguish, and mental torment”, identifying it as the primary motivation behind suicide, his theory being that when psychological pain becomes unbearable, individuals may perceive suicide as their only escape from torment.

Although since Suicide as Psychache: A Clinical Approach to Self-Destructive Behavior appeared in 1993 there have been four editions of the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM), “psychache” has never appeared in the DSM.  That may seem an anomaly given much in the DSM revolves around psychological disturbances but the reason is technical.  What the DSM does is list and codify diagnosable mental disorders (depression, schizophrenia, bipolar disorder etc), classifying symptoms and behaviors into standardized categories for diagnosis and treatment planning.  By contrast, psychache is not a clinical diagnosis; it is a theoretical construct in suicidology which is used to explain the subjective experience of psychological pain that can lead to patients taking their own lives.  It thus describes an emotional state rather than a psychiatric disorder.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

Despite that, mental health clinicians do actively use the principles of psychache, notably in suicide risk assessment and prevention, and models have been developed including a number of “psychache scales”: self-reporting tools used to generate a metric measuring the intensity of psychological pain (categorized under headings such as shame, guilt, despair et al).  The approaches do in detail differ but most follow Dr Shneidman’s terminology in that the critical threshold is the point at which the patient’s pain becomes unbearable or inescapable and the objective is either to increase tolerance for distress or reframe troublesome thoughts.  Ultimately, the purpose of the tools is to improve suicide risk assessments and reduce suicide rates.
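As a sketch of the arithmetic only (the items, the 0-4 response range and the cut-off below are invented for illustration and reproduce no published psychache instrument), the scoring logic of such self-report tools looks something like this in Python:

# Illustrative sketch of a self-report "psychological pain" scale: each item is
# rated by the patient, the ratings are summed to a single metric and the total
# is compared with a cut-off.  The items, the 0-4 range and the threshold are
# assumptions for the sketch, not those of any published psychache scale.
ITEMS = ["shame", "guilt", "despair", "hopelessness", "mental torment"]
THRESHOLD = 15  # hypothetical "unbearable pain" cut-off

def psychache_score(responses: dict) -> tuple:
    """Return (total score, whether the hypothetical threshold is reached)."""
    total = sum(responses[item] for item in ITEMS)
    return total, total >= THRESHOLD

score, flagged = psychache_score(
    {"shame": 3, "guilt": 4, "despair": 4, "hopelessness": 3, "mental torment": 4}
)
print(score, flagged)  # 18 True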

DSM-5 (2013).

Interestingly, Suicidal Behavior Disorder (SBD) was introduced in Section III of the DSM-5 (2013) under “Conditions for Further Study”.  Then, SBD chiefly was characterized by a self-initiated sequence of behaviors believed at the time of initiation to cause one’s own death and occurring in the last 24 months.  That of course sounds exact but the diagnostic criteria in the DSM are written like that and the purpose of inclusion in the fifth edition was to create a framework within which empirical studies related to SBD systematically could be reviewed so primary research themes and promising directions for future research could be identified.  Duly, over the following decade that framework was explored but the conclusion reached was there seemed to be little clinical utility in SBD as a device for predicting future suicide and that more research was needed to understand measurement of the diagnosis and its distinctiveness from related disorders and other self-harming behaviors.  The phrase “more research is required” must be one of the most frequently heard among researchers.

In the usual manner in which the APA allowed the DSM to evolve, what the DSM-5’s tentative inclusion of SBD did was attempt to capture suicidality as a diagnosis rather than a clinical feature requiring attention.  SBD was characterized by a suicide attempt within the last 24 months (Criterion A) and that was defined as “a self-initiated sequence of behaviors by an individual who, at the time of initiation, expected that the set of actions would lead to his or her own death”.  That sounds uncontroversial but what was significant was the act could not meet the criteria for non-suicidal self-injury (ie self-injury with the intention of relieving negative feelings or a cognitive state in order to achieve a positive mood state) (Criterion B) and the diagnosis cannot be applied to suicidal ideation or preparatory acts (Criterion C).  Were the attempt to have occurred during a state of delirium or confusion or solely for political or religious objectives, then SBD is ruled out (Criteria D & E).  SBD (current) is given when the suicide attempt occurred within the last 12 months, and SBD (in early remission), when it has been 12-24 months since the last attempt.  It must be remembered that while a patient’s behavior(s) may overlap across a number of the DSM’s diagnoses, the APA’s committees have, for didactic purposes, always preferred to “silo” the categories.

DSM-5-TR (2022).

When in 2022 the “text revision” of the DSM-5 (DSM-5-TR) was released, SBD was removed as a condition for further study in Section III and moved to “Other Conditions That May Be a Focus of Clinical Attention” in Section II.  The conditions listed in this section are intended to draw the attention of clinicians to the presence and breadth of additional issues routinely encountered in clinical practice and provide a procedure for their systematic documentation.  According to the APA’s editorial committee, the rationale for the exclusion of SBD from the DSM-5-TR was based on concerns the proposed disorder did not meet the criteria for a mental disorder but instead constituted a behavior with diverse causes and while that distinction may escape most of us, within the internal logic of the history of the DSM, it’s wholly consistent.  At this time, despite many lobbying for the adoption of a diagnostic entity for suicidal behavior, the APA’s committees seem still more inclined to conceptualize suicidality as a symptom rather than a disorder and despite discussion in the field of suicidology about whether suicide and related concepts like psychache should be treated as stand-alone mental health issues, that’s a leap which will have to wait, at least until a DSM-6 is published.

How to and how not to: Informatie over Zorgvuldige Levensbeëindiging (Information about the Careful Ending of Life, 2008) by Stichting Wetenschappelijk Onderzoek naar Zorgvuldige Zelfdoding (The Foundation for Scientific Research into Careful Suicide) (left) and How Not to Kill Yourself: A Phenomenology of Suicide (2023) by Clancy Martin (right).

Informatie over Zorgvuldige Levensbeëindiging (Information about the Careful Ending of Life, 2008) was published by a group of Dutch physicians & researchers; it contained detailed advice on methods of suicide available to the general public, the Foundation for Scientific Research into Careful Suicide arguing “a requirement exists within society for responsible information about an independent and dignified ending of life.”  It could be ordered only from the foundation’s website and had the advantage that whatever might be one’s opinion on the matter, it was at least written by physicians and scientists and thus more reliable than some of the “suicide guides” which are sometimes found on-line.  At the time, research by the foundation had found that despite legislation in the Netherlands which permits doctors (acting within specific legal limits) to assist patients to commit suicide, there were apparently several thousand cases each year of what it termed “autoeuthanasia” in which no medical staff directly were involved.  Most of these cases involved elderly or chronically ill patients who refused food and fluids and it was estimated these deaths happened at about twice the rate of those carried out under the euthanasia laws.  Since then the Dutch laws have been extended to include those who have no serious physical disease or are suffering great pain; there are people who simply no longer wish to live, something like the tragic figure in Blue Öyster Cult’s (Don't Fear) The Reaper (1976) © Donald Roeser (b 1947):

Came the last night of sadness
And it was clear she couldn't go on
Then the door was open and the wind appeared
The candles blew then disappeared
The curtains flew then he appeared
Saying don't be afraid

There is a diverse literature on various aspects of suicide (tips and techniques, theological & philosophical interpretations, cross-cultural attitudes, history of its treatment in church & secular law etc) and some works are quite personal, written variously by those who later would kill themselves or those who contemplated or attempted to take their own lives.  In How Not to Kill Yourself: A Phenomenology of Suicide (2023) by Canadian philosopher Clancy Martin (b 1967), it was revealed the most recent of his ten suicide attempts was “…in his basement with a dog leash, the consequences of which he concealed from his wife, family, co-workers, and students, slipping back into his daily life with a hoarse voice, a raw neck and a series of vague explanations.”

BKA (the Bundeskriminalamt, the Federal Criminal Police Office of the FRG (Federal Republic of Germany, the old West Germany)) mug shots of the Red Army Faction's Ulrike Meinhof (left) and Gudrun Ensslin (right).

The song (Don't Fear) The Reaper also made mention of William Shakespeare's (1564–1616) Romeo and Juliet (1597) and in taking her own life (using her dead lover’s dagger) because she doesn’t want to go on living without him, Juliet joined the pantheon of figures who have made the tragedy of suicide seem, to some, romantic.  Politically too, suicide can grant the sort of status dying of old age doesn’t confer, the deaths of left-wing terrorists Ulrike Meinhof (1934–1976) and Gudrun Ensslin (1940–1977) of the West German Red Army Faction (the RAF, better known as the “Baader-Meinhof gang”) both recorded as “suicide in custody” although the circumstances were murky.  In an indication of the way moral relativities aligned during the high Cold War, the French intellectuals Jean-Paul Sartre (1905–1980) and Simone de Beauvoir (1908–1986) compared their deaths to the worst crimes of the Nazis but sympathy for violence committed for an “approved” cause was not the exclusive preserve of the left.  In July 1964, in his speech accepting the Republican nomination for that year’s US presidential election, proto-MAGA Barry Goldwater (1909–1998) concluded by saying: “I would remind you that extremism in the defense of liberty is no vice!  And let me remind you also that moderation in the pursuit of justice is no virtue!”  The audience response to that was rapturous although a few months later the country mostly didn’t share the enthusiasm, Lyndon Johnson (LBJ, 1908–1973; US president 1963-1969) winning the presidency in one of the greatest landslides in US electoral history.  Given the choice between crooked old Lyndon and crazy old Barry, Americans preferred the crook.

Nor was it just politicians and intellectuals who couldn’t resist the appeal of politics being taken to its logical “other means” conclusion, the Canadian singer-songwriter Leonard Cohen (1934-2016) during the last years of the Cold War writing First We Take Manhattan (1986), the lyrics of which were open to interpretation but clarified in 1988 by the author who explained: “I think it means exactly what it says.  It is a terrorist song.  I think it's a response to terrorism.  There's something about terrorism that I've always admired.  The fact that there are no alibis or no compromises.  That position is always very attractive.”  Even in 1988 it was a controversial comment because by then not many outside of undergraduate anarchist societies were still romanticizing terrorists but in fairness to the singer the coda isn’t as often published: “I don't like it when it's manifested on the physical plane – I don't really enjoy the terrorist activities – but Psychic Terrorism.”

First We Take Manhattan (1986) by Leonard Cohen

They sentenced me to twenty years of boredom
For tryin' to change the system from within
I'm coming now, I'm coming to reward them
First we take Manhattan, then we take Berlin
 
I'm guided by a signal in the heavens
I'm guided by this birthmark on my skin
I'm guided by the beauty of our weapons
First we take Manhattan, then we take Berlin
 
I'd really like to live beside you, baby
I love your body and your spirit and your clothes
But you see that line there moving through the station?
I told you, I told you, told you, I was one of those
 
Ah you loved me as a loser, but now you're worried that I just might win
You know the way to stop me, but you don't have the discipline
How many nights I prayed for this, to let my work begin
First we take Manhattan, then we take Berlin
 
I don't like your fashion business, mister
And I don't like these drugs that keep you thin
I don't like what happened to my sister
First we take Manhattan, then we take Berlin
 
I'd really like to live beside you, baby
I love your body and your spirit and your clothes
But you see that line there moving through the station?
I told you, I told you, told you, I was one of those



First We Take Manhattan performed by Jennifer Warnes (b 1947), from the album Famous Blue Raincoat (1986).

Whatever they achieved in life, it was their suicides which lent a lingering allure to German-American ecofeminist activist Petra Kelly (1947–1992) & the doomed American poet Sylvia Plath (1932-1963) and the lure goes back for millennia, the Roman poet Ovid (Publius Ovidius Naso; 43 BC–17 AD) in his Metamorphoses telling an ancient Babylonian tale in which Thisbe, in dark despair, killed herself after finding her young love Pyramus lifeless.  Over the centuries it’s been a recurrent trope but the most novel take was the symbolic, mystical death in Richard Wagner's (1813–1883) Tristan und Isolde (1865).  Mortally wounded in a duel before the final act, Tristan longs to see Isolde one last time but just as she arrives at his side, he dies in her arms.  Overwhelmed by love and grief, Isolde sings the famous Liebestod (Love-Death) and dies, the transcendent aria interpreted as the swansong which carries her to join Tristan in mystical union in the afterlife.  This, lawyers would call a “constructive suicide”.

Austrian soprano Helga Dernesch (b 1939) in 1972 performing the Liebestod aria from Wagner’s Tristan und Isolde with the Berlin Philharmonic under Herbert von Karajan (1908–1989).

While she didn’t possess the sheer power of the greatest of the Scandinavian sopranos who in the mid-twentieth century defined the role, Dernesch brought passion and intensity to her roles and while, on that night in 1972, the lushness of what Karajan summoned from the strings was perhaps a little much, her Liebestod was spine-tingling and by then, Karajan had been forgiven for everything.  Intriguingly, although Tristan und Isolde is regarded as one of the great monuments to love, in 1854 Wagner had written to the Hungarian composer Franz Liszt (1811–1886) telling him:

As I have never in life felt the real bliss of love, I must erect a monument to the most beautiful of all my dreams, in which, from beginning to end, that love shall be thoroughly satiated.  I have in my head ‘Tristan and Isolde’, the simplest but most full-blooded musical conception; with the ‘black flag’ which floats at the end of it I shall cover myself to die.

It’s not known whether Liszt reflected on this apparent compositional self-medication for psychache after learning in 1870 from his morning newspaper that his daughter Cosima (1837-1930) was to be married to Wagner (then 24 years her senior) but because she’d been for some seven years conducting an adulterous affair with the German, the news may not have been unexpected.  He was aware Cosima’s daughter (Isolde Beidler (1865–1919)) had been fathered not by her then husband (the German conductor Hans von Bülow (1830–1894)) but by Wagner and her second marriage proved happier than the first so there was that.

Sunday, November 17, 2024

Now

Now (pronounced nou)

(1) At the present time or moment (literally a point in time).

(2) Without further delay; immediately; at once; at this time or juncture in some period under consideration or in some course of proceedings described.

(3) As “just now”, a time or moment in the immediate past (historically it existed as the now obsolete “but now” (very recently; not long ago; up to the present)).

(4) Under the present or existing circumstances; as matters stand.

(5) Up-to-the-minute; fashionable, encompassing the latest ideas, fads or fashions (the “now look”, the “now generation” etc).

(6) In law, as “now wife”, the wife at the time a will is written (used to prevent any inheritance from being transferred to a person of a future marriage) (archaic).

(7) In phenomenology, a particular instant in time, as perceived at that instant.

Pre 900: From the Middle English now, nou & nu from the Old English nū (at the present time, at this moment, immediately), from the Proto-West Germanic nū, from the Proto-Germanic nu, from the primitive Indo-European nu (now) and cognate with the Old Norse nu, the Dutch nu, the German nun, the Old Frisian nu and the Gothic nu.  It was the source also of the Sanskrit and Avestan nu, the Old Persian nuram, the Hittite nuwa, the Greek nu & nun, the Latin nunc, the Old Church Slavonic nyne, the Lithuanian nū and the Old Irish nu-.  The original senses may have been akin to “newly, recently” and it was related to the root of new.  Since Old English it has been often merely emphatic, without any temporal sense (as in the emphatic use of “now then”, though that phrase originally meant “at the present time”, and also (by the early thirteenth century) “at once”).  In the early Middle English it often was written as one word.  The familiar use as a noun (the present time) emerged in the late fourteenth century while the adjective meaning “up to date” is listed by etymologists as a “mid 1960s revival” on the basis the word was used as an adjective with the sense of “current” between the late fourteenth and early nineteenth centuries.  The phrase “now and then” (occasionally; at one time and another) was in use by the mid 1400s, “now or never” having been in use since the early thirteenth century.  “Now” is widely used in idiomatic forms and as a conjunction & interjection.  Now is a noun, adjective & adverb; nowism, nowness & nowist are nouns; the noun plural is nows.

Right here, right now: Acid House remix of Greta Thunberg’s (b 2003) How dare you? speech by Theo Rio.

“Now” is one of the more widely used words in English and is understood to mean “at the present time or moment (literally a point in time)”.  However, it’s often used in a way which means something else: were one to say “I’ll do it now”, in the narrow technical sense that really means “I’ll do it in the near future”.  Even things which are treated as happening “now” really aren’t, such as seeing something.  Because light travels at a finite speed, it takes time for it to bounce from something to one’s eye so just about anything one sees is an exercise in looking back to the past.  Even when reading something on a screen or page, one’s brain is processing something from a nanosecond (about one billionth of a second) earlier.  For most purposes, “now” is but a convincing (and convenient) illusion and even though, in a certain, special sense, everything in the universe is happening at the same time (now), it’s not something that can ever be experienced because of the implications of relativity.  None of this causes many problems in life but among certain physicists and philosophers, there is a dispute about “now” and there are essentially three factions: (1) that “now” happened only once in the history of the known universe and cannot again exist until the universe ends, (2) that only “now” can exist and (3) that “now” cannot ever exist.
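As a rough worked example (the 30 cm viewing distance is simply an assumption for the arithmetic), the delay is just distance divided by the speed of light:

\[
t = \frac{d}{c} \approx \frac{0.3\ \text{m}}{3\times10^{8}\ \text{m/s}} \approx 1\times10^{-9}\ \text{s} \quad \text{(about one nanosecond)}
\]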

Does now exist? (2013), oil & acrylic on canvas by Fiona Rae (b 1963) on MutualArt.

The notion that “now” can have happened only once in the history of our universe (and according to the cosmological theorists variously there may be many universes (some which used to exist, some exact duplicates of our own (even containing an identical Lindsay Lohan), some extant and some yet to be created) or our universe may now be in one of its many phases, each of which will start and end with a unique “now”) is tied up with the nature of time, the mechanism upon which “now” depends not merely for definition but also for existence.  That faction deals with what is essentially an intellectual exercise whereas the other two operate where physics and linguistics intersect.  Within the faction which says “now can never exist” there is a sub-faction which holds that to say “now” cannot exist is a bit of a fudge in that it’s not that “now” never happens but only that it can only ever be described as a particular form of “imaginary time”; an address in space-time in the past or future.  The purists however are absolutists and their proposition is tied up in the nature of infinity, something which renders it impossible ever exactly to define “now” because endlessly the decimal point can move so that “now” can only ever be tended towards and never attained.  If pushed, all they will concede is that “now” can be approximated for purposes of description but that’s not good enough: there is no now.

nower than now!: Lindsay Lohan on the cover of i-D magazine No.269, September 2006.

The “only now can exist” faction find tiresome the proposition that “the moment we identify something as happening now, already it has passed”, making the point that “now” is the constant state of existence and that a mechanism like time exists only as a thing of administrative convenience.  It is the faction most associated with the schools of presentism or phenomenology, arguing only the present moment (now) is “real” and that any other fragment of time can only be described, the past existing only in memory and the future only as anticipation or imagination; “now” is the sole verifiable reality.  They are interested especially in what they call “change & becoming”, making the point the very notion of change demands a “now”: events happen and things become in the present; without a “now”, change and causality are unintelligible.  The debate between the factions hinges often on differing interpretations of time: whether fundamentally it is subjective or objective, continuous or discrete, dynamic or static.  Linguistically and practically, “now” remains central to the human experience but whether it corresponds to an independent metaphysical reality remains contested.

Unlike philosophers, cosmologists probably don’t much dwell on the nature of “now” because they have the “Andromeda paradox” which is one of the consequences of Albert Einstein’s (1879-1955) theory of special relativity.  What the paradox does is illustrate the way “now” is relative and differs for observers moving at different speeds, the effect increasing as distances increase, such as when the point of reference is the Andromeda galaxy, some 2½ million light years distant from Earth.  Under special relativity, what one observer perceives as “now” on Andromeda, another, moving at a different relative speed, will perceive as lying in the past or future.  This can happen at any distance but, outside of computer simulations or laboratories, the effect of the relativity of simultaneity is noticeable (even for relatively slow speeds) only at distance.  The way to conceptualize special relativity is to imagine everything in the universe happening “at the same time” and then “work backwards” as distances between objects increase.
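A back-of-envelope sketch of the scale involved (taking a walking speed of about 1.4 m/s and the commonly quoted 2.5 million light-year distance to Andromeda as the only inputs) uses the standard relativity-of-simultaneity formula, where v is the walker's speed, d the distance and c the speed of light:

\[
\Delta t \approx \frac{v\,d}{c^{2}} \approx \frac{1.4\ \text{m/s}\times 2.4\times10^{22}\ \text{m}}{\left(3\times10^{8}\ \text{m/s}\right)^{2}} \approx 3.7\times10^{5}\ \text{s} \approx 4\ \text{days}
\]

On those rough numbers, merely strolling towards or away from Andromeda shifts which events there qualify as “now” by a few days.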

Seated vis-a-vis (literally "face to face"), Lindsay Lohan (b 1986, right) and her sister Aliana (b 1993, left), enjoying a tête-à-tête (literally, "head to head"), La Conversation bakery & café, West Hollywood, California, April 2012.  Sadly, La Conversation is now closed.

Among the implications of the Andromeda paradox is that although the sisters would have thought their discussion something in the "here and now", to a cosmologist they are looking at each other as they used to be and hearing what each said some time in the past, every slight movement affecting the extent of this.  Because, in a sense, everything in the universe is happening "at the same time", the pair could have been sitting light years apart and spoken what they spoke "at the same time" but because of the speed at which light and sound travel, it's only within a certain distance that a "practical" shared "now" becomes possible.  One wholly speculative notion even connects "now" with at least one of the mysterious pair "dark energy" & "dark matter", two possibly misleading terms used to describe the "missing stuff" which the cosmologists' models indicate must exist (in great quantity) in the universe but which hasn't yet been identified or even described.  The idea is that "dark energy" may be time itself, the implication being that not only can time be used as a measure in the currently accelerating expansion of space but it may be the very force responsible for the phenomenon.  Thus far, the speculation has attracted little support and nobody seems to have suggested any sort of experimental test but, the standard in science being disproof, "time may be dark energy" has yet to be disproved.

Thursday, September 19, 2024

Evil

Evil (pronounced ee-vuhl)

(1) Morally wrong or bad; immoral; wicked; morally corrupt.

(2) Harmful; injurious (now rare).

(3) Marked or accompanied by misfortune (now rare; mostly historic).

(4) Having harmful qualities; not good; worthless or deleterious (obsolete).

Pre 900: From the Middle English evel, ivel & uvel (evil) from the Old English yfel (bad, vicious, ill, wicked) from the Proto-Germanic ubilaz.  Related were the Saterland Frisian eeuwel, the Dutch euvel, the Low German övel & the German übel; it was cognate with the Gothic ubils, the Old High German ubil, the German übel and the Middle Dutch evel and the Irish variation abdal (excessive).  The root has long been thought to be the primitive Indo-European h₂upélos (diminutive of h₂wep-) (treat badly) which produced also the Hittite huwappi (to mistreat, harass) and huwappa (evil, badness) but an alternative view is a descent from upélos (evil; literally "going over or beyond (acceptable limits)") from the primitive Indo-European upo, up & eup (down, up, over).  Evil is a noun & adjective (some do treat it as a verb), evilness is a noun and evilly an adverb; the noun plural is evils.

Evil (the word) arrived early in English and endured.  In Old English and all the early Teutonic languages except the Scandinavian, it quickly became the most comprehensive adjectival expression of disapproval, dislike or disparagement.  Evil was the word Anglo-Saxons used to convey some sense of the bad, cruel, unskillful, defective, harm, crime, misfortune or disease.  The meaning with which we’re most familiar, "extreme moral wickedness", existed since Old English but did not assume predominance until the eighteenth century.  The Latin phrase oculus malus was known in Old English as eage yfel and survives in Modern English as “evil eye”.  Evilchild is attested as an English surname from the thirteenth century and Australian-born Air Chief Marshal Sir Douglas Evill (1892-1971) was head of the Royal Air Force (RAF) delegation to Washington during World War II (1939-1945).  Despite its utility, there’s probably no word in English with as many companions in the same vein, none of them actually synonymous.  Consider: destructive, hateful, vile, malicious, vicious, heinous, ugly, bad, nefarious, villainous, corrupt, malefic, malevolent, hideous, wicked, harm, pain, catastrophe, calamity, ill, sinful, iniquitous, depraved, base, iniquity & unrighteousness; all tend in the direction yet none quite matches the darkness of evil although malefic probably comes close.

Hannah Arendt and the banality of evil

The word evil served English unambiguously and well for centuries and most, secular and spiritual, knew that some people are just evil.  It was in the later twentieth century, with the sudden proliferation of psychologists, interior decorators, sociologists, criminologists, social workers and basket weavers, that an industry developed exploring alternative explanations and causations for what had long been encapsulated in the word evil.  The output was uneven but among the best remembered, certainly for its most evocative phrase, was the work of German-American philosopher and political theorist Hannah Arendt (1906–1975).  Arendt’s concern, given the scale of the holocaust, was: "Can one do evil without being evil?"

Whether the leading Nazis were unusually (or even uniquely) evil or merely individuals who, through a combination of circumstances, came to do awful things has been a question which has for decades interested psychiatrists, political scientists and historians.  Arendt attended the 1961 trial of Adolf Eichmann (1906-1962), the bureaucrat responsible for the transportation of millions of Jews and others to the death camps built to allow the Nazis to commit the industrial-scale mass-murder of the final solution.  Arendt thought Eichmann ordinary and bland, “neither perverted nor sadistic” but instead “terrifyingly normal”, acting only as a diligent civil servant interested in career advancement, his evil deeds done apparently without ever an evil thought in his mind.  Her work was published as Eichmann in Jerusalem: A Report on the Banality of Evil (1963).  The work attracted controversy and perhaps that memorable phrase didn’t help.  It captured the popular imagination and even academic critics seemed seduced.  Arendt’s point, inter alia, was that nothing in Eichmann’s life or character suggested that, had it not been for the Nazis and the notion of normality they constructed, he’d ever have murdered even one person.  The view has its flaws in that there’s much documentation from the era to prove many Nazis, including Eichmann, knew what they were doing was a monstrous crime, so a discussion of whether Eichmann was immoral or amoral, and whether one implies evil while the other does not, seems, after Auschwitz, a sterile argument.

Evil is where it’s found.

Hannah Arendt's relationship with Martin Heidegger (1889–1976) began when she was a nineteen-year-old student of philosophy and he her professor, married and aged thirty-six.  Influential still in his contributions to phenomenology and existentialism, he will forever be controversial because of his brief flirtation with the Nazis, joining the party and taking an academic appointment under Nazi favor.  He resigned from the post within a year and distanced himself from the party but, despite expressing regrets in private, never publicly repented.  His affair with the Jewish Arendt is perhaps unremarkable because it pre-dated the Third Reich but what has always attracted interest is that their friendship lasted the rest of their lives, documented in their own words in a collection of their correspondence (Letters: 1925-1975, Hannah Arendt & Martin Heidegger (2003), Ursula Ludz (Editor), Andrew Shields (Translator)).  Cited sometimes as proof that feelings can transcend politics (as if ever there was doubt), the half-century of letters tracks the course of a relationship which began as one of lovers and evolved first into friendship and then intellectual congress.  For those who wish to explore contradiction and complexity in human affairs, it's a scintillating read.  Arendt died in 1975, Heidegger surviving her by some six months.

New York Post, November 1999.

In 1999, Rupert Murdoch’s (b 1931) tabloid the New York Post ran one of its on-line polls, providing a list of the usual suspects and asking readers to rank them from evil to most evil, so to determine “The 25 most evil people of the last millennium” and, predictably, Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) was rated the worst.  The poll received 19,184 responses which revealed some “recency bias” (a cognitive bias that favors recent events over historic ones) in that some US mass-murderers were rated worse than some with more blood on their hands but most commented on was the stellar performance of the two “write-ins”: Bill Clinton (b 1946; US president 1993-2001) & his loyal wife, crooked Hillary Clinton (b 1947; US secretary of state 2009-2013), the POTUS coming second and the FLOTUS an impressive sixth, the Post's readers rating both more evil than Saddam Hussein (1937–2006; president of Iraq 1979-2003), Vlad the Impaler (Vlad Dracula or Prince Vlad III of Wallachia (circa 1430-circa 1477); thrice Voivode of Wallachia 1448-circa 1477) or Ivan the Terrible (Ivan IV Vasilyevich (1530–1584); Grand Prince of Moscow and all Russia 1533-1584 & Tsar of all Russia 1547-1584).  Still, by a small margin (8.67% of the vote against 8.47%), Mr Murdoch's readers rated Hitler more evil than Bill Clinton so there was that.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

While fun and presumably an indication of something, on-line polls should not be compared with the opinion polls run by reputable universities or polling organizations, their attraction for editors looking for click-bait being that they’re essentially free and provide a result, sometimes within a day, unlike conventional polls which can cost thousands or even millions depending on the sample size and duration of research.  The central problem with on-line polls is that responders are self-selected rather than coming from a cohort determined by a statistical method developed in the wake of the disastrously inaccurate results of a poll “predicting” national voting intentions in the 1936 presidential election.  The 1936 catchment had been skewed towards the upper-income quartile by being restricted to those who answered domestic telephone connections, the devices then rarely installed in lower-income households.  A similar phenomenon of bias is evident in the differing on-line responses to the familiar question: “Who won the presidential debate?”, the divergent results revealing more about the demographic profiles of the audiences of CBS, MSNBC, CNN, ABC & FoxNews than about the dynamics on stage.

Especially among academics in the social sciences, there are many who object to the frequent, almost casual, use of “evil”, applied to figures as diverse as serial killers and those who use the “wrong” pronoun.  Rightly or not, academics can find “complexity” in what appears simple to most and don’t like “evil” because of the simple moral absolutism it implies, the suggestion certain actions or individuals are inherently or objectively wrong.  Academics call this “an over-simplification of complex ethical situations” and they prefer the nuances of moral relativism, which holds that moral judgments can depend on cultural, situational, or personal contexts.  The structuralist-behaviorists (a field still more inhabited than a first glance may suggest) avoid the word because it so lends itself to being a “label” and the argument is that labeling individuals as “evil” can be both an act of dehumanization and something which reinforces a behavioral predilection, thereby justifying punitive responses rather than attempts at rehabilitation.  Politically, it’s argued, the “evil” label permits authorities to ignore or even deny allegedly causative factors of behavior such as poverty, mental illness, discrimination or prior trauma.  Despite the intellectual scepticism, the word “evil” does seem to exert a pull and its implications are such there's really no substitute if one is trying to say certain things.  In À la recherche du temps perdu (In Search of Lost Time (1913–1927)), Marcel Proust (1871-1922) left the oft-quoted passage: “Perhaps she would not have considered evil to be so rare, so extraordinary, so estranging a state, to which it was so restful to emigrate, had she been able to discern in herself, as in everyone, that indifference to the sufferings one causes, an indifference which, whatever other names one may give it, is the terrible and permanent form of cruelty.”

There are also the associative traditions of the word, the linkages to religion and the supernatural an important part of the West’s cultural and literary inheritance but not one universally treated as “intellectually respectable”.  Nihilists of course usually ignore the notion of evil and to the post-modernists it was just another of those “lazy” words ascribing values of right & wrong which they held to be wholly subjective, evil as context-dependent as anything else.  Interestingly, in the language of the polarized world of US politics, while the notional “right” (conservatives, MAGA, some of what’s left of the Republican Party) tends to label the notional “left” (liberals, progressives, the radical factions of the Democratic Party) as evil, the left seems to depict their enemies (they’re no longer “opponents”) less as “evil” and more as “stupid”.

The POTUS & the pontiff: Francis & Donald Trump (aka the lesser of two evils), the Vatican, May 2017.

Between the pontificates of Pius XI (1857–1939; pope 1922-1939) and Francis (b 1936; pope since 2013), all that seems to have changed in the Holy See’s world view is that civilization has moved from being threatened by communism, homosexuality and Freemasonry to being menaced by Islam, homosexuality and Freemasonry.  It therefore piqued the interest of journalists accompanying the pope on his recent 12-day journey across Southeast Asia when they were told by a Vatican press secretary his Holiness would, during the scheduled press conference, discuss the upcoming US presidential election: duly, the scribes assembled in their places on the papal plane.  The pope didn’t explicitly tell people for whom they should vote, nor even make his preference as obvious as Taylor Swift (b 1989) would in the endorsement which mobilized the “childless cat lady” vote, but he did speak in an oracular way, critiquing both Kamala Harris (b 1964; US vice president since 2021) and Donald Trump (b 1946; US president 2017-2021) as “against life”, urging Catholic voters to choose the “lesser of two evils.”  That would have been a good prelude had he gone further but there he stopped: “One must choose the lesser of two evils.  Who is the lesser of two evils?  That lady or that gentleman?  I don’t know.”

Socks (1989-2009; FCOTUS (First Cat of the United States 1993-2001)) was Chelsea Clinton's (b 1980; FDOTUS (First Daughter of the United States 1993-2001)) cat.  Cartoon by Pat Oliphant, 1996.

The lesser of two evils: Australian-born US political cartoonist Pat Oliphant’s (b 1935) take on the campaign tactics of Bill Clinton (b 1946; US president 1993-2001), the Democratic Party nominee in the 1996 US presidential election against the Republican Bob Dole (1923–2021).  President Clinton won by a wide margin which would have been more handsome still had there not been a third-party candidate.  Oliphant’s cartoons are now held in the collection of the Library of Congress.  It’s not unusual for the task presented to voters in US presidential elections to be reduced to finding “the lesser of two evils”.  In 1964, when the Democrats nominated Lyndon Johnson (LBJ, 1908–1973; US president 1963-1969) to run against the Republicans' Barry Goldwater (1909–1998), the conclusion of many was it came down to “a crook or a kook”.  On the day, the lesser of the two evils proved to be crooked old Lyndon who won in a landslide over crazy old Barry.

Francis has some history of criticizing Mr Trump’s handling of immigration but the tone of his language has tended to suggest he’s more disturbed by politicians who support the provision of abortion services, although he did make clear he sees both issues in stark moral terms: “To send migrants away, to leave them wherever you want, to leave them… it’s something terrible, there is evil there. To send away a child from the womb of the mother is an assassination, because there is life. We must speak about these things clearly.”  Francis has in the past labelled abortion a “plague” and a “crime” akin to “mafia” behavior, although he did resist suggestions the US bishops should deny Holy Communion to “pro-choice” politicians (which would have included Joe Biden (b 1942; US president 2021-2025)), conscious no doubt that accusations of being an “agent of foreign interference” in the US electoral process would be of no benefit.  Despite that, he didn’t seek to prevent the bishops calling abortion “our preeminent priority” in Forming Consciences for Faithful Citizenship, the 2024 edition of their quadrennial document on voting.  Some 20% of the US electorate describe themselves as Catholics, their vote in 2020 splitting 52/47% Biden/Trump, but that was during the Roe v Wade (1973) era when abortion wasn’t quite the issue it's since become; surveys suggest a majority of the faith believe it should be available, with only around 10% absolutist right-to-lifers.  Analysts concluded Francis regards Mr Trump as less evil than Ms Harris and will be pleased if his flock votes accordingly; while he refrained from being explicit, he did conclude: “Not voting is ugly.  It is not good.  You must vote.”