
Friday, July 11, 2025

Dixiecrat

Dixiecrat (pronounced dik-see-krat)

(1) In US political history, a member of a faction of southern Democrats stressing states' rights and opposed to the civil rights programs of the Democratic Party, especially a southern Democrat who left the party in 1948 to support candidates of the States' Rights Democratic Party.

(2) In historic US use, a member of the US Democratic Party from the southern states (especially one of the former territories of the Confederacy), holding socially conservative views, supporting racial segregation and the continued entrenchment of a white hegemony.

1948: A portmanteau word of US origin, the construct being Dixie + (Demo)crat.  Wholly unrelated to other meanings, Dixie (also as Dixieland) in this context is a reference to the southern states of the United States, especially those formerly part of the Confederacy.  The origin is contested, the most supported theory being it’s derived from the Mason-Dixon Line, a historic (if not entirely accurate) delineation between the "free" North and "slave-owning" South.  Another idea is it was picked up from any of several songs with this name, especially the minstrel song Dixie (1859) by (northerner) Daniel Decatur Emmett (1815-1904), popular as a Confederate war song although most etymologists hold this confuses cause and effect, the word long pre-dating any of the known compositions.  There’s also a suggested link to the nineteenth-century nickname of New Orleans, from the dixie, a Confederate-era ten-dollar bill on which was printed the French dix (ten) but again, it came later.  The –crat suffix was from the Ancient Greek κράτος (krátos) (power, might), as used in words of Ancient Greek origin such as democrat and aristocrat; the ultimate root was the primitive Indo-European kret (hard).  Dixiecrat is a noun and Dixiecratic is an adjective; the noun plural is Dixiecrats.  The noun Dixiecratocracy (also as dixieocracy) was a humorous coining speculating about the nature of a Dixiecrat-run government; it was built on the model of kleptocracy, plutocracy, meritocracy, gerontocracy etc.

The night old Dixie died.

Former Dixiecrat, Senator Strom Thurmond (1902-2003; senator (Republican) for South Carolina 1954-2003) lies in state, Columbia, South Carolina, June 2003.

Universally called Dixiecrats, the States' Rights Democratic Party was formed in 1948 as a dissident breakaway from the Democratic Party.  Its core platform was permanently to secure the rights of states to legislate and enforce racial segregation and to exclude the federal government from intervening in these matters.  Politically and culturally, it was a continuation of the disputes and compromises which emerged in the aftermath of the US Civil War almost a century earlier.  The Dixiecrats took control of the party machine in several southern states and contested the 1948 elections with South Carolina governor Strom Thurmond as their presidential nominee but enjoyed little support outside the deep South and by 1952 most had returned to the Democratic Party.  However, in the following decades, they achieved a much greater influence as a southern faction than they ever had as a separatist party.  The shift in the south towards support for the Republican Party dates from this time and by the 1980s, the Democratic Party's control of presidential elections in the South had faded and many of the Dixiecrats had joined the Republicans.

US Electoral College map, 1948.

In the 1948 presidential election, the Dixiecrats didn’t enjoy the success polls had predicted (although that was the year of the infamous “Dewey Defeats Truman” headline and the polls got much wrong), carrying only four states, all south of the Mason-Dixon line, and not even the antics of one “faithless elector” (one selected as an elector for the Democratic ticket who instead cast his vote for the Dixiecrats) was sufficient to add Tennessee to the four won (South Carolina, Mississippi, Alabama & Louisiana).  Nor did they in other states gain sufficient support to act as “spoilers” in the way Ross Perot (1930–2019) in 1992 & 1996 and Ralph Nader (b 1934) in 2000 did, the “narrowing of margins” in specific instances being of no immediate electoral consequence in the US system.  With that, the Dixiecrats (in the sense of the structure of the States' Rights Democratic Party) in a sense vanished but as an idea they remained for decades a potent force within the Democratic Party and their history is an illustration of why the often-quoted dictum by historian Professor Richard Hofstadter (1916–1970): “The role of third parties is to sting like a bee, then die” needs a little nuance.  What the Dixiecrats did after 1948 was not die but instead undergo a kind of “resurrection without crucifixion”, emerging to “march through the institutions” of the Democratic Party, existing as its southern faction.

That role was for generations politically significant and an example of why the “third party” experience in the US historically wasn’t directly comparable with political behaviour elsewhere in the English-speaking world where “party discipline” tended to be “tight”, with votes on the floors of parliaments almost always following party lines.  Until recent years (and this is something the “Trump phenomenon” radically, if perhaps temporarily, has almost institutionalized), there was often only loose party discipline applied within the duopoly, Democrats and Republicans sometimes voting together on certain issues because the politicians were practical people who wished to be re-elected and understood what Tip O'Neill (1912–1994; (Democrat) speaker of the US House of Representatives 1977-1987) meant when he said “All politics is local”.  Structurally, that meant “third parties” could operate in the US and achieve things (for good or evil) as the Dixiecrats and later the Republicans’ Tea Party movement proved; it’s just that they do it as factions within the duopoly and that’s not unique, the Australian National Party (a re-branding of the old Country Party) really being a regional pressure group of political horse traders disguised as a political party.

US Electoral College map, 1924.

The 1924 Electoral College results were a harbinger of the later Dixiecrat movement and a graphical representation of terms such as "solid South" or "south of the Mason-Dixon Line".  At the time of the 1924 election, slavery in the South was still in living memory.  Although there was fracturing at the edges, the "solid South" did remain a Democratic Party stronghold until the civil rights legislation of the 1960s and it was the well-tuned political antennae of Texan Lyndon Johnson (LBJ, 1908–1973; US president 1963-1969) which picked up the implications and consequences of the reforms his skills had ushered through the Congress:  "I think I've just lost us the South" he was heard to remark when the Senate passed a landmark voting rights bill.

In recent years, what has changed in the US is the Republicans and Democrats have become the captive organizations of a tiny number of dedicated political operatives pursuing either their own ideological agendas or (more typically) those of whoever provides the funding.  The practical implication of that is the elections which now most matter are the primaries (where candidates for the election proper are selected) and because primary contests are voted on by a relative handful, outcomes are easier to influence and control than in general elections where there are millions to nudge.  Party discipline has thus become tighter than can often be seen on the floor of the House of Commons in the UK, not because the ideological commitments of politicians within parties have coalesced but because they’re now terrified of being “primaried” if they vote against the party line.  Re-election is a powerful inducement because the money politicians make during their careers is many, many times what might be expected given their notional earnings from salary and entitlements.  There are few easier ways to get rich, thus the incentive to “toe the party line”.  This behavioural change, mapped onto something which structurally remains unchanged, is one of the many factors which have produced a country now apparently as polarized as ever it has been.  The nature of that polarization is sometimes misunderstood because of the proliferation of “red state, blue state” maps of the US which make the contrast between the “corrupting coastlines” and “flyover states” seem so stark; each state is of course a shade of purple (some darker, some lighter) but because of the way the two parties now operate, politics as it is practiced tends to represent the extreme, radical elements which now control the machines.  So while in the last twenty-odd years there’s been much spoken about “the 1%” in the sense of the tiny number of people who own or control so much, it’s political scientists and historians who fret over the less conspicuous “1%” able to maintain effective control of the two parties, something of even greater significance because the state has put in place some structural impediments to challenging the two-party political duopoly.

In the US, the state does not (in a strict legal or constitutional sense of the word) “own” the Republican or Democratic Parties because they are “private” organizations protected by the constitution’s First Amendment (freedom of association).  However, over the years, something biologists would recognize as “symbiosis” has evolved as the state and the parties (willingly and sometimes enthusiastically) have become entangled to the extent a structural analysis would recognize the parties as quasi-public although not quite at the status familiar elsewhere as quangos (quasi autonomous non-government organizations).  Despite being “private concerns”, the parties routinely conduct state-regulated primaries to select candidates and in many cases these are funded by tax revenue and administered by state electoral instrumentalities.  Beyond that, it needs to be remembered that to speak of a “US national election” (as one might of a “UK general election”) is misleading because as a legal construct such events are really 50 elections run by each state with electoral laws not wholly aligned (thus the famous (or dreaded, depending on one’s position) Iowa caucuses) and in many states, it’s state law which regulates who can vote in party primaries, some permitting “open” primaries in which any lawfully enrolled voter is allowed to cast a ballot while others run “closed” events, restricting participation to registered members of the relevant party.  What that means is in some places a citizen can vote in either party’s primary.  That done, those who prevail in a primary are further advantaged because many states have laws setting parameters governing who may appear on a ballot paper and most of them provide an easier path for the Republican and Democratic Party candidates by virtue of both having been granted “major party” status.  As objects, the two parties, uniquely, are embedded in the electoral apparatus and the interaction of ballot access laws, debate rules and campaign finance rules means the two function as state-sponsored actors; while not quite structurally duopolistic, they operate in a protected environment with the electoral equivalent of “high tariff barriers”.

Elon Musk (left) and Donald Trump (right), with Tesla Cybertruck (AWD Foundation Series), the White House, March, 2025.  It seemed like a good idea at the time.

Given all that, Elon Musk’s (b 1971) recent announcement he was planning to launch a “third party” (actually the US has many political parties, the “third party” tag used as a synecdoche for “not one of the majors”) might seem “courageous” and surprised many who thought the experience of his recent foray into political life might have persuaded him pursuits like EVs (electric vehicles), digging tunnels (he deserves praise for naming that SpaceX spin-off “The Boring Company”) and travelling to Mars were more fulfilling.  However, Mr Musk believes the core of the country’s problems lies in the way its public finances are now run on the basis of the Dick Cheney (born 1941; US vice president 2001-2009) doctrine: “Deficits don’t matter” and, having concluded neither of the major parties is prepared to change the paradigm which he believes is leading the US to a fiscal implosion, a third party is the only obvious vehicle.  In Western politics, ever since shades of “socialism” and “capitalism” defined the democratic narrative, the idea of a “third way” has been a lure for theorists and practitioners with many interpretations of what is meant but all have in common what Mr Musk seems to be suggesting: finding the middle ground and offering it to those currently voting for one or other of the majors only because “your extremists are worse than our extremists”.  Between extremes there’s much scope for positioning (which will be variable between “social” & “economic” issues) and, given his libertarian instincts, it seems predictable Mr Musk’s economic vision will be “centre-right” rather than “centre-left” but presumably he’ll flesh out the details as his venture evolves.

Mr Musk can’t be accused of creating a “third party” because he wants to become POTUS (president of the US).  As a naturalized US citizen, Mr Musk is ineligible because Article II, Section 1, Clause 5 of the constitution restricts the office to those who are a “natural born Citizen”.  Because the US Supreme Court (USSC) has never handed down a definitive ruling on the matter it’s not absolutely certain what that phrase means but the consensus among legal scholars is it refers to someone who was at birth a US citizen.  That need not necessitate being born on the soil of the US or its territories because US citizens often are born in other countries (especially to those on military or diplomatic duty) and even in international waters; indeed, there would appear to be no constitutional impediment to someone born in outer space (or, under current constitutional interpretation, on Mars) becoming POTUS provided they were at the time of birth a US citizen.  Nor does it seem an interpretation of the word “natural” could be used to exclude a US citizen conceived through the use of some sort of “technology” such as IVF (In Vitro Fertilization).

Lindsay Lohan, potential third party POTUS.

As a naturalized US citizen, Elon Musk can’t become POTUS so his new party (tentatively called the “America” Party) will have to nominate someone else and the constitution stipulates (Article II, Section 1, Clause 5): “No Person except a natural born Citizen, or a Citizen of the United States, at the time of the Adoption of this Constitution, shall be eligible to the Office of President; neither shall any Person be eligible to that Office who shall not have attained to the Age of thirty five Years, and been fourteen Years a Resident within the United States”.  The age requirement is unambiguous and in his Commentaries on the Constitution of the United States (1833), Justice Joseph Story (1779–1845; associate justice of the US Supreme Court 1812-1845) explained the residence requirement was “…not an absolute inhabitancy within the United States during the whole period; but such an inhabitancy as includes a permanent domicil in the United States.”  That means Mr Musk can consider nominating Lindsay Lohan for president.  She’d apparently flirted with the idea of running in 2020 but at that point would have been a few months too young; on all grounds she’ll be eligible for selection in 2028 and many would be attracted to the idea of Lindsay Lohan having her own nuclear weapons.
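Reduced to logic, Clause 5 imposes three tests and, given a birthdate, checking them is simple arithmetic.  Below is a toy sketch in Python; the January 2029 inauguration date and the assumption the criteria are assessed at inauguration are mine, not anything the constitution spells out:

from datetime import date

INAUGURATION = date(2029, 1, 20)  # assumed: the inauguration following the 2028 election

def eligible_for_potus(natural_born: bool, birth: date, years_resident: float) -> bool:
    """Apply the three Article II, Section 1, Clause 5 tests."""
    age_years = (INAUGURATION - birth).days / 365.25
    return natural_born and age_years >= 35 and years_resident >= 14

# Lindsay Lohan (b July 2, 1986) will be 42 by January 2029: eligible.
print(eligible_for_potus(True, date(1986, 7, 2), 40))    # True
# A naturalized citizen such as Mr Musk fails the first test regardless of age.
print(eligible_for_potus(False, date(1971, 6, 28), 20))  # False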

Whether or not it’s “courageous” (or even “heroic”) to build a new third party in the US, time will tell, but certainly it’s ambitious; Mr Musk is also a realist and may not be planning to have a presidential candidate on the ballot in all 50 states or even contest every seat in both houses of Congress.  As he’ll have observed in a number of countries, “third parties” need neither parliamentary majorities nor executive office to achieve decisive influence over policy, some with comparatively little electoral support able to achieve “balance of power” status in legislatures provided those votes are clustered in the right places.  Additionally, because the polarized electorate has delivered such close results in the House & Senate, the math suggests a balance of power may be attainable with fewer seats than historically would have been demanded and under the US system of fixed terms, an administration cannot simply declare such a congress “unworkable” and call another election (a common tactic in the Westminster system); it must, for at least two years, work with what the people have elected, even if that includes an obstreperous third party.  Still, the challenges will be onerous, even before the “dirty tricks” departments of the major parties start searching for skeletons in the closets of third party candidates (in a rare example of bipartisanship the Republicans and Democrats will probably do a bit of intelligence-sharing on that project) and the history is not encouraging.

It was the Republican Party which in the 1850s was the last “third party” to make the transition to become a “major” and not since 1996 has such a candidate in a presidential contest secured more than 5% of the national vote.  In the Electoral College, not since 1968 has a third-party candidate carried any states and 1912 was the last time a third-party nominee finished second (and 1912 was a bit of a “special case” in which the circumstances were unusually propitious for challenges to the majors).  Still, with (1) the polls recording a general disillusionment with the major parties and institutions of state and (2) Mr Musk’s wealth able to buy much advertising and “other forms” of influence, prospects for a third party may be untypically bright in the 2028 elections and 2030 mid-terms.  There are no more elections for Donald Trump (b 1946; US president 2017-2021 and since 2025) and it seems underestimated even now just what an aberration he is in the political cycle.  While his use of techniques and tactics from other fields truly has since 2016 been disruptive, what he has done is unlikely to be revolutionary because it is all so dependent on his presence and hands on the levers of power.  When he leaves office, without the “dread and awe” the implied threat of his displeasure evokes, business may return to something closer to what we still imagine “normal” to be.

Wednesday, May 14, 2025

Psychache

Psychache (pronounced sahyk-eyk)

Psychological pain, especially when it becomes unbearable, producing suicidal thoughts.

1993: The construct was psyche- + ache.  Psychache was coined by US clinical psychologist Dr Edwin Shneidman (1918-2009) and first appeared in his book Suicide as Psychache: A Clinical Approach to Self-Destructive Behavior (1993).  The prefix psych- was an alternative form of psycho-.  Psycho was from the Ancient Greek ψυχο- (psūkho-), a combining form of ψυχή (psukhḗ) (soul).  It was used with words relating to the soul, the mind, or to psychology.  Ache was from the Middle English verb aken & noun ache, from the Old English verb acan (from the Proto-West Germanic akan, from the Proto-Germanic akaną (to ache)) and the noun æċe (from the Proto-West Germanic aki, from the Proto-Germanic akiz), both from the primitive Indo-European h₂eǵ- (sin, crime).  It was cognate with the Saterland Frisian eeke & ääke (to ache, fester), the Low German aken, achen & äken (to hurt, ache), the German Low German Eek (inflammation), the North Frisian akelig & æklig (terrible, miserable, sharp, intense), the West Frisian aaklik (nasty, horrible, dismal, dreary) and the Dutch akelig (nasty, horrible).  Historically the verb was spelled ake, and the noun ache but the spellings became aligned after Dr Johnson (Samuel Johnson (1709-1784)) published A Dictionary of the English Language (1755), the lexicographer mistakenly assuming it was from the Ancient Greek ἄχος (ákhos) (pain) due to the similarity in form and meaning of the two words.  As a noun, ache meant “a continuous, dull pain” (as opposed to a sharp, sudden, or episodic pain) while the verb was used to mean (1) to have or suffer a continuous, dull pain, (2) to feel great sympathy or pity and (3) to yearn or long for someone or something.  Psychache is a noun; the noun plural is psychaches.

Psychache is a theoretical construct used by clinical suicidologists and differs from psychomachia (conflict of the soul).  Psychomachia was from the Late Latin psȳchomachia, the title of a poem of a thousand-odd lines (circa 400) by the Roman Christian poet Prudentius (Aurelius Prudentius Clemens; 348-circa 412), the construct being the Ancient Greek psukhē (spirit) + makhē (battle).  The fifth century poem Psychomachia (translated usually as “Battle of Spirits” or “Soul War”) explored a theme familiar in Christianity: the eternal battle between virtue & vice (onto which can be mapped “right & wrong”, “good & evil” etc) and culminated in the forces of Christendom vanquishing pagan idolatry to the cheers of a thousand Christian martyrs.  An elegant telling of an allegory familiar in early Christian literature and art, Prudentius made clear the battle was one which happened in the soul of all people and thus one which all needed to wage, the outcome determined by whether the good or evil in them proved stronger.  The poem’s characters include Faith, Hope, Industry, Sobriety, Chastity, Humility & Patience among the good and Pride, Wrath, Paganism, Avarice, Discord, Lust & Indulgence in the ranks of the evil but scholars of literature caution that although the personifications all are women, in Latin, words for abstract concepts use the feminine grammatical gender and there’s nothing to suggest the poet intended us to read this as a tale of bolshie women slugging it out.  Of interest too is the appearance of the number seven, so familiar in the literature and art of Antiquity and the Medieval period as well as the Biblical texts but although Prudentius has seven virtues defeat seven vices, the characters don’t exactly align with either the canonical seven deadly sins or the three theological and four cardinal virtues.  In modern use, the linguistic similarity between psychache and psychomachia has made the latter attractive to those seduced by the (not always Germanic) tradition of the “romance of suicide”.

A pioneer in the field of suicidology, Dr Shneidman’s publication record was indicative of his specialization.

Dr Edwin Shneidman (1918-2009) was a clinical psychologist who practiced as a thanatologist (a practitioner in the field of thanatology, the scientific study of death and the practices associated with it, including the study of the needs of the terminally ill and their families); the construct of thanatology being thanato- (from the Ancient Greek θάνατος (thánatos) (death)) + -logy.  The suffix -ology was formed from -o- (as an interconsonantal vowel) + -logy.  The origin in English of the -logy suffix lies with loanwords from the Ancient Greek, usually via Latin and French, where the suffix (-λογία) is an integral part of the word loaned (eg astrology from astrologia) since the sixteenth century.  French picked up -logie from the Latin -logia, from the Ancient Greek -λογία (-logía).  Within Greek, the suffix is an -ία (-ía) abstract from λόγος (lógos) (account, explanation, narrative), and that a verbal noun from λέγω (légō) (I say, speak, converse, tell a story).  In English the suffix became extraordinarily productive, used notably to form names of sciences or disciplines of study, analogous to the names traditionally borrowed from the Latin (eg astrology from astrologia; geology from geologia) and by the late eighteenth century, the practice (despite the disapproval of the pedants) extended to terms with no connection to Greek or Latin such as those building on French or German bases (eg insectology (1766) after the French insectologie; terminology (1801) after the German Terminologie).  Within a few decades of the intrusion of modern languages, combinations emerged using English terms (eg undergroundology (1820); hatology (1837)).  In this evolution, the development may be thought similar to the latter-day proliferation of “-isms” (fascism; feminism etc).

Death and the College Student: A Collection of Brief Essays on Death and Suicide by Harvard Youth (1973) by Dr Edwin Shneidman.  Dr Shneidman wrote many papers about the prevalence of suicide among college-age males, a cross-cultural phenomenon.

Dr Shneidman was one of the seminal figures in the discipline of suicidology, in 1968 founding the AAS (American Association of Suicidology) and the principal US journal for suicide studies: Suicide and Life-Threatening Behavior.  The abbreviation AAS is in this context used mostly within the discipline because (1) it is a specialized field and (2) there are literally dozens of uses of “AAS”.  In Suicide as Psychache: A Clinical Approach to Self-Destructive Behavior (1993) he defined psychache as “intense psychological pain—encompassing hurt, anguish, and mental torment”, identifying it as the primary motivation behind suicide, his theory being that when psychological pain becomes unbearable, individuals may perceive suicide as their only escape from torment.

Although since Suicide as Psychache: A Clinical Approach to Self-Destructive Behavior appeared in 1993 there have been four editions of the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM), “psychache” has never appeared in the DSM.  That may seem an anomaly given much in the DSM revolves around psychological disturbances but the reason is technical.  What the DSM does is list and codify diagnosable mental disorders (depression, schizophrenia, bipolar disorder etc), classifying symptoms and behaviors into standardized categories for diagnosis and treatment planning.  By contrast, psychache is not a clinical diagnosis; it is a theoretical construct in suicidology which is used to explain the subjective experience of psychological pain that can lead to patients taking their own lives.  It thus describes an emotional state rather than a psychiatric disorder.

Lindsay Lohan and her lawyer in court, Los Angeles, December, 2011.

Despite that, mental health clinicians do actively use the principles of psychache, notably in suicide risk assessment and prevention, and models have been developed including a number of “psychache scales”, self-reporting tools used to generate a metric measuring the intensity of psychological pain (categorized with headings such as shame, guilt, despair et al).  The approaches do in detail differ but most follow Dr Shneidman’s terminology in that the critical threshold is the point at which the patient’s pain becomes unbearable or inescapable and the objective is either to increase tolerance for distress or reframe troublesome thoughts.  Ultimately, the purpose of the tools is to improve suicide risk assessments and reduce suicide rates.
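Mechanically, such scales are simple: Likert-type item responses are summed and the total compared with a cut-off.  The Python sketch below is an illustration only; the item headings, the 1-5 response range and the threshold are invented for the example, not taken from Shneidman’s work or any published instrument:

# Illustrative only: items, response range and cut-off are hypothetical.
ITEMS = ["shame", "guilt", "despair", "anguish"]
THRESHOLD = 15  # invented cut-off standing in for "unbearable" pain

def score_scale(responses: dict[str, int]) -> tuple[int, bool]:
    """Sum 1-5 Likert responses and flag totals at or above the cut-off."""
    for item in ITEMS:
        if not 1 <= responses[item] <= 5:
            raise ValueError(f"response for {item!r} outside the 1-5 range")
    total = sum(responses[item] for item in ITEMS)
    return total, total >= THRESHOLD

total, flagged = score_scale({"shame": 4, "guilt": 3, "despair": 5, "anguish": 4})
print(total, flagged)  # 16 True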

DSM-5 (2013).

Interestingly, Suicidal Behavior Disorder (SBD) was introduced in Section III of the DSM-5 (2013) under “Conditions for Further Study”.  Then, SBD chiefly was characterized by a self-initiated sequence of behaviors believed at the time of initiation to cause one’s own death and occurring in the last 24 months.  That of course sounds exact but the diagnostic criteria in the DSM are written like that and the purpose of inclusion in the fifth edition was to create a framework so that empirical studies related to SBD systematically could be reviewed and primary research themes and promising directions for future research identified.  Duly, over the following decade that framework was explored but the conclusion was reached there seemed to be little clinical utility in SBD as a device for predicting future suicide and that more research was needed to understand measurement of the diagnosis and its distinctiveness from related disorders and other self-harming behaviors.  The phrase “more research is required” must be one of the most frequently heard among researchers.

In the usual manner in which the APA allowed the DSM to evolve, what the DSM-5’s tentative inclusion of SBD did was attempt to capture suicidality as a diagnosis rather than a clinical feature requiring attention.  SBD was characterized by a suicide attempt within the last 24 months (Criterion A) and that was defined as “a self-initiated sequence of behaviors by an individual who, at the time of initiation, expected that the set of actions would lead to his or her own death”.  That sounds uncontroversial but what was significant was the act could not meet the criteria for non-suicidal self-injury (ie self-injury with the intention to relieve negative feelings or a cognitive state in order to achieve a positive mood state) (Criterion B) and cannot be applied to suicidal ideation or preparatory acts (Criterion C).  Were the attempt to have occurred during a state of delirium or confusion or solely for political or religious objectives, then SBD is ruled out (Criteria D & E).  SBD (current) is given when the suicide attempt occurred within the last 12 months and SBD (in early remission) when it has been 12-24 months since the last attempt.  It must be remembered that while a patient’s behavior(s) may overlap across a number of the DSM’s diagnoses, the APA’s committees have, for didactic purposes, always preferred to “silo” the categories.

DSM-5-TR (2022).

When in 2022 the “text revision” of the DSM-5 (DSM-5-TR) was released, SBD was removed as a condition for further study in Section III and moved to “Other Conditions That May Be a Focus of Clinical Attention” in Section II.  The conditions listed in this section are intended to draw the attention of clinicians to the presence and breadth of additional issues routinely encountered in clinical practice and provide a procedure for their systematic documentation.  According to the APA’s editorial committee, the rationale for the exclusion of SBD from the DSM-5-TR was based on concerns the proposed disorder did not meet the criteria for a mental disorder but instead constituted a behavior with diverse causes and while that distinction may escape most of us, within the internal logic of the history of the DSM, it’s wholly consistent.  At this time, despite many lobbying for the adoption of a diagnostic entity for suicidal behavior, the APA’s committees seem still more inclined to conceptualize suicidality as a symptom rather than a disorder and despite discussion in the field of suicidology about whether suicide and related concepts like psychache should be treated as stand-alone mental health issues, that’s a leap which will have to wait, at least until a DSM-6 is published.

How to and how not to: Informatie over Zorgvuldige Levensbeëindiging (Information about the Careful Ending of Life, 2008) by Stichting Wetenschappelijk Onderzoek naar Zorgvuldige Zelfdoding (The Foundation for Scientific Research into Careful Suicide) (left) and How Not to Kill Yourself: A Phenomenology of Suicide (2023) by Clancy Martin (right).

Informatie over Zorgvuldige Levensbeëindiging (Information about the Careful Ending of Life, 2008) was published by a group of Dutch physicians & researchers; it contained detailed advice on methods of suicide available to the general public, the Foundation for Scientific Research into Careful Suicide arguing “a requirement exists within society for responsible information about an independent and dignified ending of life.”  It could be ordered only from the foundation’s website and had the advantage that, whatever might be one’s opinion on the matter, it was at least written by physicians and scientists and thus more reliable than some of the “suicide guides” which are sometimes found on-line.  At the time, research by the foundation had found that despite legislation in the Netherlands which permits doctors (acting within specific legal limits) to assist patients to commit suicide, there were apparently several thousand cases each year of what it termed “autoeuthanasia” in which no medical staff directly were involved.  Most of these cases involved elderly or chronically ill patients who refused food and fluids and it was estimated these deaths happened at about twice the rate of those carried out under the euthanasia laws.  Since then the Dutch laws have been extended to include those who have no serious physical disease nor suffer great pain; there are people who simply no longer wish to live, something like the tragic figure in Blue Öyster Cult’s (Don't Fear) The Reaper (1976) © Donald Roeser (b 1947):

Came the last night of sadness
And it was clear she couldn't go on
Then the door was open and the wind appeared
The candles blew then disappeared
The curtains flew then he appeared
Saying don't be afraid

There is a diverse literature on various aspects of suicide (tips and techniques, theological & philosophical interpretations, cross-cultural attitudes, history of its treatment in church & secular law etc) and some works are quite personal, written variously by those who later would kill themselves or those who contemplated or attempted to take their own lives.  In How Not to Kill Yourself: A Phenomenology of Suicide (2023) by Canadian philosopher Clancy Martin (b 1967), it was revealed the most recent of his ten suicide attempts was “…in his basement with a dog leash, the consequences of which he concealed from his wife, family, co-workers, and students, slipping back into his daily life with a hoarse voice, a raw neck and series of vague explanations.”

BKA (the Bundeskriminalamt, the Federal Criminal Police Office of the FRG (Federal Republic of Germany, the old West Germany)) mug shots of the Red Army Faction's Ulrike Meinhof (left) and Gudrun Ensslin (right).

The song (Don't Fear) The Reaper also made mention of William Shakespeare's (1564–1616) Romeo and Juliet (1597) and in taking her own life (using her dead lover’s dagger) because she doesn’t want to go on living without him, Juliet joined the pantheon of figures who have made the tragedy of suicide seem, to some, romantic.  Politically too, suicide can grant the sort of status dying of old age doesn’t confer, the deaths of left-wing terrorists Ulrike Meinhof (1934–1976) and Gudrun Ensslin (1940–1977) of the West German Red Army Faction (the RAF and better known as the “Baader-Meinhof gang”) both recorded as “suicide in custody” although the circumstances were murky.  In an indication of the way moral relativities aligned during the high Cold War, the French intellectuals Jean-Paul Sartre (1905–1980) and Simone de Beauvoir (1908–1986) compared their deaths to the worst crimes of the Nazis but sympathy for violence committed for an “approved” cause was not the exclusive preserve of the left.  In July 1964, in his speech accepting the Republican nomination for that year’s US presidential election, proto-MAGA Barry Goldwater (1909–1998) concluded by saying: “I would remind you that extremism in the defense of liberty is no vice!  And let me remind you also that moderation in the pursuit of justice is no virtue!”  The audience response to that was rapturous although a few months later the country mostly didn’t share the enthusiasm, Lyndon Johnson (LBJ, 1908–1973; US president 1963-1969) winning the presidency in one of the greatest landslides in US electoral history.  Given the choice between crooked old Lyndon and crazy old Barry, Americans preferred the crook.

Nor was it just politicians and intellectuals who couldn’t resist the appeal of politics being taken to its logical “other means” conclusion, the Canadian singer-songwriter Leonard Cohen (1934-2016) during the last years of the Cold War writing First We Take Manhattan (1986), the lyrics of which were open to interpretation but clarified in 1988 by the author who explained: “I think it means exactly what it says.  It is a terrorist song.  I think it's a response to terrorism.  There's something about terrorism that I've always admired.  The fact that there are no alibis or no compromises.  That position is always very attractive.”  Even in 1988 it was a controversial comment because by then not many outside of undergraduate anarchist societies were still romanticizing terrorists but in fairness to the singer the coda isn’t as often published: “I don't like it when it's manifested on the physical plane – I don't really enjoy the terrorist activities – but Psychic Terrorism.”

First We Take Manhattan (1986) by Leonard Cohen

They sentenced me to twenty years of boredom
For tryin' to change the system from within
I'm coming now, I'm coming to reward them
First we take Manhattan, then we take Berlin
 
I'm guided by a signal in the heavens
I'm guided by this birthmark on my skin
I'm guided by the beauty of our weapons
First we take Manhattan, then we take Berlin
 
I'd really like to live beside you, baby
I love your body and your spirit and your clothes
But you see that line there moving through the station?
I told you, I told you, told you, I was one of those
 
Ah you loved me as a loser, but now you're worried that I just might win
You know the way to stop me, but you don't have the discipline
How many nights I prayed for this, to let my work begin
First we take Manhattan, then we take Berlin
 
I don't like your fashion business, mister
And I don't like these drugs that keep you thin
I don't like what happened to my sister
First we take Manhattan, then we take Berlin
 
I'd really like to live beside you, baby
I love your body and your spirit and your clothes
But you see that line there moving through the station?
I told you, I told you, told you, I was one of those



First We Take Manhattan performed by Jennifer Warnes (b 1947), from the album Famous Blue Raincoat (1986).

Whatever they achieved in life, it was their suicides which lent a lingering allure to German-American ecofeminist activist Petra Kelly (1947–1992) & the doomed American poet Sylvia Plath (1932-1963) and the lure goes back millennia, the Roman poet Ovid (Publius Ovidius Naso; 43 BC–17 AD) in his Metamorphoses telling an ancient Babylonian tale in which Pyramus, in dark despair, killed himself after wrongly concluding his young love Thisbe was dead.  Over the centuries it’s been a recurrent trope but the most novel take was the symbolic, mystical death in Richard Wagner's (1813–1883) Tristan und Isolde (1865).  Mortally wounded in a duel before the final act, Tristan longs to see Isolde one last time but just as she arrives at his side, he dies in her arms.  Overwhelmed by love and grief, Isolde sings the famous Liebestod (Love-Death) and dies, the transcendent aria interpreted as the swansong which carries her to join Tristan in mystical union in the afterlife.  This, lawyers would call a “constructive suicide”.

Austrian soprano Helga Dernesch (b 1939) in 1972 performing the Liebestod aria from Wagner’s Tristan und Isolde with the Berlin Philharmonic under Herbert von Karajan (1908–1989).

While she didn’t possess the sheer power of the greatest of the Scandinavian sopranos who in the mid-twentieth century defined the role, Dernesch brought passion and intensity to her roles and while, on that night in 1972, the lushness of what Karajan summoned from the strings was perhaps a little much, her Liebestod was spine-tingling and by then, Karajan had been forgiven for everything.  Intriguingly, although Tristan und Isolde is regarded as one of the great monuments to love, in 1854 Wagner had written to the Hungarian composer Franz Liszt (1811–1886) telling him:

As I have never in life felt the real bliss of love, I must erect a monument to the most beautiful of all my dreams, in which, from beginning to end, that love shall be thoroughly satiated.  I have in my head ‘Tristan and Isolde’, the simplest but most full-blooded musical conception; with the ‘black flag’ which floats at the end of it I shall cover myself to die.

It’s not known whether Liszt reflected on this apparent compositional self-medication for psychache after learning in 1870 from his morning newspaper that his daughter Cosima (1837-1930) was to be married to Wagner (then 24 years her senior) but because she’d been for some seven years conducting an adulterous affair with the German, the news may not have been unexpected.  He was aware Cosima’s daughter (Isolde Beidler (1865–1919)) had been fathered not by her then husband (the German conductor Hans von Bülow (1830–1894)) but by Wagner and her second marriage proved happier than the first, so there was that.

Sunday, May 4, 2025

Decalcomania

Decalcomania (pronounced dih-kal-kuh-mey-nee-uh or dih-kal-kuh-meyn-yuh)

(1) The process of transferring designs from specially prepared paper to cardboard, paper, wood, metal, china, glass etc.

(2) A design so transferred (always rare).

1864: From the French décalcomanie, the construct being décalc- (representing décalquer (to trace, transfer (a design)), itself built from dé- (in the sense of “off”) + calquer (to press)) + the interfix -o- + -manie (–mania).  Decalcomania is a noun; the noun plural is decalcomanias (in French, décalcomanies).  Disappointingly, the noun decalcomaniac is non-standard.

The French prefix dé- partly was inherited from the Middle French des-, from the Old French des-, from a conflation of the Latin dis- (apart) (ultimately from the primitive Indo-European dwís).  In English, the de- prefix was from the Latin dē-, from the preposition dē (of, from (the Old English æf- was a similar prefix)).  It imparted the sense of (1) reversal, undoing, removing, (2) intensification and (3) derived from; of, off.  In French the dé- prefix was used to make antonyms (as un- & dis- function in English) and was partially inherited from the Old and Middle French des-, from the Latin dis- (apart), the ultimate source being the primitive Indo-European dwís, and partially borrowed from the Latin dē-.  In English de- became a most active word-forming element, used with many verbs in some way gained from French or Latin.  The frequent use in Latin in the sense of “down, down from, from, off; down to the bottom & totally” (hence “completely” (intensive or completive)) came to be reflected in many English words.  As a Latin prefix it was used also to “undo” or “reverse” a verb's action; it thus came to be used as a pure privative (ie “not, do the opposite of, undo”) and that remains the predominant function as a living prefix in English such as defrost (1895 and a symbol of the new age of consumer-level refrigeration), defuse (1943 and thus obviously something encouraged by the sudden increase in live bombs in civilian areas which needed the fuses to be removed to render them safe) and de-escalate (1964, one of the first linguistic contributions of the political spin related to the war in Vietnam).  In many cases, there is no substantive difference between using de- or dis- as a prefix and the choice can be simply one of stylistic preference.  Calquer (to press) was from the Italian calcare, from the Latin calcāre (to tread on; to press (that sense derived from calx (heel))).

The suffix –mania was from the Latin mania, from the Ancient Greek μανία (manía) (madness).  In modern use in psychiatry it is used to describe a state of abnormally elevated or irritable mood, arousal, and/or energy levels and as a suffix is appended as required.  In general use, under the influence of the historic meaning (violent derangement of mind; madness; insanity), it’s applied to describe any “excessive or unreasonable desire; a passion or fanaticism”, which can be used even of unthreatening behaviors such as “a mania for flower arranging, crochet etc”.  As a suffix, it’s often appended with the interfix -o- to make pronunciation more natural.  The use of the suffix “-mania” in “decalcomania” may appear a curious use of an element in a word describing a process in graphical or decorative art given usually it’s appended to reference a kind of obsession or madness (kleptomania, bibliomania, megalomania etc) but here it’s used in a more abstract way.  The “-manie” in the French décalcomanie was used to suggest a fad or craze (the latter in the sense of something suddenly widely popular) and was not related to the way “mania” is used by mental health clinicians.  So, it was metaphorical rather than medical, rather as “Tulipmania” came to be used of the seventeenth century economic bubble in the Netherlands which centred on the supply of and demand for tulip bulbs.

TeePublic’s Lindsay Lohan decals (page one).

The noun decal (pronounced dee-kal or dih-kal) was in use by at least 1910 as a clipping of decalcomania, a process which came into vogue in France as early as the 1840s before crossing the channel, England taking up the trend in the early 1860s.  As a noun it referred to (1) the prepared paper (or other medium) bearing an image, text, design etc for transfer to another surface (wood, metal, glass, etc) or (2) the picture or design itself.  The verb (“to decal” and also as decaled or decaling) described the process of applying or transferring the image (or whatever) from the medium by decalcomania.  The noun plural is decals.  In the US, the word came to be used of adhesive stickers which could be promotional or decorative and this use is now common throughout the English-speaking world.  The special use (by analogy) in computer graphics describes a texture overlaid atop another to provide additional detailing.
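In that computer-graphics sense, applying a decal is no more than alpha-compositing a small texture over part of a base texture.  Below is a minimal sketch in Python with NumPy; the array shapes and names are assumptions for the illustration, not any particular engine’s API:

import numpy as np

def apply_decal(base, decal_rgb, decal_alpha, x, y):
    """Alpha-composite a decal texture over a region of a base texture.

    base: (H, W, 3) floats in [0, 1]; decal_rgb: (h, w, 3); decal_alpha: (h, w),
    where 0 is transparent; (x, y) is the decal's top-left corner on the base.
    """
    h, w = decal_alpha.shape
    region = base[y:y + h, x:x + w]
    a = decal_alpha[..., None]  # broadcast alpha across the RGB channels
    base[y:y + h, x:x + w] = a * decal_rgb + (1.0 - a) * region
    return base

# A grey 64x64 "wall" with a red 16x16 decal, opaque only in its centre.
wall = np.full((64, 64, 3), 0.5)
decal = np.zeros((16, 16, 3)); decal[..., 0] = 1.0
alpha = np.zeros((16, 16)); alpha[4:12, 4:12] = 1.0
wall = apply_decal(wall, decal, alpha, x=10, y=10)
print(wall[18, 18], wall[0, 0])  # [1. 0. 0.] against [0.5 0.5 0.5]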

Variants of the transfer technique which came to be called decalcomania would for centuries have been used by artists before it became popularized in the mid-nineteenth century.  The method was simply to spread ink or paint onto a surface and, before the substances dried, cover it with material such as paper, glass, or metallic foil, which, when removed, transferred the pattern which could be left in that form or embellished.  Originally the designs were deliberate but the innovation of the Surrealists was to create imagery by chance rather than conscious control of the materials.  The artistic merits of that approach can be discussed but young children have long taken to it like ducks to water, splashing colors on one side of a piece of paper and then folding it in half so, once pressed together, the shape is “mirrored”, creating what is called a “butterfly print” (a sketch of the idea follows), something like the cards used in the Rorschach tests.
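Computationally, the child’s “fold” is just a horizontal mirror: blots are splashed on one half of the “paper” and reflected onto the other.  A toy sketch (Python with NumPy; every name is invented for the illustration):

import numpy as np

rng = np.random.default_rng(seed=1859)

def butterfly_print(height=64, width=64, blots=12):
    """Splash random circular "paint" blots on the left half, then mirror it right."""
    img = np.ones((height, width, 3))  # white paper, RGB values in [0, 1]
    half = width // 2
    yy, xx = np.ogrid[:height, :half]
    for _ in range(blots):
        y, x, r = rng.integers(0, height), rng.integers(0, half), rng.integers(2, 8)
        img[:, :half][(yy - y) ** 2 + (xx - x) ** 2 <= r ** 2] = rng.random(3)
    img[:, half:] = img[:, :half][:, ::-1]  # the fold: mirror the left half
    return img

print(butterfly_print().shape)  # (64, 64, 3)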

Although an ancient practice, it is the French engraver Simon François Ravenet (1706–circa 1774) who is credited with giving the technique its name because he called it décalquer (from the French papier de calque (tracing paper)) and this coincided with painters in Europe experimenting with ink blots to add “accidental” forms of expression into their work.  Ravenet spent years working in England (where usually he was styled Simon Francis Ravenet) and was influential in the mid-century revival of engraving.  It was in ceramics decalcomania first became popular although the word didn’t come into wide use until adopted by the Spanish-born French surrealist Óscar Domínguez (1906–1957).  It was perhaps the German Dadaist and Surrealist Max Ernst (1891–1976) who more than most exemplified the possibilities offered by decalcomania and it was the US philosopher turned artist Robert Motherwell (1915–1991) who said of him: “Like every consequential modern painter, Max Ernst has enforced his own madness on the world.”  Motherwell was of the New York School (which also included the Russian-born Mark Rothko (1903–1970), drip painter Jackson Pollock (1912-1956) and the Dutch-American Willem de Kooning (1904–1997)) so he was no stranger to the observation of madness.  Condemned by the Nazis variously as an abstractionist, modernist, Dadaist and Surrealist, Ernst fled to Paris and after the outbreak of World War II (1939-1945) he was one of a number of artistic and political figures who enjoyed the distinction of being imprisoned by both the French and the Gestapo; it was with the help of the US art patron and collector Peggy Guggenheim (1898–1979) that in 1941 he escaped Vichy France and fled to the US.

That “help” involved their marriage, hurriedly arranged shortly after the pair landed in New York and although in the technical sense a “marriage of convenience”, she does seem genuinely to have been fond of Ernst and some romantic element wasn’t entirely absent from their relationship although it’s acknowledged it was a “troubled” marriage.  A divorce was granted in 1946 but artistically, she remained faithful, his work displayed prominently in her New York gallery (Art of This Century (1942–1947)), then the city’s most significant centre of the avant-garde.  Through this exposure, although he never quite became integrated into the (surprisingly insular) circle of abstract expressionists, Ernst not only became acquainted with the new wave of American artists but contributed also to making European modernism familiar to Americans at a time when the tastes of collectors (and many critics) remained conservative.  He was an important element in her broader mission to preserve and promote avant-garde art despite the disruption of war.  So, the relationship was part patronage and part curatorial judgment and historians haven’t dwelt too much on the extent to which it was part love; even after their divorce, Guggenheim continued to collect pieces by Ernst and they remain in her famous “Venice Collection” at the Palazzo Venier dei Leoni.  As a wife she would have had opinions of her husband but as a critic she also classified and never said of Ernst what she said of Pollock: “...the greatest painter since Picasso.”

Untitled (1935), Decalcomania (ink transfer) on paper by André Breton.

For Ernst, the significance of decalcomania was not its utility as a tool of production (as it would appeal to graphic artists and decal-makers) but as something which would result in a randomness to excite his imagination.  What he did was use the oil paint as it ended up on canvas after being “pressed” merely as the starting point, onto which he built elements of realism, suggesting often mythical creatures in strange, unknown places but that was just one fork of decalcomania, Georges Hugnet (1906–1974) rendering satirical images from what he found while André Breton (1896–1966 and a “multi-media” figure decades before the term emerged) used the technique to hone surrealism, truly decalcomania’s native environment.

Decalcomania in psychiatry and art: Three of the ink-blot cards (top row) included by Swiss psychiatrist Hermann Rorschach (1885-1922) in his Rorschach Test (1921), a projective psychological tool in which subjects' perceptions of inkblots are recorded and then analyzed with psychological interpretation or historical statistical comparison (and now, also AI (artificial intelligence)) and three images from the Pornographic Drawing series by Cornelia Parker (bottom row).

Nor has decalcomania been abandoned by artists, English installation specialist Cornelia Parker (b 1956) producing drawings which overlaid contemporary materials onto surfaces created with the decalcomania process, the best known of which was the series Pornographic Drawing (1996) in which an inky substance extracted from pornographic film material was applied to paper, folded in half and opened again to reveal the sexualised imagery which emerged through the intervention of chance.  Although it’s speculative, had Ms Parker’s work been available and explained to the Nazi defendants at the first Nuremberg Trial (1945-1946) when they were considering the Rorschach Test cards, their responses would likely have been different.  Rudolf Hess (1894–1987; Nazi Deputy Führer 1933-1941) would have been disgusted and become taciturn while Julius Streicher (1885–1946; Nazi Gauleiter of Franconia 1929-1940) would have been stimulated to the point of excitement.

Europe after the Rain II, 1940-1942 (Circa 1941), oil on canvas by Max Ernst.

Regarded as his masterpiece, Europe after the Rain II (often sub-titled “An Abstract, Apocalyptic Landscape”) was intended to evoke feelings of despair, exhaustion, desolation and a fear of the implications of the destructive power of modern, mechanized warfare.  It was a companion work to the earlier Europe after the Rain I (1933), sculpted from plaster and oil on plywood, in which Ernst built on a decalcomania base to render an imaginary relief map of Europe.  It was in 1933 Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) gained power in Germany.

Europe after the Rain I (1933), oil & plaster on plywood by Max Ernst.

Even the physical base of Europe After the Rain I was a piece of surrealist symbolism, the plywood taken from the stage sets used for the film L'Âge d'or (1930) (The Age of Gold or the Golden Age depending on the translator's interpretation).  Directed by Spaniard Luis Buñuel (1900-1983), L'Âge d'or was a film focused on the sexual mores of bourgeois society and a critique of the hypocrisies and contradictions of the Roman Catholic Church's clerical establishment.  While one of France's first "sound films", it was, as was typical during what was a transitional era, told mostly with the use of title cards, the full-screen explanatory texts which appeared between scenes.

Snow Flowers (1929) oil on canvas by means of frottage & grattage by Max Ernst.

Technically, Ernst was an innovator in decalcomania and its relatives, in 1925 adopting the technique of frottage (laying a sheet of paper over a textured surface and rubbing it with charcoal or graphite).  The appeal of this was it imparted the quality of three-dimensionality and Ernst liked textured surfaces as passages in a larger composition.  He also employed grattage (frottage’s sister technique) in which an object is placed under a piece of paper, which is then covered with a thin layer of pigment; once the pigment is scraped off, what is revealed is a colorful imprint of the object and its texture.

1969 Chrysler (Australia) VF Valiant Pacer 225 (left), 1980 Porsche 924 Turbo (centre) and cloisonné Scuderia Ferrari fender shield on 1996 Ferrari F355 Spider (right).

There was a time when decals on cars were, by some, looked down upon because they were obviously cheaper than badges made of metal.  That attitude changed for a number of reasons including their use on sexy, high-performance cars, the increasing use of decals on race cars after advertising became universally permitted after 1968 and the advent of plastic badges which, being cheaper to produce and affix, soon supplanted metal on all but the most expensive vehicles.  By the mid 1970s, even companies such as Porsche routinely applied decals and the Scuderia Ferrari fender shield, used originally on the cars run by the factory racing team, became a popular after-market accessory and within the Ferrari community, there was a clear hierarchy of respectability between thin, “stuck on” printed decals and the more substantial cloisonné items.

A video clip explaining why a Scuderia Ferrari fender shield costs US$14,000 if it's painted in the factory.

However, many of the cloisonné shields were non-authentic (ie not a factory part number), even the most expensive selling for less than US$1000 and there was no obvious way to advertise one had a genuine “made in Maranello” item.  Ferrari’s solution was to offer as a factory option a form of decalcomania, hand-painted by an artisan in a process said to take about eight hours.  To reassure its consumers (keen students of what the evil Montgomery Burns (of The Simpsons TV cartoon series) calls “price taggery”), the option is advertised (depending on the market) at around US$14,000.