
Monday, June 2, 2025

Asperger

Asperger (pronounced a-spuh-guh or a-spr-gr)

(1) In neo-paganism and modern witchcraft, a ceremonial bundle of herbs or a perforated object used to sprinkle water (in spells as “witches water”), usually at the commencement of a ritual.

(2) In neurology, as Asperger's syndrome (less commonly Asperger syndrome), an autism-related developmental disorder characterised by sustained impairment in social interaction and non-verbal communication and by repetitive behaviour as well as restricted interests and routines.  The condition was named after Austrian pediatrician Hans Asperger (1906–1980).

Pre-1300: The surname Asperger was of German origin and was toponymic (derived from a geographical location or feature).  The town of Asperg lies in what is now the district of Ludwigsburg, Baden-Württemberg, in south-west Germany and in German, appending the suffix “-er” can denote being “from a place”; Asperger thus deconstructs as “someone from Asperg” and in modern use would suggest ancestral ties to the town of Asperg or a similar-sounding locality.  Etymologically, Asperg may be derived from older Germanic or Latin roots, possibly meaning “rough hill” or “stony mountain” (the Latin asper meaning “rough” and the German Berg meaning “mountain or hill”).  The term “Asperger’s syndrome” was in 1981 coined by English psychiatrist Lorna Wing (1928–2014), acknowledging the work of Austrian pediatrician Hans Asperger (1906–1980).  Dr Wing was instrumental in the creation of the National Autistic Society, a charity which has operated since 1962.  Asperger is a noun (capitalized if in any context used as a proper noun).  Aspergerian & Aspergic are nouns, the noun plural forms being Aspergers, Aspergerians & Aspergics.  In the literature, Aspergerian & Aspergic appear both to have been used as (1) adjectives (of, related to, or having qualities similar to those of Asperger's syndrome) and (2) nouns (someone with Asperger's syndrome).  In general use “Asperger's” was the accepted ellipsis of Asperger's syndrome while the derogatory slang forms included aspie, autie, sperg, sperglord & assburger, now all regarded as offensive in the same way “retard” is now proscribed.

The noun asperges described a sprinkling ritual of the Catholic Church; the name was applied also to an antiphon intoned or sung during the ceremony.  It was from the Late Latin asperges, noun use of the second-person singular future indicative of aspergere (to scatter, strew upon, sprinkle), the construct being ad (to, towards, at) + spargere (to sprinkle).  The use in Church Latin was a learned borrowing from the Latin aspergō (to scatter or strew something or someone; to splash over; to spot, stain, sully, asperse; besmirch; (figuratively) to bestow, bequeath something to, set apart for), the construct being ad- + spargō (strew, scatter; sprinkle; moisten).  The origin lay in the phrase Asperges me, Domine, hyssopo et mundabor (Thou shalt sprinkle me, O Lord, with hyssop, and I shall be cleansed), from the 51st Psalm (in the Vulgate), sung during the rite of sprinkling a congregation with holy water.  Hyssop (any of a number of aromatic bushy herbs) was from the Latin hȳsōpum, from the Ancient Greek ὕσσωπος (hússōpos), of Semitic origin, and the idea was that one would be cleansed of one’s sins.  In Old English the loan-translation of the Latin aspergere was onstregdan.

The three most recent popes demonstrate their aspergillum (also spelled aspergill) technique while performing the sprinkling rite.  In the more elaborate rituals, it's often used in conjunction with a container called an aspersorium (holy water bucket).  Benedict XVI (1927–2022; pope 2005-2013, pope emeritus 2013-2022, left), Francis (1936-2025; pope 2013-2025, centre) and Leo XIV (b 1955; pope since 2025, right).

In the Christian liturgy, an aspergillum was used to sprinkle holy water and the borrowing, adaptation and re-purposing of ceremonies, feast days and such from paganism widely was practiced by the early Church.  In the Bible (notably chapter 14 of the Old Testament’s Book of Leviticus) there are descriptions of purification rituals involving the use of cedar wood, hyssop, and scarlet wool to create an instrument for sprinkling blood or water and historians sometimes cite this as a “proto-aspergillum”.  While it seems the earliest known use in English of “aspergillum” dates from 1649, the documentary evidence is clear the practice in the Christian liturgy was ancient, common since at least the tenth century.  Exactly when the ritualistic practice began isn’t known but because water is so obviously something used “to cleanse”, it’s likely it was a part of religious rituals for millennia before Christianity.

The use of the “asperger” in neo-paganism & witchcraft was a continuation of the concept and is well documented in the remarkably prolific literature (some book shops have dedicated sections) devoted to modern witchcraft.  The construction of the objects (a bundle of fresh herbs or a perforated object for sprinkling water) is a lineal descendant of the aspergillum of the Medieval church and that makes sense, both institutions devoted to the process of cleansing although the targets may have differed.  According to Ancient Pathways Witchcraft (which sounds an authoritative source), although it’s the fluid which does the cleansing, the asperger is significant because it symbolizes “the transformative and cleansing properties of water…”, rinsing away “…spiritual debris that might interfere with the sanctity of rituals.”  In both neo-paganism and witchcraft, the herbs used may vary and while, pragmatically, sometimes this was dictated by seasonal or geographical availability, priests and witches would also choose the composition based on some “unique essences” being better suited to “enhance the sacred water's effectiveness”.  Nor were herbs always used because, as in the rituals of the church, “an asperger might be a metal or wooden rod designed with perforations or an attached mesh”, something like a “small brush or a dedicated holy water sprinkler akin to those seen in Christian liturgy”.  Again, it was the sprinkling of the water which was the critical element in the process, the devices really delivery systems which, regardless of form, existed to transform simple water into “a divine medium of purity and transformation”.  That said, their history of use did vest them with tradition, especially when certain herbs were central to a spell.

Dr Hans Asperger at work, Children's Clinic, University of Vienna, circa 1935.

The term “Asperger’s syndrome” first appeared in a 1981 paper by English psychiatrist Lorna Wing (1928–2014), from which point it entered the medical mainstream.  Dr Wing (who in 1962 was one of the founders of the charitable organization the National Autistic Society) named it after Austrian pediatrician Hans Asperger (1906–1980) who first described the condition in 1944, calling it autistischen Psychopathen (autistic psychopathy).  The German autistischen was an inflection of autistisch (autistic), the construct being Autist (autistic person) +‎ -isch (an adjectival suffix).

The English word autism was from the German Autismus, used in 1913 by Swiss psychiatrist and eugenicist Eugen Bleuler (1857-1939), the first known instance dating from 1907 as an alternative to the earlier “auto-erotism” of Swiss psychiatrist & psychotherapist Carl Jung (1875-1961), although in his book Dementia Praecox, oder Gruppe der Schizophrenien (Precocious Dementia, or Group of Schizophrenias, 1911) Bleuler differentiated the terms.  The construct of the word was the Ancient Greek αὐτός (autos) (self) + -ισμός (-ismós) (a suffix used to form abstract nouns of action, state or condition, equivalent to “-ism”).  Being a time of rapid advances in the relatively new discipline of psychiatry, it was a time also of linguistic innovation, Dr Bleuler in a Berlin lecture in 1908 using the term “schizophrenia”, something he’d been using in Switzerland for a year to replace “dementia praecox”, coined by German psychiatrist Emil Kraepelin (1856-1926).  What Dr Bleuler in 1913 meant by “autistic” was very different from the modern understanding in that to him it was a symptom of schizophrenia, not an identifiably separate condition.  In the UK, the profession picked this up and it was used to describe “a tendency to turn inward and become absorbed in one's own mental and emotional life, often at the expense of connection to the external world” while “autistic thinking” referred to those who were “self-absorbed, fantasy-driven, and detached from reality”, thinking patterns commonly seen in those suffering schizophrenia.

Looking Up was the monthly newsletter of the International Autism Association and in Volume 4, Number 4 (2006), it was reported Lindsay Lohan’s car had blocked the drop-off point for Smashbox Cares, a charity devoted to teaching surfing to autistic youngsters.  Arriving at the designated spot at Malibu’s Carbon Beach, the volunteers were delayed in their attempt to disembark their charges, something of significance because routine and predictability are important to autistic people.  To make up for it, Ms Lohan staged an impromptu three-hour beach party for the children, appearing as a bikini-clad DJ.  Apparently, it was enjoyed by all.

The modern sense of “autistic” began to emerge in the 1940s, among the first to contribute being the Austrian-American psychiatrist Leo Kanner (1894–1981) who in 1943 published a paper using the phrase “early infantile autism” to describe a distinct syndrome (which now would be understood as autism spectrum disorder).  The following year, in Vienna, Dr Asperger wrote (seemingly influenced by earlier work in Russia) of his observational studies of children, listing the behaviors he associated with the disorder and unlike some working in the field during the 1940s, Dr Asperger wasn’t wholly pessimistic about his young patients, writing in Autistic Psychopathy in Childhood (1944): “The example of autism shows particularly well how even abnormal personalities can be capable of development and adjustment. Possibilities of social integration which one would never have dreamt of may arise in the course of development.”  Many of the documents associated with Dr Asperger’s work were lost (or possibly taken to the Soviet Union) in the chaotic last weeks of World War II (1939-1945) and it wasn’t until Dr Wing in the 1970s reviewed some material from the archives that his contributions began to be appreciated although not until 1992 did “Asperger’s Syndrome” become a standard diagnosis.

DSM-IV (1994).  Not all in the profession approved of the reclassification of Asperger’s syndrome under the broader Autism Spectrum Disorder, believing it reduced the depth of diagnostic evaluation, flattened complexity and was disconnected from clinical reality.  There was also regret about structural changes, the DSM-5 eliminating the multiaxial system (Axes I–V) which some clinicians found useful for organizing information about the patient, especially Axis II (personality disorders) and Axis V (Global Assessment of Functioning).

Asperger’s Syndrome first appeared in the American Psychiatric Association's (APA) classification system when it was added to the fourth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV, 1994) and the utility for clinicians was that it created a sub-group of patients with autism but without a learning disability (ie characterized by deficits in social interaction and restricted interests, in the absence of significant language delay or cognitive impairment), something with obvious implications for treatment.  In the DSM-5 (2013), Autism Spectrum Disorder (ASD) was re-defined as a broader category which combined Asperger syndrome, Autistic Disorder & PDD-NOS (Pervasive Developmental Disorder Not Otherwise Specified) into a single ASD diagnosis, the editors explaining the change as a reflection of an enhanced understanding of the condition, the emphasis now on it being something with varying degrees of severity and presentation rather than distinct types.

However, although after 2013 the term no longer appeared in the DSM, it has remained in popular use, the British military historian Sir Antony Beevor (b 1946) in Ardennes 1944 (2015, an account of the so-called "Battle of the Bulge") speculating of Field Marshal Bernard Montgomery (First Viscount Montgomery of Alamein, 1887–1976) that "one might almost wonder whether [he] suffered from what today would be called high-functioning Asperger syndrome."  The eleventh release of the World Health Organization’s (WHO) International Classification of Diseases (ICD-11) aligned with the DSM-5 and regards what once would have been diagnosed as Asperger’s Syndrome as a relatively mild manifestation of ASD.  The diagnostic criteria for ASD focus on deficits in social communication and interaction, as well as repetitive behaviors and interests.  Although no longer current, the DSM-IV’s criteria for Asperger's Disorder remain of interest because while the label is no longer used, clinicians still need to distinguish those on the spectrum suffering some degree of learning disability from those not so affected (the thresholds in the criteria are rendered schematically in the sketch following the list):

DSM-IV diagnostic criteria for Asperger’s Disorder (299.80).

A. Qualitative impairment in social interaction, as manifested by at least two of the following:

(1) marked impairments in the use of multiple nonverbal behaviors such as eye-to-eye gaze, facial expression, body postures, and gestures to regulate social interaction.

(2) failure to develop peer relationships appropriate to developmental level.

(3) a lack of spontaneous seeking to share enjoyment, interests, or achievements with other people (eg by a lack of showing, bringing, or pointing out objects of interest to other people).

(4) lack of social or emotional reciprocity.

B. Restricted repetitive and stereotyped patterns of behavior, interests, and activities, as manifested by at least one of the following:

(1) encompassing preoccupation with one or more stereotyped and restricted patterns of interest that is abnormal either in intensity or focus.

(2) apparently inflexible adherence to specific, non-functional routines or rituals.

(3) stereotyped and repetitive motor mannerisms (eg hand or finger flapping or twisting, or complex whole-body movements).

(4) persistent preoccupation with parts of objects.

C. The disturbance causes clinically significant impairment in social, occupational, or other important areas of functioning.

D. There is no clinically significant general delay in language (eg single words used by age 2 years, communicative phrases used by age 3 years).

E. There is no clinically significant delay in cognitive development or in the development of age-appropriate self-help skills, adaptive behavior (other than social interaction), and curiosity about the environment in childhood.

F. Criteria are not met for another specific Pervasive Developmental Disorder or Schizophrenia.
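
For those who find the thresholds easier to follow when stated schematically, below is a minimal sketch in Python of the criteria's decision logic (at least two of the four A items, at least one of the four B items, and Criteria C–F as simple yes/no gates).  It is purely illustrative, not a diagnostic instrument, and all the names are invented:

# A minimal sketch (not a clinical tool) of the DSM-IV 299.80 thresholds:
# Criterion A needs at least two of its four items, Criterion B at least
# one of its four, and Criteria C-F are single yes/no gates.

from dataclasses import dataclass, field

@dataclass
class CriteriaChecklist:
    a_items: list[bool] = field(default_factory=lambda: [False] * 4)  # A(1)-A(4)
    b_items: list[bool] = field(default_factory=lambda: [False] * 4)  # B(1)-B(4)
    c_significant_impairment: bool = False         # Criterion C
    d_no_general_language_delay: bool = False      # Criterion D
    e_no_cognitive_delay: bool = False             # Criterion E
    f_no_other_pdd_or_schizophrenia: bool = False  # Criterion F

def meets_asperger_criteria(c: CriteriaChecklist) -> bool:
    return (sum(c.a_items) >= 2                    # A: at least two of four
            and sum(c.b_items) >= 1                # B: at least one of four
            and c.c_significant_impairment
            and c.d_no_general_language_delay
            and c.e_no_cognitive_delay
            and c.f_no_other_pdd_or_schizophrenia)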

The term in the twenty-first century became controversial after revelations of some of Dr Asperger's activities during the Third Reich (Austria annexed by Germany in 1938) which included his clinic in Vienna sending selected children to be victims of Aktion T4 (a mass-murder programme of involuntary euthanasia targeting those with disabilities), an operation which ran at times in parallel with the programmes designed to exterminate the Jews, Gypsies, homosexuals and others.  While there is no surviving documentary evidence directly linking Dr Asperger to the selection process which decided which children were to be killed, researchers have concluded the records suggest his construction of what came later to be called “Asperger’s syndrome” was actually that very process with an academic gloss.  Because those Dr Asperger so categorized were the autistic children without learning difficulties, they were deemed capable of being “cured” and thus spared from the T4 lists, unlike the “uneducable” who would never be able to be made into useful German citizens.  While the surviving material makes clear Dr Asperger was at least a “fellow traveller” with the Nazi regime, in professional, artistic and academic circles there was nothing unusual or even necessarily sinister about that because in a totalitarian state, people have few other choices if they wish to avoid unpleasantness.  However, it does appear Dr Asperger may have been unusually co-operative with the regime and his pre-1945 publication record suggests sympathy with at least some aspects of the Nazis’ racial theories and eugenics.

Friday, May 30, 2025

Tatterdemalion

Tatterdemalion (pronounced tat-er-di-meyl-yuhn or tat-er-di-mal-yuhn)

(1) A person in tattered clothing; a shabby person.

(2) Ragged; unkempt or dilapidated.

(3) In fashion, (typically as “a tatterdemalion dress” etc), garments styled deliberately frayed or with constructed tears etc (also described as “distressed” or “destroyed”).

(4) A beggar (archaic).

1600–1610: The original spelling was tatter-de-mallian (the “demalion” rhymed with “Italian” in English pronunciation), the construct thus tatter + -demalion, of uncertain origin although the nineteenth century English lexicographer Ebenezer Cobham Brewer (1810-1897) (remembered still for his marvelous Dictionary of Phrase and Fable (1894)) suggested it might be from de maillot (shirt) which does seem compelling.  Rather than the source, tatter is thought to have been a back-formation from tattered, from the Middle English tatered & tatird, from the Old Norse tǫturr.  Originally, tattered was derived from the noun, but it was later re-analysed as a past participle (the construct being tatter + -ed) and from this came the verb.  As a noun a tatter was "a shred of torn cloth or an individual item of torn and ragged clothing" while the verb implied both (as a transitive) "to destroy an article of clothing by shredding" & (as an intransitive) "to fall into tatters".  Tatterdemalion is a noun & adjective and tatterdemalionism is a noun; the noun plural is tatterdemalions.

In parallel, there was also "tat", borrowed under the Raj from the Hindi टाट (ṭāṭ) (thick canvas) and in English it assumed a variety of meanings including as a clipping of tattoo, as an onomatopoeia referencing the sound made by dice when rolled on a table (and came to be used especially of a loaded die) and as an expression of disapprobation meaning “cheap and vulgar”, either in the context of low-quality goods or sleazy conduct.  The link with "tatty" in the sense of “shabby or ragged clothing” however apparently comes from tat as a clipping of the tatty, a woven mat or screen of gunny cloth made from the fibre of the Corchorus olitorius (jute plant) and noted for its loose, scruffy-looking weave.  The historic synonyms were shoddy, battered, broken, dilapidated, frayed, frazzled, moth-eaten, ragged, raggedy, ripped, ramshackle, rugged, scraggy, seedy, shabby, shaggy, threadbare, torn & unkempt and in the context of the modern fashion industry, distressed & destroyed.  An individual could also be described as a tramp, a ragamuffin, a vagabond, a vagrant, a gypsy or even a slum, some of those terms reflecting class and ethnic prejudice or stereotypes.  Historically, tatterdemalion was also a name for a beggar.

A similar word in Yiddish was שמאַטע‎ (shmate or shmatte and spelled variously as schmatte, schmata, schmatta, schmate, schmutter & shmatta), from the Polish szmata, of uncertain origin but possibly from szmat (a fair amount).  In the Yiddish (and as adopted in Yinglish) it meant (1) a rag, (2) a piece of old clothing & (3) in the slang of the clothing trade, any item of clothing.  That was much more specific than the Polish szmata which meant literally "rag or old, ripped piece of cloth" but was used also figuratively to mean "publication of low journalistic standard" (ie analogous the English slang use of "rag") and in slang to refer to a woman of loose virtue (used as skank, slut etc might be used in English), a sense which transferred to colloquial use in sport to mean "simple shot", "easy goal" etc.

Designer distress: Lindsay Lohan illustrates the look.

Tatterdemalion is certainly a spectrum condition (the comparative “more tatterdemalion”; the superlative “most tatterdemalion”) and this is well illustrated by the adoption of the concept by fashionistas, modern capitalism soon there to supply the demand.  In the fashion business, tatterdemalion needs to walk a fine line because tattiness was historically associated with poverty while designers need to provide garments which convey a message of wealth.  The general term for such garments is “distressed” although “destroyed” is (rather misleadingly) also used.

Highly qualified porn star Busty Buffy (b 1996) in “cut-off” denim shorts with leather braces while beltless.

The ancestor of designer tatterdemalion was a pair of “cut-off” denim shorts, improvised not as a fashion statement but as a form of economy, gaining a little more life from a pair of jeans which had deteriorated beyond the point where mending was viable.  Until the counter-culture movements of the 1960s (which really began the previous decade but didn’t until the 1960s assume an expression in mass-market fashion trends), wearing cut-off jeans or clothing obviously patched and repaired generally was a marker of poverty although common in rural areas and among the industrial working class where it was just part of life.  It was only in the 1960s, when an anti-consumerist, anti-materialist vibe attracted the large cohort of youth created by the post-war “baby boom”, that obviously frayed or torn clothing came to be an expression of disregard or even disdain for the prevailing standards of neatness (although paradoxically they were the richest “young generation” ever).  It was the punk movement in the 1970s which took this to whatever extremes seemed possible, the distinctive look of garments with rips and tears secured with safety pins so emblematic of (often confected) rebellion that in certain circles it remains to this day part of the “uniform”.  The fashion industry of course noted the trend and what would later be called “distressed” denim appeared in the lines of many mainstream manufacturers as early as the 1980s, often paired with the acid-washing and stone-washing which previously had been used to make a pair of jeans appear “older”, sometimes a desired look.

Dolce & Gabbana Distressed Jeans (part number FTCGGDG8ET8S9001), US$1150.

That it started with denim makes sense because it's the ultimate "classless" fabric in that it's worn by both rich and poor and while that has advantages for manufacturers, it does mean some are compelled to find ways to ensure buyers are able (blatantly or with some subtlety) to advertise what they are wearing is expensive; while no fashion house seems yet to have put the RRP (recommended retail price) on a leather patch, it may be only a matter of time.  The marketing of jeans which even when new gave the appearance of having been “broken in” by the wearer was by the 1970s a defined niche, the quasi-vintage look of “fade & age” achieved with processes such as stone washing, enzyme washing, acid washing, sandblasting, emerizing and micro-sanding but this was just to create an effect, the fabrics not ripped or torn.  Distressed jeans represented the next step in the normal process of wear, fraying hems and seams, irregular fading and rips & tears now part of the aesthetic.  As an industrial process that’s not difficult to do but if done in the wrong way it won’t resemble exactly a pair of jeans subject to gradual degradation because different legs would have worn the denim at different places.  In the 2010s, the look spread to T-shirts and (predictably) hoodies, some manufacturers going beyond mere verisimilitude to (sort of) genuine authenticity, achieving the desired decorative effect by shooting shirts with bullets, managing a look which presumably the usual tricks of “nibbling & slashing” couldn’t quite emulate.  Warming to the idea, the Japanese label Zoo released jeans made from material torn by lions and tigers, the company anxious to mention the big cats in Tokyo Zoo seemed to "enjoy the fun" and to anyone who has seen a kitten with a skein of wool, that will sound plausible.  Others emulated the working-class look, the “caked-on muddy coating” and “oil and grease smears” another variant although one apparently short-lived; appearing dirty apparently never a fashionable choice.  All these looks had of course been seen for centuries, worn mostly by the poor with little choice but to eke a little more wear from their shabby clothes but in the late twentieth century, as wealth overtook Western society, the look was adopted by many with disposable income: firstly the bohemians, hippies and other anti-materialists, then the punk movement which needed motifs with some capacity to shock, something harder to achieve than had once been the case.

Distressed top and bottom.  Gigi Hadid (b 1995) in distressed T-shirt and "boyfriend" jeans.

For poets and punks, improvising the look from the stocks of thrift shops, that was fine but for designer labels selling scruffy-looking jeans for four-figure sums, it was more of a challenge, especially as the social media generation had discovered that above all they liked authenticity and faux authenticity would not do, nobody wanting to look as if they were trying too hard.  That might have seemed a problem, given the look was inherently fake, but the aesthetic didn’t matter for its own sake; all that had to be denoted was “conspicuous consumption” (excessive spending on wasteful goods as proof of wealth) and the juxtaposition of thousand-dollar distressed jeans with the odd expensive accessory achieved that and more, the discontinuities offering irony as a look.  The labels, the prominence of which remained a focus, were enough for the message to work although one does wonder if any of the majors have been tempted to print a QR code on the back pocket, linked to the RRP because what people are really trying to say is “My jeans cost US$1200”.

1962 AC Shelby American Cobra (CSX 2000), interior detail, 2016.

The value of selective scruffiness is well known in other fields.  When selling a car, usually a tatty interior greatly will depress the price (sometimes by even more than the cost of rectification).  However, if the tattiness is of some historic significance, it can add to a car’s value, the best example being if the deterioration is part of a vehicle's provenance and proof of originality, a prized attribute to the segment of the collector market known as the “originality police”.  In 2016, what is recognized as the very first Shelby American AC Cobra (CSX 2000) sold for US$13.75 million, becoming the highest price realized at auction for what is classified as an "American car".  Built in 1962, it was an AC Ace shipped to California without an engine (and apparently not AC's original "proof-of-concept" test bed which was fitted with one of the short-lived 221 cubic inch (3.6 litre) versions of Ford's new "thin-wall" Windsor V8) where the Shelby operation installed a 260 cubic inch (4.2 litre) Windsor and the rest is history.  The tatterdemalion state of the interior was advertised as one of the features of the car, confirming its status as “an untouched survivor”.  Among Cobra collectors, patina caused by Carroll Shelby's (1923–2012) butt is a most valuable tatterdemalion.

Patina plus and beyond buffing out: Juan Manuel Fangio, Mercedes-Benz W196R Stromlinienwagen (Streamliner), British Grand Prix, Silverstone, 17 July 1954.

Also recommended to be repaired before sale are dents, anything battered unlikely to attract a premium.  However, if a dent was put there by a Formula One (F1) world champion, it becomes a historic artefact.  In 1954, Mercedes-Benz astounded all when their new grand prix car (the W196R) appeared with all-enveloping bodywork, allowed because of a since closed loophole in the rule-book.  The sensuous shape made the rest of the field look antiquated although underneath it was a curious mix of old and new, the fuel-injection and desmodromic valve train representing cutting edge technology while the swing axles and drum brakes spoke to the past and present, the engineers’ beloved straight-eight configuration (its last appearance in F1) definitely the end of an era.  On fast tracks like Monza, the aerodynamic bodywork delivered great speed and stability but the limitations were exposed when the team ran the Stromlinienwagen at tighter circuits and in the 1954 British Grand Prix at Silverstone, Juan Manuel Fangio (1911–1995; winner of five F1 world-championship drivers' titles) managed to clout a couple of oil-drums (those and bales of hay being how track safety was then done) because it was so much harder to determine the extremities without being able to see the front wheels.  Quickly, the factory concocted a functional (though visually unremarkable) open-wheel version and the sleek original was thereafter used only on the circuits where the highest speeds were achieved.  In 1954, the factory was unconcerned with the historic potential of the dents and repaired the tatterdemalion W196R, so an artefact of the era was lost.  That apart, as used cars the W196s have held their value well, an open-wheel version selling at auction in 2013 for US$29.7 million while in 2025 a Stromlinienwagen realized US$53.9 million.

1966 Ferrari 330 GTC (1966-1968) restored by Bell Sport & Classic.  Many restored Ferraris of the pre-1973 era are finished to a much higher standard than when they left the showroom.  Despite this, genuine, original "survivors" (warts and all) are much-sought in some circles.

In the collector car industry, tatterdemalion definitely is a spectrum condition and for decades the matter of patina versus perfection has been debated.  There was once the idea that in Europe the preference was for a vehicle to appear naturally aged (well-maintained but showing the wear of decades of use) while the US market leaned towards cars restored to the point of being as good as (or better than) they were on the showroom floor.  Social anthropologists might have some fun exploring that perception of difference and it was certainly never a universal rule but the debate continues, as does the argument about “improving” on the original.  Some of the most fancied machinery of the 1950s and 1960s (notably Jaguars, Ferraris and Maseratis) is now a staple of the restoration business but, although when new the machines looked gorgeous, it wasn’t necessary to dig too deep to find often shoddy standards of finish, the practice at the time something like sweeping the dirt “under the rug”.  When "restored", many of these cars are re-built to a higher standard, what was often left rough because it sat unseen somewhere now smoothed to perfection.  That’s what some customers want and the best restoration shops can do either, though there are questions about whether what might be described as “fake patina” is quite the done thing.  Mechanics and engineers who were part of building Ferraris in the 1960s, upon looking at some immaculately “restored” cars, have been known wryly to remark: “that wasn't how we built them then.”

Gucci offered Distressed Tights at US$190 (for a pair, so quite good value).  Rapidly, they sold out.

The fake patina business however goes back quite a way.  Among antique dealers, it’s now a definite niche but from the point at which the industrial revolution began to create a new moneyed class of mine and factory owners, there was a subset of the new money (and there are cynics who suggest it was mostly at the prodding of their wives) who wished to seem more like old money and a trend began to seek out “aged” furniture with which a man might deck out his (newly acquired) house to look as if things had been in the family for generations.  The notoriously snobbish (and amusing) diarist Alan Clark (1928–1999) once referred to someone as looking like “they had to buy their own chairs”, prompting one aristocrat to respond: “That’s a bit much from someone whose father (the art historian and life peer Kenneth Clark (1903–1983)) had to buy his own castle.”  The old money were of course snooty about such folk and David Lloyd George (1863–1945; UK prime-minister 1916-1922) would lament many of the “jumped-up grocers” in his Liberal Party were more troublesome and less sympathetic to the troubles of the downtrodden than the "backwoodsmen" gentry in their inherited country houses.

Wednesday, May 14, 2025

Psychache

Psychache (pronounced sahyk-eyk)

Psychological pain, especially when it becomes unbearable, producing suicidal thoughts.

1993: The construct was psyche- + ache.  Psychache was coined by US clinical psychologist Dr Edwin Shneidman (1918-2009) and first appeared in his book Suicide as Psychache: A Clinical Approach to Self-Destructive Behavior (1993).  The prefix psych- was an alternative form of psycho-.  Psycho was from the Ancient Greek ψυχο- (psūkho-), a combining form of ψυχή (psukhḗ) (soul).  It was used with words relating to the soul, the mind, or to psychology.  Ache was from the Middle English verb aken & the noun ache, from the Old English verb acan (from the Proto-West Germanic akan, from the Proto-Germanic akaną (to ache)) and the noun æċe (from the Proto-West Germanic aki, from the Proto-Germanic akiz), both from the primitive Indo-European *h₂eg- (sin, crime).  It was cognate with the Saterland Frisian eeke & ääke (to ache, fester), the Low German aken, achen & äken (to hurt, ache), the German Low German Eek (inflammation), the North Frisian akelig & æklig (terrible, miserable, sharp, intense), the West Frisian aaklik (nasty, horrible, dismal, dreary) and the Dutch akelig (nasty, horrible).  Historically the verb was spelled ake, and the noun ache but the spellings became aligned after Dr Johnson (Samuel Johnson (1709-1784)) published A Dictionary of the English Language (1755), the lexicographer mistakenly assuming it was from the Ancient Greek ἄχος (ákhos) (pain) due to the similarity in form and meaning of the two words.  As a noun, ache meant “a continuous, dull pain” (as opposed to a sharp, sudden, or episodic pain) while the verb was used to mean (1) to have or suffer a continuous, dull pain, (2) to feel great sympathy or pity and (3) to yearn or long for someone or something.  Psychache is a noun.

Psychache is a theoretical construct used by clinical suicidologists and differs from psychomachia (conflict of the soul).  Psychomachia was from the Late Latin psӯchomachia, the title of a poem of a thousand-odd lines (circa 400) by the Roman Christian poet Prudentius (Aurelius Prudentius Clemens; 348-circa 412), the construct being the Ancient Greek psukhē (spirit) + makhē (battle).  The poem Psychomachia (translated usually as “Battle of Spirits” or “Soul War”) explored a theme familiar in Christianity: the eternal battle between virtue & vice (onto which can be mapped “right & wrong”, “good & evil” etc) and culminated in the forces of Christendom vanquishing pagan idolatry to the cheers of a thousand Christian martyrs.  An elegant telling of an allegory familiar in early Christian literature and art, Prudentius made clear the battle was one which happened in the soul of all people and thus one which all needed to wage, the outcome determined by whether the good or evil in them proved stronger.  The poem’s characters include Faith, Hope, Industry, Sobriety, Chastity, Humility & Patience among the good and Pride, Wrath, Paganism, Avarice, Discord, Lust & Indulgence in the ranks of the evil but scholars of literature caution that although the personifications all are women, in Latin, words for abstract concepts use the feminine grammatical gender and there’s nothing to suggest the poet intended us to read this as a tale of bolshie women slugging it out.  Of interest too is the appearance of the number seven, so familiar in the literature and art of Antiquity and the Medieval period as well as the Biblical texts but although Prudentius has seven virtues defeat seven vices, the characters don’t exactly align with either the canonical seven deadly sins or the three theological and four cardinal virtues.  In modern use, the linguistic similarity between psychache and psychomachia has made the latter attractive to those seduced by the (not always Germanic) tradition of the “romance of suicide”.

A pioneer in the field of suicidology, Dr Shneidman’s publication record was indicative of his specialization.

Dr Edwin Shneidman (1918-2009) was a clinical psychologist who practiced as a thanatologist (a practitioner in the field of thanatology: the scientific study of death and the practices associated with it, including the study of the needs of the terminally ill and their families), the construct of thanatology being thanato- (from the Ancient Greek θάνατος (thánatos) (death)) + -logy.  The suffix -ology was formed from -o- (as an interconsonantal vowel) + -logy.  The origin in English of the -logy suffix lies with loanwords from the Ancient Greek, usually via Latin and French, where the suffix (-λογία) is an integral part of the word loaned (eg astrology from astrologia) since the sixteenth century.  French picked up -logie from the Latin -logia, from the Ancient Greek -λογία (-logía).  Within Greek, the suffix is an -ία (-ía) abstract from λόγος (lógos) (account, explanation, narrative), and that a verbal noun from λέγω (légō) (I say, speak, converse, tell a story).  In English the suffix became extraordinarily productive, used notably to form names of sciences or disciplines of study, analogous to the names traditionally borrowed from the Latin (eg astrology from astrologia; geology from geologia) and by the late eighteenth century, the practice (despite the disapproval of the pedants) extended to terms with no connection to Greek or Latin such as those building on French or German bases (eg insectology (1766) after the French insectologie; terminology (1801) after the German Terminologie).  Within a few decades of the intrusion of modern languages, combinations emerged using English terms (eg undergroundology (1820); hatology (1837)).  In this evolution, the development may be thought similar to the latter-day proliferation of “-isms” (fascism; feminism etc).

Death and the College Student: A Collection of Brief Essays on Death and Suicide by Harvard Youth (1973) by Dr Edwin Shneidman.  Dr Shneidman wrote many papers about the prevalence of suicide among college-age males, a cross-cultural phenomenon.

Dr Shneidman was one of the seminal figures in the discipline of suicidology, in 1968 founding the AAS (American Association of Suicidology) and the principal US journal for suicide studies: Suicide and Life-Threatening Behavior.  The abbreviation AAS is in this context used mostly within the discipline because (1) it is a specialized field and (2) there are literally dozens of uses of “AAS”.  In Suicide as Psychache: A Clinical Approach to Self-Destructive Behavior (1993) he defined psychache as “intense psychological pain—encompassing hurt, anguish, and mental torment”, identifying it as the primary motivation behind suicide, his theory being that when psychological pain becomes unbearable, individuals may perceive suicide as their only escape from torment.

Although since Suicide as Psychache: A Clinical Approach to Self-Destructive Behavior appeared in 1993 there have been four editions of American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM), “psychache” has never appeared in the DSM.  That may seem an anomaly given much in the DSM revolves around psychological disturbances but the reason is technical.  What the DSM does is list and codify diagnosable mental disorders (depression, schizophrenia, bipolar disorder etc), classifying symptoms and behaviors into standardized categories for diagnosis and treatment planning.  By contrast, psychache is not a clinical diagnosis; it is a theoretical construct in suicidology which is used to explain the subjective experience of psychological pain that can lead to patients taking their own lives.  It thus describes an emotional state rather than a psychiatric disorder.

Lindsay Lohan and her lawyer in court, Los Angeles, December, 2011.

Despite that, mental health clinicians do actively use the principles of psychache, notably in suicide risk assessment and prevention, and models have been developed including a number of “psychache scales”: self-reporting tools used to generate a metric measuring the intensity of psychological pain (categorized under headings such as shame, guilt, despair et al).  The approaches do in detail differ but most follow Dr Shneidman’s terminology in that the critical threshold is the point at which the patient’s pain becomes unbearable or inescapable, the objective being either to increase tolerance for distress or reframe troublesome thoughts.  Ultimately, the purpose of the tools is to improve suicide risk assessments and reduce suicide rates.  The arithmetic of such scales is simple, as the sketch below suggests.
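
Purely as an illustration of the mechanics (and in no sense a clinical instrument), the following Python sketch scores a hypothetical 13-item scale rated 1-5 against an invented cut-off; real instruments, item counts and thresholds differ:

# Illustrative only: scoring a hypothetical 13-item psychache self-report
# scale.  Items are rated 1 (strongly disagree) to 5 (strongly agree); the
# item count and cut-off are invented, not from any validated instrument.

def score_psychache(responses: list[int], items: int = 13, cutoff: int = 40) -> tuple[int, bool]:
    """Return (total score, whether the invented cut-off is reached)."""
    if len(responses) != items:
        raise ValueError(f"expected {items} responses, got {len(responses)}")
    if any(not 1 <= r <= 5 for r in responses):
        raise ValueError("each response must be an integer from 1 to 5")
    total = sum(responses)
    return total, total >= cutoff

total, flagged = score_psychache([3] * 13)  # a uniform mid-range pattern
print(total, flagged)                       # 39 False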

DSM-5 (2013).

Interestingly, Suicidal Behavior Disorder (SBD) was introduced in Section III of the DSM-5 (2013) under “Conditions for Further Study”.  Then, SBD chiefly was characterized by a self-initiated sequence of behaviors believed at the time of initiation to cause one’s own death and occurring in the last 24 months.  That of course sounds exact but the diagnostic criteria in the DSM are written like that and the purpose of inclusion in the fifth edition was to create a framework within which empirical studies related to SBD systematically could be reviewed, so primary research themes and promising directions for future research could be identified.  Duly, over the following decade that framework was explored but the conclusion was reached there seemed to be little clinical utility in SBD as a device for predicting future suicide and that more research was needed to understand measurement of the diagnosis and its distinctiveness from related disorders and other self-harming behaviors.  The phrase “more research is required” must be one of the most frequently heard among researchers.

In the usual manner in which the APA allowed the DSM to evolve, what the DSM-5’s tentative inclusion of SBD did was attempt to capture suicidality as a diagnosis rather than a clinical feature requiring attention.  SBD was characterized by a suicide attempt within the last 24 months (Criterion A) and that was defined as “a self-initiated sequence of behaviors by an individual who, at the time of initiation, expected that the set of actions would lead to his or her own death”.  That sounds uncontroversial but what was significant was the act could not meet the criteria for non-suicidal self-injury (ie self-injury with the intention to relieve negative feelings or a cognitive state in order to achieve a positive mood state) (Criterion B) and the diagnosis cannot be applied to suicidal ideation or preparatory acts (Criterion C).  Were the attempt to have occurred during a state of delirium or confusion or solely for political or religious objectives, then SBD is ruled out (Criteria D & E).  SBD (current) is given when the suicide attempt occurred within the last 12 months and SBD (in early remission) when it has been 12-24 months since the last attempt; the timing logic is sketched below.  It must be remembered that while a patient’s behavior(s) may overlap across a number of the DSM’s diagnoses, the APA’s committees have, for didactic purposes, always preferred to “silo” the categories.
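
A purely schematic rendering of those timing rules, with the function name and return strings invented for illustration:

# Schematic only: the Criterion A window and the current/early-remission
# specifiers, as described in the paragraph above.

def sbd_specifier(months_since_attempt: float) -> str:
    if months_since_attempt < 12:
        return "SBD (current)"
    if months_since_attempt <= 24:
        return "SBD (in early remission)"
    return "Criterion A not met (no attempt within the last 24 months)"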

DSM-5-TR (2022).

When in 2022 the “text revision” of the DSM-5 (DSM-5-TR) was released, SBD was removed as a condition for further study in Section III and moved to “Other Conditions That May Be a Focus of Clinical Attention” in Section II.  The conditions listed in this section are intended to draw the attention of clinicians to the presence and breadth of additional issues routinely encountered in clinical practice and to provide a procedure for their systematic documentation.  According to the APA’s editorial committee, the rationale for the exclusion of SBD from the DSM-5-TR was based on concerns the proposed disorder did not meet the criteria for a mental disorder but instead constituted a behavior with diverse causes and while that distinction may escape most of us, within the internal logic of the history of the DSM, that’s wholly consistent.  At this time, despite many lobbying for the adoption of a diagnostic entity for suicidal behavior, the APA’s committees seem still more inclined to conceptualize suicidality as a symptom rather than a disorder and despite discussion in the field of suicidology about whether suicide and related concepts like psychache should be treated as stand-alone mental health issues, that’s a leap which will have to wait, at least until a DSM-6 is published.

How to and how not to: Informatie over Zorgvuldige Levensbeëindiging (Information about the Careful Ending of Life, 2008) by Stichting Wetenschappelijk Onderzoek naar Zorgvuldige Zelfdoding (The Foundation for Scientific Research into Careful Suicide) (left) and How Not to Kill Yourself: A Phenomenology of Suicide (2023) by Clancy Martin (right).

Informatie over Zorgvuldige Levensbeëindiging (Information about the Careful Ending of Life, 2008) was published by a group of Dutch physicians & researchers; it contained detailed advice on methods of suicide available to the general public, the Foundation for Scientific Research into Careful Suicide arguing “a requirement exists within society for responsible information about an independent and dignified ending of life.”  It could be ordered only from the foundation’s website and had the advantage that, whatever might be one’s opinion on the matter, it was at least written by physicians and scientists and thus more reliable than some of the “suicide guides” which are sometimes found on-line.  At the time, research by the foundation had found that despite legislation in the Netherlands which permits doctors (acting within specific legal limits) to assist patients to commit suicide, there were apparently several thousand cases each year of what it termed “autoeuthanasia” in which no medical staff directly were involved.  Most of these cases involved elderly or chronically ill patients who refused food and fluids and it was estimated these deaths happened at about twice the rate of those carried out under the euthanasia laws.  Since then the Dutch laws have been extended to include those who have no serious physical disease or are suffering great pain; there are people who simply no longer wish to live, something like the tragic figure in Blue Öyster Cult’s (Don't Fear) The Reaper (1976) © Donald Roeser (b 1947):

Came the last night of sadness
And it was clear she couldn't go on
Then the door was open and the wind appeared
The candles blew then disappeared
The curtains flew then he appeared
Saying don't be afraid

There is a diverse literature on various aspects of suicide (tips and techniques, theological & philosophical interpretations, cross-cultural attitudes, history of its treatment in church & secular law etc) and some are quite personal, written variously by those who later would kill themselves or those who contemplated or attempted to take their own lives.  In How Not to Kill Yourself: A Phenomenology of Suicide (2023) by Canadian philosopher Clancy Martin (b 1967), it was revealed the most recent of his ten suicide attempts was “…in his basement with a dog leash, the consequences of which he concealed from his wife, family, co-workers, and students, slipping back into his daily life with a hoarse voice, a raw neck and a series of vague explanations.”

BKA (the Bundeskriminalamt, the Federal Criminal Police Office of the FRG (Federal Republic of Germany (the old West Germany)) mug shots of the Red Army Faction's Ulrike Meinhof (left) and Gudrun Ensslin (right).

The song (Don't Fear) The Reaper also made mention of William Shakespeare's (1564–1616) Romeo and Juliet (1597) and in taking her own life (using her dead lover’s dagger) because she doesn’t want to go on living without him, Juliet joined the pantheon of figures who have made the tragedy of suicide seem, to some, romantic.  Politically too, suicide can grant the sort of status dying of old age doesn’t confer, the deaths of left-wing terrorists Ulrike Meinhof (1934–1976) and Gudrun Ensslin (1940–1977) of the West German Red Army Faction (the RAF and better known as the “Baader-Meinhof gang”) both recorded as “suicide in custody” although the circumstances were murky.  In an indication of the way moral relativities aligned during the high Cold War, the French intellectuals Jean-Paul Sartre (1905–1980) and Simone de Beauvoir (1908–1986) compared their deaths to the worst crimes of the Nazis but sympathy for violence committed for an “approved” cause was not the exclusive preserve of the left.  In July, 1964, in his speech accepting the Republican nomination for that year’s US presidential election, proto-MAGA Barry Goldwater (1909–1998) concluded by saying: “I would remind you that extremism in the defense of liberty is no vice!  And let me remind you also that moderation in the pursuit of justice is no virtue!”  The audience response to that was rapturous although a few months later the country mostly didn’t share the enthusiasm, Lyndon Johnson (LBJ, 1908–1973; US president 1963-1969) winning the presidency in one of the greatest landslides in US electoral history.  Given the choice between crooked old Lyndon and crazy old Barry, Americans preferred the crook.

Nor was it just politicians and intellectuals who couldn't resist the appeal of politics being taken to its logical “other means” conclusion, the Canadian singer-songwriter Leonard Cohen (1934-2016) during the last years of the Cold War writing First We Take Manhattan (1986), the lyrics of which were open to interpretation but clarified in 1988 by the author who explained: “I think it means exactly what it says.  It is a terrorist song.  I think it's a response to terrorism.  There's something about terrorism that I've always admired.  The fact that there are no alibis or no compromises.  That position is always very attractive.”  Even in 1988 it was a controversial comment because by then not many outside of undergraduate anarchist societies were still romanticizing terrorists but in fairness to the singer the coda isn’t as often published: “I don't like it when it's manifested on the physical plane – I don't really enjoy the terrorist activities – but Psychic Terrorism.”

First We Take Manhattan (1986) by Leonard Cohen

They sentenced me to twenty years of boredom
For tryin' to change the system from within
I'm coming now, I'm coming to reward them
First we take Manhattan, then we take Berlin
 
I'm guided by a signal in the heavens
I'm guided by this birthmark on my skin
I'm guided by the beauty of our weapons
First we take Manhattan, then we take Berlin
 
I'd really like to live beside you, baby
I love your body and your spirit and your clothes
But you see that line there moving through the station?
I told you, I told you, told you, I was one of those
 
Ah you loved me as a loser, but now you're worried that I just might win
You know the way to stop me, but you don't have the discipline
How many nights I prayed for this, to let my work begin
First we take Manhattan, then we take Berlin
 
I don't like your fashion business, mister
And I don't like these drugs that keep you thin
I don't like what happened to my sister
First we take Manhattan, then we take Berlin
 
I'd really like to live beside you, baby
I love your body and your spirit and your clothes
But you see that line there moving through the station?
I told you, I told you, told you, I was one of those



First We Take Manhattan performed by Jennifer Warnes (b 1947), from the album Famous Blue Raincoat (1986). 

Whatever they achieved in life, it was their suicides which lent a lingering allure to German-American ecofeminist activist Petra Kelly (1947–1992) & the doomed American poet Sylvia Plath (1932-1963) and the lure goes back millennia, the Roman poet Ovid (Publius Ovidius Naso; 43 BC–17 AD) in his Metamorphoses telling an ancient Babylonian tale in which Thisbe, in dark despair, killed herself after finding her young love Pyramus lifeless.  Over the centuries it’s been a recurrent trope but the most novel take was the symbolic, mystical death in Richard Wagner's (1813–1883) Tristan und Isolde (1865).  Mortally wounded in a duel before the final act, Tristan longs to see Isolde one last time but just as she arrives at his side, he dies in her arms.  Overwhelmed by love and grief, Isolde sings the famous Liebestod (Love-Death) and dies, the transcendent aria interpreted as the swansong which carries her to join Tristan in mystical union in the afterlife.  This, lawyers would call a “constructive suicide”.

Austrian soprano Helga Dernesch (b 1939) in 1972 performing the Liebestod aria from Wagner’s Tristan und Isolde with the Berlin Philharmonic under Herbert von Karajan (1908–1989).

While she didn’t possess the sheer power of the greatest of the Scandinavian sopranos who in the mid-twentieth century defined the role, Dernesch brought passion and intensity to her roles and while, on that night in 1972, the lushness of what Karajan summoned from the strings was perhaps a little much, her Liebestod was spine-tingling and by then, Karajan had been forgiven for everything.  Intriguingly, although Tristan und Isolde is regarded as one of the great monuments to love, in 1854 Wagner had written to the Hungarian composer Franz Liszt (1811–1886) telling him:

As I have never in life felt the real bliss of love, I must erect a monument to the most beautiful of all my dreams, in which, from beginning to end, that love shall be thoroughly satiated.  I have in my head ‘Tristan and Isolde’, the simplest but most full-blooded musical conception; with the ‘black flag’ which floats at the end of it I shall cover myself to die.

It’s not known whether Liszt reflected on this apparent compositional self-medication for psychache after learning in 1870 from his morning newspaper that his daughter Cosima (1837-1930) was to be married to Wagner (then 24 years her senior) but because she’d been for some seven years conducting an adulterous affair with the German, the news may not have been unexpected.  He was aware Cosima’s daughter (Isolde Beidler (1865–1919)) had been fathered not by her then husband (the German conductor Hans von Bülow (1830–1894)) but by Wagner and her second marriage proved happier than the first, so there was that.