
Tuesday, December 9, 2025

Customer

Customer (pronounced kuhs-tuh-mer)

(1) A habitual patron, regular purchaser, returning client; one who has a custom of buying from a particular business (obsolete in its technical sense).

(2) A patron, a client; one who purchases or receives a product or service from a business or merchant, or intends to do so.

(3) In various slang forms (cool customer, tough customer, ugly customer, customer from hell, dream customer etc), a person, especially one engaging in some sort of interaction with others.

(4) Under the Raj, a native official who exacted customs duties (historic use from British colonial India).

Late 1300s: From the Middle English customere & custommere (one who purchases goods or supplies, one who customarily buys from the same tradesman or guild), from custumer (customs official, toll-gatherer), from the Anglo-French custumer, from the Old French coustumier & costumier (from which modern French gained coutumier (customary, custumal)), from the Medieval Latin noun custumarius (a toll-gatherer, tax-collector), a back-formation from the adjective custumarius (pertaining to custom or customs) from custuma (custom, tax).  The literal translation of the Medieval Latin custumarius was “pertaining to a custom or customs”, a contraction of the Latin consuetudinarius, from consuetudo (habit, usage, practice, tradition).  The generalized sense of “a person with whom one has dealings” emerged in the 1540s while that of “a person to deal with” (then as now usually with some defining adjective: “tough customer”, “difficult customer” etc) was in use by the 1580s.  Derived terms are common including customer account, customer base, customer care, customer experience, customer-oriented, customer research, customer resistance, customer service, customer success, customer support, direct-to-customer, customer layer, customer-to-customer, ugly customer, tough customer, difficult customer etc.  Customer is a noun; the noun plural is customers.

William Shakespeare (1564–1616) used the word sometimes to mean “prostitute” and in his work was the clear implication that a buyer was as guilty as the seller, the law both unjust and hypocritical, something which in the twentieth century would be rectified in Swedish legislation.

Shakespeare: All's Well That Ends Well (circa 1602), Act 5, scene 3

LAFEW:  This woman’s an easy glove, my lord; she goes off and on at pleasure.

KING: This ring was mine. I gave it his first wife.

DIANA: It might be yours or hers for aught I know.

KING (to attendants): Take her away. I do not like her now.  To prison with her, and away with him. Unless thou tell’st me where thou hadst this ring, Thou diest within this hour.

DIANA: I’ll never tell you.

KING: Take her away.

DIANA: I’ll put in bail, my liege.

KING: I think thee now some common customer.

DIANA (to Bertram): By Jove, if ever I knew man, ’twas you.

In Sweden, the law was amended in a way of which Shakespeare might have approved, Chapter 6, Section 11 of the Swedish Penal Code making it an offence to pay for sex, the act of “purchasing sexual services” criminalized, the aim being to reduce the demand for prostitution.  The law provides for fines or a maximum term of imprisonment for one year, depending on the circumstances of the case.  So selling sexual services is not unlawful in Sweden but being a customer is, an inversion of the model for centuries applied in the West.  Individuals who engage in prostitution are not criminalized under Swedish law, which is intended to protect sex workers from legal penalties while targeting the customers, now defined as those who “exploit them”.  The Swedish model aims to reduce prostitution by focusing on the demand side and providing support for those who wish to exit prostitution and as a statement of public policy, the law reform reflected the government’s view prostitution was a form of gender inequality and exploitation.  The effectiveness of the measure has over the years been debated and the customer-focused model of enforcement has not widely been emulated.

The customer is always right

Reliable return customer: Lindsay Lohan in the Chanel Shop, New York City, May 2013.

The much quoted phrase (which in some areas of commerce is treated as a proverb): “the customer is always right” has its origins in retail commerce and is used to encapsulate the value: “service staff should give high priority to customer satisfaction”.  It is of course not always literally true, the point being that even when patently wrong about something, it is the customer who is paying for stuff so they should always be treated as if they are right.  Money being the planet’s true lingua franca, variations exist in many languages, the best known of which is the French le client n'a jamais tort (the customer is never wrong), the slogan of Swiss hotelier César Ritz (1850-1918) whose name lived on in the Hôtel Ritz in Paris, the Ritz and Carlton Hotels in London and the Ritz-Carlton properties dotted around the world.  While not always helpful for staff on the shop floor, it’s an indispensable tool for those basing product manufacturing or distribution decisions on aggregate demand.  To these counters of beans, what it means is that if there is great demand for red widgets and very little for yellow widgets, the solution probably is not to commission an advertising campaign for yellow widgets but to increase production of the red, while reducing or even ceasing runs of the yellow.  The customer is “right” in what they want, not in the sense of “right & wrong” but in the sense of their demand being the way to work out what is the “right” thing to produce because it will sell.

Available at Gullwing Motor Cars: Your choice at US$129,500 apiece.

The notion of “the customer is always right” manifests in the market for pre-modern Ferraris (a pre-1974 introduction the accepted cut-off).  While there is nothing unusual about differential demand in just about any market sector, dramatically is it illustrated among pre-modern Ferraris with some models commanding prices in multiples of others which may be rarer, faster, better credentialed or have a notionally more inviting specification.  That can happen when two different models are of much the same age and in similar condition but a recent offering by New York-based Gullwing Motor Cars juxtaposed two listings which left no doubt where demand exists.  The two were both from 1972: a 365 GTC/4 and a Dino 246 GT.

Some reconditioning required: 1972 Ferrari 365 GTC/4

The 365 GTC/4 was produced for two years between 1971-1972 during which 505 were built.  Although now regarded as a classic of the era, the 365 GTC/4 lives still in the shadow of the illustrious 365 GTB/4 with which, mechanically, it shares much.  The GTB/4 picked up the nickname “Daytona”, an opportunistic association given the 1-2-3 finish in the 1967 24 Hours of Daytona involved three entirely different models while the GTC/4 enjoyed only the less complimentary recognition of being labeled by some il gobbone (the hunchback) or quello alla banana (the banana one).  It was an unfair slight and under the anyway elegant skin, the GTB/4 & GTC/4 shared much, the engine of the latter differing mainly in lacking the dry-sump lubrication, the use of six twin-choke side-draft Weber carburetors rather than the downdrafts, this permitting a lower hood (bonnet) line and a conventionally mounted gearbox rather than the Daytona's rear transaxle.  Revisions to the cylinder heads allowed the V12 to be tuned to deliver torque across a broad rev-range rather than the focus on top-end power which was one of the things which made the Daytona so intoxicating.

Criticizing the GTC/4 because it doesn’t quite have the visceral appeal of the GTB/4 seems rather like casually dismissing the model who managed only to be runner-up to Miss Universe.  The two cars anyway, despite sharing a platform, were intended for different purposes, the GTB/4 an outright high performance road car which could, with relatively few modifications, be competitive in racing whereas the GTC/4 was a grand tourer, even offering occasional rear seating for two (short) people.  One footnote in the history of the marque is the GTC/4 was the last Ferrari offered with the lovely Borrani triple-laced wire wheels; some GTB/4s had them fitted by the factory and a few more were added by dealers but the factory advised that with increasing weight, tyres with much superior grip and higher speeds, they were no longer strong enough in extreme conditions and the cast aluminum units should be used if the car was to be run in environments without speed restrictions such as race tracks or certain de-restricted public roads (then seen mostly in the FRG (Bundesrepublik Deutschland (Federal Republic of Germany; the old West Germany) 1949-1990), Montana & Nevada in the US and Australia's Northern Territory & outback New South Wales (NSW)).  The still stunning GTB/4 was the evolutionary apex of its species; it can't be improved upon but the GTC/4 is no ugly sister and when contemplating quello alla banana, one might reflect on the sexiness of the fruit.

Gullwing’s offering was described as “a highly original unrestored example in Marrone Colorado (Metallic Brown) with a tan leather interior, factory air conditioning, and power windows; showing 48K miles (77K kilometres) on the odometer.  It has been sitting off the road for several years and is not currently running.”  It was certainly highly original and seemed complete but properly should be regarded as a “project” because of the uncertainty about the extent (and thus the cost) of the recommissioning.  At an asking price of US$129,500, it would represent good value only if it was mechanically sound and no unpleasant surprises were found under the body’s shapely curves although, given the market for 365 GTC/4s in good condition, it was a project best taken on by a specialist.

Some assembly required: 1972 Dino 246 GT by Ferrari

The days are gone when the Dino 246 was dismissed as “more of a Fiat than a Ferrari” and even if the factory never put their badge on the things (although plenty subsequently have added one), they are now an accepted part of the range.  The 246 replaced the visually almost similar but slightly smaller and even more jewel-like Dino 206, 152 of which (with an all-aluminium 2.0 litre (122 cubic inch) V6 rather than the V12s which had for some years been de rigueur in Ferrari’s road cars) were built between 1967-1969, all with berlinetta (coupé) bodywork.  Mass-produced by comparison, there were 3,569 Dino 246s produced between 1969-1974, split between 2,295 246 GTs (coupés) & 1,274 246 GTSs (spyders (targa)).  Fitted with an iron-block 2.4 litre (147 cubic inch) V6, the Dinos were designed deliberately to be cheaper to produce and thus enjoy a wider market appeal, the target market being those who bought the more expensive Porsche 911s, a car the Dino (mostly) out-performed.  In recent decades, the Dino 246 has been a stellar performer in the collector market, selling typically for three times the price of something like a 365 GTC/4; people drawn to the seductive lines rather than the significantly better fuel consumption.

Most coveted of the 246s are those described with the rhyming colloquialism “chairs and flares” (C&F to the Ferrari cognoscenti), a reference to a pair of (separately available) options offered on later production Dino 246s.  The options were (1) seats with inserts (sometimes in a contrasting color) in the style used on the Daytona & (2) wider Campagnolo Elektron wheels (which the factory only ever referred to by size) which necessitated flared wheel-arches.  At a combined US$795.00 (in 1974), the C&F combination has proved a good investment, now adding significantly to the price of the anyway highly collectable Dino.  Although it's hard to estimate the added value because so many other factors influence the calculation, all else being equal, the premium is usually between US$100,000-200,000 but these things are always relative; in 1974 the C&F option added 5.2% to a Dino GTS's list price and was just under a third the cost of a new small car such as the Chevrolet Vega.  It was a C&F Dino 246 GTS which in 1978 was found buried in a Los Angeles yard where it had sat for some four years after being secreted away in what turned out to be an unplanned twist to a piece of insurance fraud.  In remarkably good condition (something attributed to its incarceration being during one of California’s many long droughts), it was fully restored.
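
For those who like to check the percentages quoted above, a minimal sketch (in Python) working back from the figures in the text; the implied 1974 list price and the "small car" floor are derived only from the numbers given here, not sourced independently, so treat them as back-of-envelope approximations.

# Back-of-envelope check of the "chairs and flares" figures quoted above.
# All inputs come from the text; the derived numbers are approximations only.

cf_option_price = 795.00      # combined C&F option price, 1974 (US$)
cf_share_of_list = 0.052      # the option reportedly added 5.2% to the list price

# Implied 1974 list price of a Dino 246 GTS (assumption: the 5.2% is of the base list price)
implied_list_price = cf_option_price / cf_share_of_list

# "Just under a third the cost of a new small car" implies the small car cost a little more than this
implied_small_car_floor = cf_option_price * 3

print(f"Implied Dino 246 GTS list price: ~US${implied_list_price:,.0f}")
print(f"A 'new small car' would have cost a little over US${implied_small_car_floor:,.0f}")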

Not in such good condition is the post-incineration Dino 246 GT (not a C&F) being offered by Gullwing Motor Cars, the asking price the same US$129,500 as the 365 GTC/4.  Also built in 1972, it is helpfully described by Gullwing as a “project”, probably one of history’s less necessary announcements.  The company couldn’t resist running the title “Too Hot to Handle” and described the remains as “…an original car that has been completely burnt.  Originally born in Marrone Colorado with beige leather.  It comes with its clear matching title and this car clearly needs complete restoration, but the good news is that it's certainly the cheapest one you will ever find.  The Dino market is hot and shows no signs of cooling. An exciting opportunity to own an iconic 246GT Dino. This deal is on fire!”  It’s still (technically) metal and boasts the prized “matching numbers” (ie the body, engine & gearbox are all stamped with the serial numbers which match the factory records) so there’s that but whether, even at the stratospheric prices Dinos often achieve, the economics of a restoration (that may be the wrong word) can be rationalized would need to be calculated by experts.  As with the 365 GTC/4, Gullwing may be amenable to offers but rather than the customer always being right, this one needs "the right customer".

Aggregate demand: The highly regarded auction site Bring-a-Trailer (BAT, their origin being a clearing house for “projects” although most were less challenging than Gullwing’s Dino) publishes auction results (including “reserve not met” no-sales) and the outcomes demonstrate how much the market lusts for Dinos.  BAT also has a lively comments section for each auction and more than once a thread has evolved to discuss the seeming incongruity of the prices achieved by Dinos compared with the rarer Berlinetta Boxer (365 GT4 BB, 512 BB & 512 BBi) (1973-1984) which when new was much more expensive, faster and, of course, a genuine twelve cylinder Ferrari.  In such markets however, objective breakdowns of specifications and specific performance are not what decide outcomes: The customer is always right.

Digging up: The famous "buried" 1974 Dino 246 GTS, being extracted, Los Angeles, 1978 (left) and the body tag of a (never buried) 1974 Dino 246 GTS.  While it's true the factory never put a "Ferrari badge" on the Dino 206 & 246 (nor did one appear on the early Dino 308s) the Ferrari name does appear on the tags and some parts.  Gullwing's Dino would be a more challenging "project" and even with today's inflated values, the financial viability of a restoration might be dubious. 

Although in recent years the prices paid for the things sharply have spiked, the lure of the Dino is not a recent thing.  In 1978, a 1974 246 GTS was discovered buried in a Los Angeles yard and it transpired it was on the LAPD’s (Los Angeles Police Department) long list of stolen vehicles.  The department’s investigators concluded the burial had been a “rush job” because while it had been covered with carpets and some plastic sheeting in an inexpert attempt to preserve it from the sub-terrain, one window had been left slightly open.  Predictably, the back-story was assumed to be an “insurance scam”, the owner allegedly hiring two “contractors” to “make it disappear” in a manner consistent with car theft, hardly an unusual phenomenon in Los Angeles.  The plan was claimed to be for the Dino to be broken up with all non-traceable (ie not with serial numbers able to be linked to a specific vehicle) parts on-sold with whatever remained to be dumped “somewhere off the coast”.  In theory, the scamming owner would bank his check (cheque) from Farmers Insurance while the “contractors” would keep their “fee for service” plus whatever profits they realized from their “parting-out” which, even at the discount which applies to “fenced” stolen goods, would have been in the thousands; a win-win situation, except for the insurance company and, ultimately, everyone who pays premiums.

Dug Up: The "buried" Dino after restoration.  Two of the Campagnolo wheels are said to be original and the 14 x 7½ wheels & fender flares combo was at the time a US$680.00 option (about a third the cost of a new, small car); their presence can now add US$100,000 to a 246's value so they proved a reasonable investment.

However, it’s said that when driving the Dino, the hired pair found it so seductive they decided to keep it, needing only somewhere to conceal it until they could concoct another plan.  Thus the hasty burial but for whatever reason (the tales differ), they never returned to reclaim the loot and four years later the shallow automotive grave was uncovered after a “tip-off” from a “snitch” (tales of children finding it while “playing in the dirt” an urban myth).  The matter of insurance fraud was of course pursued but no charges were laid because police could not discover who had done the burial and rather than being scrapped and “parted-out” (this time lawfully) as might have been expected, the Dino was sold and restored.  That was possible because it was in surprisingly good condition after its four years in a pit, something accounted for by (1) the low moisture content of the soil, (2) the degree of protection afforded by the covers placed at the time of burial and (3) its time underground coinciding with one of the prolonged droughts which afflict the area.  So, although Dino values were not then what they became, purchased at an attractive price (a reputed US$9000), it was in good enough shape for a restoration to be judged financially viable and it was “matching numbers” (#0786208454-#355468) although that had yet to become a fetish.  The car remains active to this day, still with the Californian licence plate “DUG UP”.

Sandra West with her 1964 Ferrari 330 America.

Cars being buried or otherwise secreted away for fraudulent purposes is a not uncommon practice (some have even contained a dead body or two) but there’s at least one documented case of an individual being, in accordance with a clause in their will, buried in their Ferrari.  Sandra West (née Hara, 1939-1977) became a Beverly Hills socialite after marrying Texas oil millionaire and securities trader Ike West (1934-1968) and as well as jewels and fur coats (then socially acceptable evening wear), she developed a fondness for Ferraris.  Her husband died “in murky circumstances” in a room of the Flamingo Hotel in Las Vegas and while the details of his demise at a youthful 33 seem never to have been published, he had a history of drug use and “health issues” related to his frequent and rapid fluctuations in weight.  His widow inherited some US$5 million (then a considerable fortune) so the LA gossip columnists adjusted their entries from “Mrs West” to “Sandra West, Beverly Hills Socialite and Heiress”.  Her widowed life seems not to have been untroubled and her death in 1977 was certainly drug-related although sources differ about whether it was an overdose of some sort or related to the injuries she’d suffered in an earlier car accident.

Sandra West's burial.  The legal proceedings related to the contested "burial clause" had been well publicized and the ceremony attracted a large crowd.

She left more than one will but a judge ultimately found one to be valid and it included a clause stating she must be buried “…in my lace nightgown … and in my Ferrari with the seat slanted comfortably.”  Accordingly, after a two month delay caused by her brother contesting the “burial clause”, Mrs West’s appropriately attired body was prepared while the Ferrari was sent (under armed guard) by train to Texas where the two were united for their final journey.  Car and owner were then encased in a sturdy timber box measuring 3 metres (10 feet) x 2.7 m (9 feet) x 5.8 m (19 feet) which was transported by truck to San Antonio for the ceremony, conducted on 4 May 1977 in the Alamo Masonic Cemetery (chartered in 1848, the Ancient Free and Accepted Masons in 1854 purchased this property because of the need for a burial ground for Freemasons).  It was an unusual ceremony in that a crane was used carefully to lower the crate into an obviously large grave while to deter “body snatchers” (who would be interested in exhuming car rather than corpse), a Redi-mix truck was on-hand to entomb the box in a thick layer of concrete.  In a nice touch, her grave lies alongside that of her husband and has been on the itinerary of more than one tourist operator running sightseeing tours.  Mrs West owned three Ferraris and it’s not clear in which her body was laid; while most reports claim it was her blue, 1964 330 America (s/n 5055), some mention it as a 250 GTE but 330 America #5055 has not since re-appeared (pre-modern Ferraris carefully are tracked) so that is plausible and reputedly it was “her favourite”.  Inevitably (perhaps sniffing the whiff of a Masonic plot), conspiracy theorists have long pointed out the only documentary evidence is of “a large crate” being lowered into the grave with no proof of what was at the time within.  However, given the burial clause was ordered enforceable by a court, it should be assumed that under the remarkably plain gravestone which gives no indication of the unusual event, rests a Ferrari of some tipo.

Saturday, November 29, 2025

Grammatology

Grammatology (pronounced gram-uh-tol-uh-jee)

(1) Historically, the scientific study of systems of writing.

(2) In latter-day use, a critique of orthodox linguistics.

Early 1800s (in its original sense): The construct was gramma(r) + -t- + -ology; the modern (some would say post-modern) re-purposing was first used in 1967.  Dating from the mid fourteenth century, grammar was from the Middle English gramery & gramere, from the Old French gramaire (classical learning), from the unattested Vulgar Latin grammāria, an alteration of the Classical Latin grammatica, from the Ancient Greek γραμματική (grammatikḗ) (skilled in writing), from γράμμα (gramma) (line of writing), from γράφω (gráphō) (write), from the primitive Indo-European gerbh (to carve, to scratch).  It displaced the native Old English stæfcræft; a doublet of glamour, glamoury, gramarye & grimoire.  In English, grammar is used to describe the system of rules and principles for the structure of a language (or of languages in general) but in colloquial use it’s applied also to morphology (the internal structure of words) and syntax (the structure of phrases and sentences of a language).  In English, generative grammar (the body of rules producing all the sentences permissible in a given language, while excluding all those not permissible) has for centuries been shifting and it’s now something policed by the so-called “grammar Nazis”, some of whom insist on enforcing “rules” regarded by most as defunct as early as the nineteenth century.

The suffix -ology was formed from -o- (as an interconsonantal vowel) +‎ -logy.  The origin in English of the -logy suffix lies with loanwords from the Ancient Greek, usually via Latin and French, where the suffix (-λογία) is an integral part of the word loaned (eg astrology from astrologia) since the sixteenth century.  French picked up -logie from the Latin -logia, from the Ancient Greek -λογία (-logía).  Within Greek, the suffix is an -ία (-ía) abstract from λόγος (lógos) (account, explanation, narrative), and that a verbal noun from λέγω (légō) (I say, speak, converse, tell a story).  In English the suffix became extraordinarily productive, used notably to form names of sciences or disciplines of study, analogous to the names traditionally borrowed from the Latin (eg astrology from astrologia; geology from geologia) and by the late eighteenth century, the practice (despite the disapproval of the pedants) extended to terms with no connection to Greek or Latin such as those building on French or German bases (eg insectology (1766) after the French insectologie; terminology (1801) after the German Terminologie).  Within a few decades of the intrusion of modern languages, combinations emerged using English terms (eg undergroundology (1820); hatology (1837)).  In this evolution, the development may be thought similar to the latter-day proliferation of “-isms” (fascism; feminism et al).  Grammatology & grammatologist are nouns, grammatological is an adjective and grammatologically is an adverb; the noun plural is grammatologies.

Google ngram (a quantitative and not qualitative measure): Because of the way Google harvests data for their ngrams, they’re not literally a tracking of the use of a word in society but can be usefully indicative of certain trends (although one is never quite sure which trend(s)), especially over decades.  As a record of actual aggregate use, ngrams are not wholly reliable because: (1) the sub-set of texts Google uses is slanted towards the scientific & academic and (2) the technical limitations imposed by the use of OCR (optical character recognition) when handling older texts of sometimes dubious legibility (a process AI should improve).  Where numbers bounce around, this may reflect either: (1) peaks and troughs in use for some reason or (2) some quirk in the data harvested.
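
For readers who prefer to poke at raw numbers rather than the rendered chart, a minimal sketch (in Python) of how a relative-frequency curve of the kind the ngram viewer plots can be computed: yearly counts for a word divided by total words published that year.  The two tab-separated input files named here (per-word counts by year, and total words by year) are hypothetical local extracts prepared by the reader, not the exact layout of Google's published exports, so treat the file names and columns as assumptions.

# Minimal sketch: turn raw yearly counts for a word into a relative frequency
# (occurrences per million words), which is what makes different years comparable.
# Assumes two locally prepared, tab-separated files (hypothetical names/layout):
#   grammatology_counts.tsv -> word, year, match_count, volume_count
#   total_counts.tsv        -> year, total_word_count
from collections import defaultdict
import csv

def load_word_counts(path: str) -> dict[int, int]:
    counts: dict[int, int] = defaultdict(int)
    with open(path, newline="", encoding="utf-8") as fh:
        for word, year, match_count, _volumes in csv.reader(fh, delimiter="\t"):
            counts[int(year)] += int(match_count)
    return counts

def load_totals(path: str) -> dict[int, int]:
    totals: dict[int, int] = {}
    with open(path, newline="", encoding="utf-8") as fh:
        for year, total in csv.reader(fh, delimiter="\t"):
            totals[int(year)] = int(total)
    return totals

def per_million(word_counts: dict[int, int], totals: dict[int, int]) -> dict[int, float]:
    # Years missing from either file are skipped rather than guessed at.
    return {y: 1_000_000 * c / totals[y] for y, c in word_counts.items() if y in totals}

if __name__ == "__main__":
    freq = per_million(load_word_counts("grammatology_counts.tsv"),
                       load_totals("total_counts.tsv"))
    for year in sorted(freq):
        print(year, f"{freq[year]:.4f}")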

Grammatology in its re-purposed sense was from the French grammatologie, introduced to the world by French philosopher Jacques Derrida (1930-2004) in his book De la grammatologie (Of Grammatology (1967)).  It may be unfair to treat Derrida’s use as a “re-purposing” because although the word grammatology (literally “the study of writing”) had existed since the early nineteenth century, it was a neologism, one of an expanding class of “-ology” words (some of them coined merely for ironic or humorous effect) and there was prior to 1967 scant evidence of use, those studying languages, literature or linguistics able satisfactorily to undertake their work without much needing “grammatology”.  On the basis of the documents thus far digitized, “grammatology” was never an accepted or even commonly used term in academia and although it seems occasionally to have been used variously in fields related to “the study of writing systems” (apparently as a synonym for paleography, epigraphy, writing-system classification or orthographic description) it was only in passing.  Until the modern era, words “going viral” happened relatively infrequently and certainly slowly and, as used prior to 1967, “grammatology” was attached to no theoretical construct or school of thought and described no defined discipline, the word indicative, empirical and neutral.  If “pre-modern” grammatology could be summed up (a probably dubious exercise), it would be thought a technical term for those concerned with scripts, alphabets, symbols and the historical development of writing systems.  Tempting though it may seem, it cannot be thought of as proto-structuralism.

The novelty Derrida introduced was to argue the need for a discipline examining the history, structure and philosophical implications of writing, his particular contention that writing is not secondary to speech, a notion at odds with centuries of Western metaphysics.  At the time, it was seen as a radical departure from orthodoxy, Derrida exploring (in the broadest imaginable way), the possibilities of writing, not simply the familiar physical inscriptions, but anything that functions as “trace,” “differance,” or symbolic marking, the core argument being writing is not secondary to speech (although in the narrow technical sense it may be consequent); rather, it reveals the instability and “constructedness” of language and thereby meaning.

De la grammatologie (First edition, 1967) by Jacques Derrida.

Ambitiously, what Derrida embarked upon was to do to the study of writing something like what Karl Marx (1818-1883) claimed to have done to the theories of Hegel (Georg Wilhelm Friedrich Hegel (1770-1831)): “turn things on their head”, a process that can be classified under four themes: (1) Writing as prior to speech (as opposed to the earlier “Writing is derivative of speech”).  What this meant was writing had to be considered as “originary”, implying structures of difference could precede both writing and speech. (2) Writing (the act as opposed to the content) as a philosophical concept rather than a finite collection of technical objects to be interpreted or catalogued on the basis of their form of assembly.  (3) Grammatology becomes a critique (as opposed to the earlier descriptive tool) of science, reimagining it as a critical discipline exposing the logocentrism of Western thought.  Logocentrism describes the tendency to prioritize “logos” (in academic use a word encompassing words, speech or reason) as the ultimate foundation for truth and meaning (with speech often privileged over writing).  Logocentrism was at the core of the Western philosophical tradition that assumed language accurately and directly can express an external reality, the companion notion being rational thought represents the highest form of knowledge.  Derrida labelled this a false hierarchy that devalued writing and other non-verbal forms of communication and feeling. (4) Writing is expanded beyond literal inscriptions.  Whereas the traditional Western view had been that writing was simply the use of an alphabet, cuneiform, hieroglyphs and such, what Derrida suggested was the concept of writing should be extended to any system of differences, traces, or marks; the condition for meaning itself.

So Derrida took grammatology from a dusty corner of the academy where it meant (for the small number of souls involved) something like “a hypothetical technical study of writing systems” and re-invented it as a philosophical discipline analysing the deeper structures that make any representation or meaning possible.  The notion of it as a tool of analysis is important because deconstruction, the word Derrida and other “celebrity philosophers” made famous (or infamous depending on one’s stance on things postmodern) is often misunderstood as something like “destruction” when really it is a form of analysis.  Had Derrida’s subversive idea been presented thirty years earlier (had the author been able to find a publisher), it’s possible it would have been ignored or dismissed by the relative few who then read such material.  However, in the post-war years there was an enormous expansion in both the number of universities and the cohorts of academics and students studying in fields which would come to be called “critical theory” so there was a receptive base for ideas overturning orthodoxy, thus the remarkable path deconstruction and postmodernism for decades tracked.

Deconstruction in art, Girl With Balloon by street artist Banksy, before, during & after a (successful) test deconstruction (left) and in its final form (right), London, October 2018.

There is an ephemeral art movement but usually it involves works which wholly are destroyed or entirely disappear.  Banksy’s Girl With Balloon belonged to a sub-category where (1) the deconstruction process was part of the art and (2) the residual elements were “the artwork”.  Banksy’s trick with this one was that as the auctioneer’s hammer fell (at Stg£1m), an electric shredder concealed at the base of the frame was activated, the plan being to reduce the work “to shreds” in a pile below.  However, it’s claimed there was a technical glitch and the shredder stopped mid-shred, meaning half remained untouched and half, neatly sliced, hung from the bottom.  As a headline grabbing stunt it worked well but the alleged glitch worked better still, art experts mostly in agreement the work as “half shredded” was more valuable than had it been “wholly shredded” and certainly more than had it remained untouched in the frame.  Thus: “meaning is just another construct which emerges only through differences and deferrals”.

From a distance of sixty-odd years, in the milieu of the strands of thought which are in a sense part of a “new orthodoxy”, it can be hard to understand just what an impact Derrida and his fellow travellers (and, just as significantly, his critics) had and what an extraordinary contribution deconstruction made to the development in thought of so many fields.  Derrida in 1967 of course did not anticipate the revolutionary movement he was about to trigger, hinted at by his book starting life as a doctoral thesis entitled: De la grammatologie: Essai sur la permanence de concepts platonicien, aristotélicien et scolastique de signe écrit. (Of Grammatology: Essay on the Permanence of Platonic, Aristotelian and Scholastic Concepts of the Written Sign).  A typically indigestible title of the type beloved by academics, the clipping for wider distribution was on the same basis as Adolf Hitler’s (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) publisher deciding Mein Kampf (My Struggle) was snappier than Viereinhalb Jahre (des Kampfes) gegen Lüge, Dummheit und Feigheit (Four and a Half Years [of Struggle] Against Lies, Stupidity and Cowardice).  There’s a reason authors usually don’t have the final say on titles and cover art.

Derrida acknowledged linguistics in the twentieth century had become a sophisticated form of study but maintained the discipline was failing to examine its most fundamental assumptions; indeed his point was those core values couldn’t be re-evaluated because they provided the framework by which language was understood.  What Derrida identified as the superstructure which supported all was the commitment to the primacy of speech and presence and because the prevailing position in linguistics was that speech was primary, the assumption worked to shape all that followed.  It was the influence of the Swiss linguist & semiotician Ferdinand de Saussure (1857–1913) which was profound in positioning speech as the natural, original, living form of language with writing as a secondary, derivative (and, in a sense, artificial although this was never wholly convincing) representation of speech.  What made the Saussurean position seem compelling was it sounded logical, given the consensus it was human speech which predated the development of writing, the latter thus the product of the former and so persuasive was the thesis the hierarchy came to provide the framework for other disciplines within linguistics including phonology (the study of the way sounds function in languages) and morphology (the study of the internal structure of morphemes, the smallest linguistic units within a word able to carry a meaning).  What this meant was syntax was also defined by speech (writing a mere convenient means of exchange) with phonetics (the study of the physical sounds of human speech) the true source of the material of language.  Thus for generations, in academic discourse, historical linguistics was documented primarily by an analysis of changes in sound, with orthography (the methods by which a language or its sounds are represented by written symbols) regarded as a mechanical by-product.

Deconstruction in fashion.  Lindsay Lohan in Theia gown, amfAR gala, New York City, February 2013 (left) and after “deconstruction by scissors” (right).

All gowns are “constructed” (some 3D printed or even “sprayed-on”) but sometimes circumstances demand they be “deconstructed”.  On the night, the shimmering nude and silver bugle-beaded fringe gown from Theia’s spring 2011 collection was much admired but there was an “unfortunate incident” (ie the fabric was torn) and, apparently using a pair of scissors, there was some ad-hoc seamstressery to transform the piece into something described as a “mullet minidress”.  That turned out to be controversial because the gown was on loan for the night but such things are just part of the cost of doing business and, with its Lohanic re-imagining, it’s now an artefact.

Derrida didn’t dispute the historic timelines; his point was that in defining linguistics based on this hierarchy, it became impossible to question the orthodoxy from within.  In a classic example of how deconstruction works, he argued the hierarchy was based not on the historical sequence of events (ie writing coming after speech) but on a culturally defined attachment to the idea of presence, voice and authentic meaning; with speech entrenched in its primacy, no discipline within linguistics was able fully to study writing because of this structural prejudice positioning writing as an auxiliary system, a mere notation of sounds encoding the pre-existing spoken language.  That didn’t mean writing couldn’t be studied (as for centuries it had been) but that it could be considered only a tool or artefact used to record speech and never a primary object of meaning.  While there were all sorts of reasons to be interested in writing, for the reductionists who needed to get to the essence of meaning, writing could only ever be thought something mechanistic and thus was philosophically uninteresting.  So, if linguistics was unable to analyse writing as (1) a structure independent of speech, (2) a fundamental element of thought processes, (3) a source of new or changed meanings or (4) a construct where cultural and philosophical assumptions are revealed, that would imply only speech could create meaning with writing a mere form of its expression.  Daringly thus, what Derrida demanded was for writing to be seen as conceptually prior to speech, even if as a physical phenomenon it came later.  In 1967, linguistics couldn’t do that while maintaining the very foundations on which it was built.

Never has there been published a "Grammatology for Dummies" but there is The Complete Idiot's Guide to Literary Theory and Criticism (2013) by Dr Steven J. Venturino.

At this point things became more technical but Derrida did provide a simplified model, explaining linguistics existed as the study of signs and not of traces, his work depending ultimately on certain distinctions: (1) Signs assume stable signifieds and (2) traces imply meaning is always deferred but never present.  For orthodox linguistics to work, the assumption had to be that signs enjoy a stability of meaning within a system; this Derrida dismissed as illusory arguing (1) meaning is just another construct which emerges only through differences and deferrals, (2) no signified is ever (or can ever fully be) “present” and (3) speech is no closer to meaning than writing.  By its own definitions in 1967, linguistics could not accommodate that because (1) its methods depended on systematic relations sufficiently stable to permit analysis, (2) it needed constant objects (definable units such as phonemes, morphemes and syntactic structures) and (3) it relied on signs which could be described with the required consistency (ie “scientifically”).  Any approach grounded in trace and difference lay beyond the boundaries of orthodox linguistics.

So the conflict would seem irreconcilable but that’s true only if viewed through the lens of a particular method; really, linguistics was empirical and grammatology was philosophical and in that they were alternative rather than competing or even parallel paths.  If linguistics was a system of codification, then grammatology was a critique of the foundations of linguistics and Derrida made clear he was not attempting to reform linguistics simply because that couldn’t be done; any attempt to interpolate his ideas into the discipline would have meant it ceased to be linguistics.  He wanted a new discipline, one which rather than empirically describing and categorising language and its elements, stood back and asked what in the first place made such systems possible.  That meant it was a transcendental rather than empirical process, one studying the conditions of representation and the metaphysics implicit in the idea of signification.  Writing thus was not merely marks on a surface but a marker of a difference in being.

The twist in the tale is that although De la grammatologie was highly influential (especially after an edition appeared in English in 1976), grammatology never became a defined, institutionalised academic field in the way Derrida envisioned, at least supplementing departments of linguistics, anthropology and philosophy.  That was due less to the well-documented phenomenon of institutional inertia than to it proving impossible for any consensus to be reached about what exactly “grammatological analysis” was or what constituted “grammatological research”.  Pleasingly, it was the structuralists who could account for that by explaining grammatology was a critique of the metaphysics underlying other disciplines rather than a method for generating new empirical knowledge.  Fields, they noted, were likely organically to grow as the tools produced were picked up by others to be applied to tasks; grammatology was a toolbox for dismantling tools.

Jacques Derrida with pipe, deconstructing some tobacco.

Even if Derrida’s concepts proved sometimes too vague even for academics, the influence was profound and, whether as a reaction or something deterministic (advances in computer modelling, neurology and such), the discipline of linguistics became more rather than less scientific, the refinements in the field of generative grammar in particular seen as something of a “doubling down” of resistance to Derrida’s critique, something reflected too in anthropology which came even more to value fieldwork and political economy, philosophical critiques of writing thought less helpful.  So the specialists not only clung to their speciality but made it more specialized still.  Grammatology did however help create genuinely new movements in literary theory, the most celebrated (and subsequently derided) being deconstruction where Derrida’s ideas such as interpretation being an infinite play of differences and the meaning of texts being inherently unstable created one of the more radical schools of thought in the post-war West, introducing to study concepts such as the paratext (how academics “read between and beyond the lines”), the trace (the mark of something absent, a concept that disrupts the idea of pure presence and self-contained meaning) and marginalia (used here as an abstract extension of what an author may have “written in the margins” to encompass that which may seem secondary to the main point but is actually crucial to understanding the entire structure of thought, blurring the (literal) line between what lies inside and outside a text).

Derrida for Beginners (2007) by Jim Powell (illustrated by Van Howell).  One has to start somewhere.

The movement became embedded in many English and Comparative Literature departments as well as in post-structuralism and Continental philosophy.  Modern beasts like media studies & cultural theory are (in their understood form) unthinkable without deconstruction and if grammatology didn’t become “a thing”, its core elements (difference, trace etc) for decades flourished (sometimes to the point of (published) absurdity) and although not all agree, some do argue it was Derrida’s subversion in 1967 which saw the field of semiotics emerge to “plug the gaps” left by the rigidity of traditional linguistics.  Of course, even if grammatology proved something of a cul-de-sac, Derrida’s most famous fragment: “Il n'y a pas de hors-texte” (literally “there is no outside-text”) endured to underpin deconstruction and postmodernism generally.  Intriguingly for a concept from linguistics, the phrase took on a new life in the English-speaking world where it came to be understood as “everything is text”, an interpretation which created a minor publishing industry.  In English, it’s a marvellously literalist use and while it does to an extent overlap with the author’s original intention, Derrida meant there is (1) no access to pure, unmediated presence and (2) no meaning outside interpretation and no experience outside context.  In using texte he was referring to the interplay of differences, traces, relations, and contexts that make meaning possible (ie not literally the words as they appear on a page).  What that meant was all acts were “textual” in that they must be interpreted and are intelligible only within systems of meaning; the phrase a philosophical statement about signification and mediation, not characters printed on page.

Fiveable's diagram of what we need to know to understand literature.  Hope this helps.

However, demonstrating (in another way) the power of language, the “everything is text” movement (“cult” may once have been a better word) in English came to be understood as meaning no reality could exist beyond language; everything (literally!) is text because it is words and discourse which both construct and describe reality.  That notion might have remained in an obscure ivory tower were it not for the delicious implication that values such as right & wrong and true & false are also pieces of text with meanings able to be constructed and deconstructed.  That meant there was no stable “truth” and nothing objectively was “wrong”; everything just a construct determined by time, place and circumstances.  That Derrida never endorsed this shocking relativism was noted by some but academics and students found so intoxicating the notion of right & wrong being variables that “everything is text” took on a life of its own as a kind of selective nihilism which is, of course, quite postmodern.  Again, language was responsible because the French texte was from the Latin textus, from texō (weave) and while in French it can mean “text” (in the English sense), among philosophers it was used metaphorically to suggest “weave together”; “an interconnected structure” in the sense of the Latin textus (woven fabric); it was this meaning Derrida used.  Had the English-speaking world remained true to the original spirit of Il n'y a pas de hors-texte it would have entered the textbooks as something like “There is nothing outside the interplay of signs and contexts; There is no meaning outside systems of interpretation” and perhaps have been forgotten but “everything is text” defined and seduced a movement.  Thus, it can be argued things either were “lost in translation” or “transformed by translation” but for the neo-Derridaists there’s the satisfaction of knowing the meaning shift was an example of “grammatology in action”.

Thursday, October 2, 2025

Mnemonic

Mnemonic (pronounced ni-mon-ik)

(1) Something assisting or intended to assist the memory.

(2) Pertaining to mnemonics or to memory.

(3) In computing, truncated code thought easy to remember (eg STO for store).

1660–1670: From the New Latin mnemonicus from the Ancient Greek μνημονικός (mnēmonikós) (of memory) derived from μνήμων (mnḗmōn) (remembering, mindful) & μνᾶσθαι (mnâsthai) (to remember); the ultimate root was the primitive Indo-European men (to think).  The meaning "aiding the memory", a back-formation from mnemonics, dates from 1753, the noun meaning "mnemonic device" is from 1858.  The use in computer programming emerged in the early days of code and was a space-saving (eg del rather than delete) tool as well.  Mnemonical was the original form from the 1660s.  One of the charming ironies of mnemonic is it is one of those words so many can't quite remember how to spell.  It's thus in a sense "antimnemonic" and a contronym (also as auto-antonym, antagonym, or enantiodrome) which describes a word with two opposite or contradictory meanings, depending on context.  Mnemonic is a noun & adjective, mnemonician, mnemonicalist, mnemotechnist & mnemonicon are nouns, mnemonize & mnemonized are verbs, mnemonical & mnemotechnic are adjectives and mnemonically & mnemotechnically are adverbs; the noun plural is mnemonics.
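
To make the computing sense concrete, a minimal sketch (in Python) of the sort of mnemonic-to-opcode table at the heart of any assembler: the programmer writes the short, memorable codes and the tool translates them into the numbers the machine actually executes.  The three-letter mnemonics and hexadecimal opcodes below are illustrative inventions for a toy machine, not those of any real instruction set.

# Toy illustration of mnemonics in computing: short, memorable codes (LOD, STO,
# ADD, JMP) standing in for the raw numeric opcodes a machine actually executes.
# The mnemonics and opcode values here are invented for a hypothetical machine.
MNEMONIC_TO_OPCODE = {
    "LOD": 0x01,  # load a value into the accumulator
    "STO": 0x02,  # store the accumulator to memory ("STO for store")
    "ADD": 0x03,  # add a value to the accumulator
    "JMP": 0x04,  # jump to an address
}

def assemble(lines: list[str]) -> bytes:
    """Translate 'MNEMONIC operand' lines into two-byte machine instructions."""
    program = bytearray()
    for line in lines:
        mnemonic, operand = line.split()
        program.append(MNEMONIC_TO_OPCODE[mnemonic])   # the memorable name...
        program.append(int(operand, 0) & 0xFF)         # ...becomes a plain number
    return bytes(program)

if __name__ == "__main__":
    source = ["LOD 0x10", "ADD 0x11", "STO 0x12"]
    print(assemble(source).hex(" "))   # -> "01 10 03 11 02 12"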

Sans Forgetica

Sans Forgetica sample text.

Recently released, Sans Forgetica (which translates as "without forgetting") is a sans-serif font developed by RMIT University in Melbourne.  Back-slanted and with gaps in the character constructions, it’s designed explicitly to assist readers better to understand and retain in their memory what they’ve read.  Perhaps counter-intuitively for those outside the field, the shape is intended to reduce legibility, thereby (1) lengthening the time taken to read the text and (2) adding complexity to learning and absorbing what’s been read.  Together, they create what in cognitive psychology and neuroscience is called "desirable difficulty", in this case forcing (RMIT might prefer "nudging") people to concentrate.

The first three paragraphs of Lindsay Lohan's Wikipedia page, rendered in Sans Forgetica.  Sans was from the Middle English saunz & sans, from the Old French sans, senz & sens, from the Latin sine (without) conflated with absēns (absent, remote).  Forgetica was an opportunistic coining, the construct being forget + -ica.  Forget was from the Middle English forgeten, forgiten, foryeten & forȝiten, from the Old English forġietan (to forget) (which was influenced by the Old Norse geta (to get; to guess)), from the Proto-West Germanic fragetan (to give up, forget).  The -ica suffix was from the Latin -ica, the neuter plural of -icus (belonging to, derived from; of or pertaining to; connected with).

Usually from a young age, readers become skilled at scanning text, a process helped by most publishers seeking to render their works as legible as possible.  The theory of desirable difficulty is that omitting parts of the font requires the reader to pause and process information more slowly, thus provoking an additional cognitive processing which may enhance both understanding and retention.  While the application of the science to a font is novel, there’s nothing original about Sans Forgetica as a piece of typography, it being described as a hybrid of several existing schools and within the theory, on the basis of a small-group sample of students, it’s claimed to be a balance between legibility and difficulty.  According to the documents supplied by the developer, it’s not been tested as a device for advertisers to draw people to their text, the theory of that being people scan and dismiss (without retention) the great bulk of the large, static signage which is a feature of just about every urban environment.  With Sans Forgetica, because it can’t as quickly be scanned, people will tend longer to linger and so more carefully read the whole; a memorable event itself.

The most recent revision (DSM-5-TR (2022)) to the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM) followed DSM-5 (2013) in refining the somewhat vague section on amnesia in both the DSM-IV (1994) & DSM-IV-TR (2000) where appeared the terms “Psychogenic amnesia” & “dissociative amnesia”, the core element of which was: “one or more episodes of inability to recall important personal information, usually of a traumatic or stressful nature, that is too extensive to be explained by ordinary forgetfulness.”  That really reflected the popular understanding and there was no clear definition of sub-types in the diagnostic criteria although in the text (not always in criteria) there was mention of localized, selective or generalized forms.  In the fifth edition, the disorder was called Dissociative Amnesia (psychogenic amnesia seems to have been replaced) and it was listed in the dissociative disorders section.  The definition still includes an “inability to recall important autobiographical information, usually of a traumatic or stressful nature, that is inconsistent with ordinary forgetting” so the popular understanding remains acknowledged but sub-types are now listed: localized (for specific event(s)), selective (some parts of the event), or generalized (identity and life history) amnesia.  Consistent with the structural revisions elsewhere in the fifth edition, the exclusion criteria were made more explicit (ie the memory loss should not be due to substances, medication, a neurological condition or better accounted for by another mental disorder) although clinicians remain aware of the overlap.  Significantly the DSM-5 did clarify that the amnesia is retrograde (loss of pre-existing memories), especially of the autobiographical kind and emphasised the memory loss is “beyond what is expected from normal forgetting.”  Because in such matters, there will be so much variation between patients, it remains one of those conditions with fuzzy boundaries and the symptoms presented must be assessed on a case-by-case basis.

Amnesia (memory loss) is much studied and although associated with the aging process, traumatic events (brain injury or psychological impacts) and certain neurological conditions, there have been some celebrated cases of recovery without medical intervention.  One such case was that of Rudolf Hess (1894–1987; Nazi Deputy Führer 1933-1941) who in 1941 (on the eve of Germany invading the USSR) flew himself to Scotland in a bizarre and unauthorized attempt to negotiate a peace deal with those in the UK he thought would be "reasonable men".  His "offer" was rejected and he was locked up (including two weeks in the Tower of London), later to be sent as a defendant before the IMT (International Military Tribunal) in the first Nuremberg Trial (1945-1946).  There, so convincing were his symptoms of amnesia and other mental states the judges requested submissions from defence and prosecution counsel on the matter of his fitness to stand trial.  The prosecutors assured the bench Hess would be able to both understand and cope with the proceedings and that an imperfect memory was merely a hindrance to his defence rather than an insuperable obstacle.  This was of course a predictable argument and the judges acceded to the defence’s request for a thorough medical investigation although they declined the suggestion Swiss doctors be consulted, assembling instead a team from medical staff on hand (three Soviet, three American, three British and one French), all from the nations running the trial.  The physicians presented four national papers which broadly were in agreement: Hess was sane (as legally defined) but was suffering from hysterical amnesia, induced by his need to escape from uncomfortable realities, something they found was often typical of “those with Hess’s unstable personality”.  All concluded the amnesia was temporary and would vary in intensity, the US doctors suggesting it may even disappear were any threat of punishment removed.

Caricature of Rudolf Hess at the first Nuremberg Trial by New Zealand-born UK cartoonist David Low (1891-1963).

The author Rebecca West (1892–1983) covered the trial as a journalist and wrote some vivid thumbnail sketches, noting of Hess: “Hess was noticeable because he was so plainly mad: so plainly mad that it seemed shameful that he should be tried.  His skin was ashen and he had that odd faculty, peculiar to lunatics, of falling into strained positions which no normal person could maintain for more than a few minutes, and staying fixed in contortion for hours. He had the classless air characteristic of asylum inmates; evidently his distracted personality had torn up all clues to his past.  He looked as if his mind had no surface, as if every part of it had been blasted away except the depth where the nightmares live.”  Whether or not Hess was "mad" (as such folk were described in 1946) can be debated but to many at the time, he certainly looked a madman.

Predictably unconvinced, Hess’s counsel at a hearing on 30 November 1945 told the bench a defendant could hardly stage an adequate defence if unable to remember names or incidents vital to his case, adding that on the basis of discussions with his client, even if he understood the words, Hess was incapable of grasping the significance of the charges against him.  Nor would a trial in absentia be fair because it would constitute a “grave injustice” were a defendant not present to give evidence or challenge the testimony of witnesses.  He concluded by requesting proceedings against him should be suspended and resumed only if his condition significantly improved.  To that, the British countered with a lengthy lecture on the distinctions in English law between amnesia & insanity and seconded the Soviet view that participation in the trial (and thus the need to make a defence) might well cure his condition.  Essentially, the British argued if he could follow the proceedings, he was fit to stand trial.  The US team noted Hess had at times claimed to be suffering amnesia while in captivity in England between 1941-1945 and on other occasions admitted the condition was simulated.  In the slang of the English criminal bar: “He had a bit of previous”.  The Americans also expressed annoyance at him having repeatedly refused any of the treatment prescribed by the Allied doctors, concluding: “He is in the volunteer class with his amnesia”.  The lawyers having finished, the IMT asked Hess if he wished to speak on the matter.  Without delay, he rose in the dock and walked to the microphone where he addressed the court in a clear and calm voice, his statement coherent, unambiguous and, most historians have concluded, clearly premeditated: “Henceforth my memory will again respond to the outside world.  The reasons for simulating loss of memory were of a tactical nature.  Only my ability to concentrate is, in fact, somewhat reduced.  But my capacity to follow the trial, to defend myself, to put questions to witnesses, or to answer questions myself is not affected thereby.  I also simulated loss of memory in consultations with my officially appointed defence counsel.  He has therefore represented it in good faith.”

He then sat down in what was described as a “stunned courtroom”.  It was, to that point, the trial’s most sensational moment and after taking a few seconds to digest things, the assembled press pack in their dozens rushed outside to file the story (the US military newspaper Stars and Stripes ran the punchy headline “Hess Nuts. Fake Story Fake”).  Immediately, the president of the IMT adjourned the sitting and the judges retired to decide in private whether Hess should be tried.  From their subsequent interviews and writings it appears they were not much influenced by Hess’s unexpected statement but were impressed by the similarity of the conclusions offered by the doctors, the chief US prosecutor saying such “unanimity of medical opinion” was, in his experience: “historically unique”.  All eight judges agreed Hess was fit to stand trial and, after being convicted on two counts ((1) conspiracy to wage aggressive war and (2) waging aggressive war), he was handed a life sentence and would remain incarcerated until in 1987 he committed suicide after some 46 years behind bars, the last two decades of which were served as the sole inmate (guarded by dozens of soldiers on rotation from France, the UK, US and USSR) of Berlin’s sprawling Spandau Prison, a huge facility designed to incarcerate hundreds.

Low’s take on the official German line, which explained Hess’s desertion of the German government as “madness”.  The cartoon does represent what was then the prevailing public perception of the typical appearance expected of those in “lunatic asylums”.  Depicted (left to right) are:

Hermann Göring (1893–1946; leading Nazi 1922-1945, Hitler's designated successor & Reichsmarschall 1940-1945): Committed suicide by crushing between his teeth an ampule of potassium cyanide (KCN), smuggled into his cell in circumstances never confirmed, shortly before he was to be hanged after being convicted on all four counts ((1) conspiracy to wage aggressive war; (2) waging aggressive war; (3) war crimes and (4) crimes against humanity).

Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945): With Eva (née Braun; 1912–1945), his wife of a few hours, committed suicide (he by gunshot and KCN, she by KCN alone) with the tanks of the Red Army only a couple of blocks from the Berlin Führerbunker.

Dr Robert Ley (1890–1945; head of the Deutsche Arbeitsfront (German Labour Front) 1933-1945): Before the trial began, he committed suicide by hanging himself (death coming by suffocation) from the toilet-pipe in his cell in Nuremberg, after having for some years made a reasonable attempt to drink himself to death.  He died with his underpants stuffed in his mouth, decades before the phrase "Eat my shorts!" began to circulate in popular culture.

Joachim von Ribbentrop (1893–1946; Nazi foreign minister 1938-1945): Hanged at Nuremberg after being convicted on all four counts.

Dr Joseph Goebbels (1897-1945; Nazi propaganda minister 1933-1945): With his wife Magda (née Ritschel; 1901-1945), committed suicide (by gunshot) in the courtyard above the Führerbunker, shortly after they’d murdered their six young children.

Heinrich Himmler (1900–1945; Reichsführer SS 1929-1945): Captured by the British while attempting to escape disguised as a soldier, he committed suicide using an ampule of KCN concealed in his mouth.

Whether Hess was at any point insane (in the legal or medical sense) remains debated although, as is often the case, more interesting still is the speculation about just when the instability began.  Whether any credence can be attached to the official statement on the matter from the Nazi Party is doubtful but in the view of the Reich Chancellery, his madness predated his flight to Scotland in 1941 (one of the strangest incidents of World War II (1939-1945)).  What the German press was told to publish was that Hess had become "deluded and deranged", his mental health affected by injuries sustained during World War I (1914-1918) and that he'd fallen under the influence of astrologers.  Just to make that sound convincing, the police conducted a crackdown (a well-oiled technique in the Nazi state) on soothsayers and fortune-tellers.  Dr Joseph Goebbels wasn't consulted before the "madness" explanation was announced and he seems to have been the only senior figure in the regime to grasp the potential implications of revealing to the public that for some time the country's deputy leader had been mad.  Others though did make the connection.  When Hermann Göring tried to shift the blame to aircraft designer and manufacturer Willy Messerschmitt (1898–1978) because he'd provided Hess a twin-engined Bf 110 Zerstörer (destroyer (heavy fighter)) for his flight, the engineer responded by saying Göring was more culpable because he should have done something about having someone unstable serving as Deputy Führer.  Göring could only laugh and told Messerschmitt to go back to building warplanes and, as it turned out, the strange affair was but a "nine-day wonder" for not only did the British make no attempt to use Hess's arrival on their soil for propaganda purposes (which astonished Goebbels) but other events would soon dominate the headlines.  The only place where the strange flight left a great impression was in the Kremlin where comrade Stalin (1878-1953; Soviet leader 1924-1953) for years mulled over who within the British establishment might have conspired with Hess to allow the UK to withdraw from the conflict, leaving Germany able to invade Russia without having to fight on two fronts.  Historians have concluded the British reluctance to exploit Hess's arrival for propaganda stemmed from their concern comrade Stalin might suspect collusion.

Arthur Sinodinos (b 1957; Liberal Party functionary and minister variously 2007-2019; Australian ambassador to the US 2019-2023, right) presenting to Donald Trump (b 1946; US president 2017-2021 and since 2025, left) his credentials as Australia's ambassador to the US, the White House, Washington DC, February 2020.

Less dramatic but perhaps medically even more remarkable than the Hess affair was the recovery from amnesia by Arthur Sinodinos, a case which deserves to enter the annals of academic psychiatry & neurology (and debatably, those of the thespians).  In Australia, royal commissions are public investigations, established by but independent of government.  Not courts, royal commissions are created to enquire into matters of public importance and, within their terms of reference, have broad powers to conduct public & in camera hearings; they can call witnesses, compelling them (under oath) to provide testimony, and they deliver recommendations to government about what should be done, consequent upon their findings.  These can include recommendations for legislative or administrative changes and the prosecution of institutions or individuals, and they’re of great interest because they appear to be the only institution (at least theoretically) able to compel a politician to tell the truth.  Even that power is limited though because when appearing before royal commissions, politicians seem especially prone to suffering amnesia, an obviously distressing condition which compels them frequently to utter phrases like “I can’t remember”, “I don’t recall”, “not in my recollection” etc.  In the lore of the New South Wales (NSW) bar, Mr Sinodinos, while being questioned by an enquiry in 2014, is believed to have set a record for the frequency with which the condition manifested.  Fortunately, the enquiry handed down no adverse findings against him and almost immediately, his memory appeared miraculously to recover, enabling the Australian Liberal Party government to appoint him ambassador to the US in 2019, so there's that.  The following transcript is wholly fake news:

Donald Trump: "What did you and Joe Biden talk about?"

Arthur Sinodinos: "I can't remember."

Donald Trump: "Not to worry, he won't remember either."

In the rich slang of the NSW bar, the condition once known as RCM (Royal Commission Memory) is now also referred to as “Sinodinos Syndrome”, on the model of “Marcinkus Syndrome” which describes the medical status of Roman Catholic priests who, being investigated for this, that or the other, although seemingly fit and healthy, are never able to be certified quite well enough to be interviewed by police or other authorities.  The condition is named after Archbishop Paul Marcinkus (1922–2006; President of the Institute for the Works of Religion (the “Vatican Bank”) 1971-1989).