
Saturday, July 5, 2025

Futurism

Futurism (pronounced fyoo-chuh-riz-uhm)

(1) A movement in avant-garde art, developed originally by a group of Italian artists in 1909 in which forms (derived often from the then novel cubism) were used to represent rapid movement and dynamic motion  (sometimes with initial capital letter)

(2) A style of art, literature, music, etc and a theory of art and life in which violence, power, speed, mechanization or machines, and hostility to the past or to traditional forms of expression were advocated or portrayed (often with initial capital letter).

(3) As futurology, a quasi-discipline practiced by (often self-described) futurologists who attempt to predict future events, movements, technologies etc.

(4) In the theology of Judaism, the Jewish expectation of the messiah in the future rather than recognizing him in the person of Christ.

(5) In the theology of Christianity, eschatological interpretations associating some Biblical prophecies with future events yet to be fulfilled, including the Second Coming.

1909: From the Italian futurismo (literally "futurism" and dating from circa 1909), the construct being futur(e) + -ism.  Future was from the Middle English future & futur, from the Old French futur (that which is to come; the time ahead), from the Latin futūrus (going to be; yet to be) which (as a noun) was the irregular suppletive future participle of esse (to be), from the primitive Indo-European bheue (to be, exist; grow).  It was cognate with the Old English bēo (I become, I will be, I am) and displaced the native Old English tōweard and the Middle English afterhede (future, literally “afterhood”) in the given sense.  The technical use in grammar (of tense) dates from the 1520s.  The –ism suffix was from the Ancient Greek ισμός (ismós) & -isma noun suffixes, often directly, sometimes through the Latin –ismus & -isma (from where English picked up -ize) and sometimes through the French –isme or the German –ismus, all ultimately from the Ancient Greek (where it tended more specifically to express a finished act or thing done).  It appeared in loanwords from Greek, where it was used to form abstract nouns of action, state, condition or doctrine from verbs and, on this model, was used as a productive suffix in the formation of nouns denoting action or practice, state or condition, principles, doctrines, a usage or characteristic, devotion or adherence (criticism; barbarism; Darwinism; despotism; plagiarism; realism; witticism etc).  Futurism & futurology are nouns, futurist is a noun & adjective and futuristic is an adjective; the noun plural is futurisms.

Lindsay Lohan in futuristic eyewear by Maison Martin Margiela (the house founded by Belgian designer Martin Margiela, b 1957).

As a descriptor of the movement in art and literature, futurism (as the Italian futurismo) was adopted in 1909 by the Italian poet Filippo Tommaso Marinetti (1876-1944) and the first reference to futurist (a practitioner in the field of futurism) dates from 1911 although the word had been used as early as 1842 in Protestant theology in the sense of “one who holds that nearly the whole of the Book of Revelation refers principally to events yet to come”.  The secular world did begin to use futurist to describe "one who has (positive) feelings about the future" in 1846 but for the remainder of the century, use was apparently rare.  The (now probably extinct) noun futurity was from the early seventeenth century.  The noun futurology was introduced by Aldous Huxley (1894-1963) in his book Science, Liberty and Peace (1946) and has (for better or worse) created a minor industry of (often self-described) futurologists.  In theology, the adjective futuristic came into use in 1856 with reference to prophecy but use soon faded.  In concert with futurism, by 1915 it referred in art to “avant-garde; ultra-modern” while by 1921 it was separated from the exclusive attachment to art and meant also “pertaining to the future, predicted to be in the future”, the use in this context spiking rapidly after World War II (1939-1945) when technological developments in fields such as ballistics, jet aircraft, space exploration, electronics, nuclear physics etc stimulated interest in such progress.

Untouched: Crooked Hillary Clinton (b 1947; US secretary of state 2009-2013) & Bill Clinton (b 1946; US president 1993-2001) with cattle, 92nd Annual Hopkinton State Fair, Contoocook, New Hampshire, September 2007.

Futures, financial instruments used in the trade of currencies and commodities, appeared first in 1880; they allow (1) speculators to bet on price movements and (2) producers and sellers to hedge against price movements and in both cases profits (and losses) can be booked against movement up or down.  Futures trading can be lucrative but is also risky, those who win gaining from those who lose, and those in the markets are usually professionals.  The story behind crooked Hillary Clinton's extraordinary profits in cattle futures (not a field in which she’d previously (or has subsequently) displayed interest or expertise) while “serving” as First Lady of Arkansas (1979–1981 & 1983–1992) remains murky but it can certainly be said that for an apparently “amateur” dabbling in a market played usually by experienced professionals, she was remarkably successful and while perhaps there was some luck involved, her trading record was such it’s a wonder she didn’t take it up as a career.  While many analysts have, based on what documents are available, commented on crooked Hillary’s somewhat improbable (and apparently sometime “irregular”) foray into cattle futures, there was never an “official governmental investigation” by an independent authority and thus no adverse findings have ever been published.

The Arrival (1913), oil on canvas by Christopher Richard Wynne Nevinson (1889-1946), Tate Gallery.

Given what would unfold during the twentieth century, it’s probably difficult now to appreciate quite how optimistic the Western world was in the years leading up to World War I (1914-1918).  Such had been the rapidity of the discovery of novelties and of progress in so many fields that expectations of the future were high and, beginning in Italy, futurism was a movement devoted to displaying the energy, dynamism and power of machines and the vitality and change they were bringing to society.  It’s also often forgotten that when the first futurist exhibition was staged in Paris in 1912, the critical establishment was unimpressed, the elaborate imagery with its opulence of color offending their sense of refinement, by then so attuned to the sparseness of the cubists.

The Hospital Train (1915), oil on canvas by Gino Severini (1883-1966), Stedelijk Museum.

Futurism had debuted with some impact, the Paris newspaper Le Figaro in 1909 publishing the manifesto of the Italian poet Filippo Tommaso Marinetti, which dismissed all that was old and celebrated change, originality and innovation in culture and society, something which should be depicted in art, music and literature.  Marinetti exulted in the speed and power of the new technologies which were disrupting society: automobiles, aeroplanes and other clattering machines.  Whether he found beauty in the machines or the violence and conflict they delivered was something he left his readers to decide and there were those seduced by both, but his stated goal was the repudiation of traditional values and the destruction of cultural institutions such as museums and libraries.  Whether this was intended as a revolutionary roadmap or just a provocation to inspire anger and controversy is something historians have debated.  Assessment of Marinetti as a poet has always been colored by his reputation as a proto-fascist and some treat as "fake mysticism" his claim that his "visions" of the future and the path to follow to get there came to him in the moment of a violent car crash.

Futurismo: Uomo Nuovo (New Man, 1918), drawing by Mario Sironi (1885-1961).

As a technique, the futurist artists borrowed much from the cubists, deploying the same fragmented and intersecting plane surfaces and outlines to render a number of simultaneous, overlaid views of an object but whereas the cubists tended to still life, portraiture and other, usually static, studies of the human form, the futurists worshiped movement, their overlays a device to depict rhythmic spatial repetitions of an object’s outlines during movement.  People did appear in futurist works but usually they weren’t the focal point, instead appearing only in relation to some speeding or noisy machine.  Some of the most prolific of the futurist artists were killed in World War I and as a political movement it didn’t survive the conflict, the industrial war dulling the public appetite for the cult of the machine.  However, the influence of the compositional techniques continued in the 1920s and contributed to art deco which, in more elegant form, would integrate the new world of machines and mass-production into motifs still in use today.

Motociclista (Motorcyclist, circa 1924), oil on canvas by Mario Sironi.

By the early twentieth century when the Futurism movement emerged, machines and mechanisms were already hundreds of years old (indeed the precursor devices pre-date Christ) but what had changed was that the new generations of machines had become sexy (at least in the eyes of men), associated as they were with something beyond mere functionalism: speed and style.  While planes, trains & automobiles all attracted the futurists, the motorcycle was a much-favored motif because it possessed an intimacy beyond other forms of transportation in that, literally, it was more an extension of the human body, the rider at speed conforming to the shape of a structure fashioned for aerodynamic efficiency, with hands and feet directly attached to the vital controls: machine as extension of man.

The Modern Boy No. 100, Vol 4, Week Ending 4 January, 1930.

The Modern Boy (1928-1939) was, as the name implies, a British magazine targeted at males aged 12-18 and the content reflected the state of mind in the society of the inter-war years, the 1930s a curious decade of progress, regression, hope and despair.  Although much of what filled the pages (guns, military conquest and other exploits, fast cars and motorcycles, stuff the British were doing in other peoples’ countries) would today see the editors cancelled or visited by one of the many organs of the British state concerned with the suppression of such things, it was what readers (presumably with the acquiescence of their parents) wanted.  Best remembered of the authors whose works appeared in The Modern Boy was Captain W.E. Johns (1893–1968), a World War I RFC (Royal Flying Corps) pilot who created the fictional air-adventurer Biggles.  The first Biggles tale appeared in 1932 in Popular Flying magazine and his stories are still sometimes re-printed (although with the blatant racism edited out).  The first Biggles story had a very modern-sounding title: The White Fokker.  The Modern Boy was a successful weekly which in 1938 was re-launched as Modern Boy, the reason for the change not known although dropping superfluous words (and much else) was a feature of modernism.  In October 1939, a few weeks after the outbreak of World War II, publication ceased, Modern Boy like many titles a victim of restrictions by the Board of Trade on the supply of paper for civilian use.

Jockey Club Innovation Tower, Hong Kong (2013) by Zaha Hadid (1950-2016).

If the characteristics of futurism in art were identifiable (though not always admired), in architecture it can be hard to tell where modernism ends and futurism begins.  Aesthetics aside, the core purpose of modernism was of course its utilitarian value and that did tend to dictate the austerity, straight lines and crisp geometry that evolved into mid-century minimalism, so modernism, in its pure form, should probably be thought of as a style without an ulterior motive.  Futurist architecture however carried an agenda which in its earliest days borrowed from the futurist artists in that it was an assault on the past but later moved on, and in the twenty-first century the futurist architects seem now to be interested above all in the possibilities offered by advances in structural engineering, functionality sacrificed if need be just to demonstrate that something new can be done.  That's doubtless of great interest at awards dinners where architects give prizes to each other for this and that but has produced an international consensus that it's better to draw something new than something elegant.  The critique is that while modernism once offered “less is more”, with neo-futurist architecture it's now “less is bore”.  Art deco and mid-century modernism have aged well and it will be interesting to see how history judges the neo-futurists.

Monday, June 2, 2025

Asperger

Asperger (pronounced a-spuh-guh or a-spr-gr)

(1) In neo-paganism and modern witchcraft, a ceremonial bundle of herbs or a perforated object used to sprinkle water (in spells as “witches water”), usually at the commencement of a ritual.

(2) In neurology, as Asperger's syndrome (less commonly Asperger syndrome), an autism-related developmental disorder characterised by sustained impairment in social interaction and non-verbal communication and by repetitive behaviour as well as restricted interests and routines.  The condition was named after Austrian pediatrician Hans Asperger (1906–1980).

Pre-1300: The surname Asperger was of German origin and was toponymic (derived from a geographical location or feature).  The town of Asperg lies in what is now the district of Ludwigsburg, Baden-Württemberg, in south-west Germany and in German, appending the suffix “-er” can denote being “from a place”; Asperger thus deconstructs as “someone from Asperg” and in modern use would suggest ancestral ties to the town of Asperg or a similar-sounding locality.  Etymologically, Asperg may be derived from older Germanic or Latin roots, possibly meaning “rough hill” or “stony mountain” (the Latin asper meaning “rough” and the German berg meaning “mountain or hill”).  The term “Asperger’s syndrome” was in 1981 coined by English psychiatrist Lorna Wing (1928–2014), acknowledging the work of Austrian pediatrician Hans Asperger (1906–1980).  Dr Wing was instrumental in the creation of the National Autistic Society, a charity which has operated since 1962.  Asperger is a noun (capitalized if in any context used as a proper noun).  Aspergerian & Aspergic are nouns; the noun plural forms being Aspergers, Aspergerians & Aspergics.  In the literature, Aspergerian & Aspergic ((1) of, related to, or having qualities similar to those of Asperger's syndrome (adjective) & (2) someone with Asperger's syndrome (noun)) appear both to have been used.  In general use “Asperger's” was the accepted ellipsis of Asperger's syndrome while the derogatory slang forms included Aspie, autie, sperg, sperglord & assburger, now all regarded as offensive in the same way “retard” is now proscribed.

The noun asperges described a sprinkling ritual of the Catholic Church; the name was applied also to an antiphon intoned or sung during the ceremony.  It was from the Late Latin asperges, noun use of the second-person singular future indicative of aspergere (to scatter, strew upon, sprinkle), the construct being ad (to, towards, at) + spargere (to sprinkle).  The use in Church Latin was a learned borrowing from the Latin aspergō (to scatter or strew something or someone; to splash over; to spot, stain, sully, asperse; besmirch; (figuratively) to bestow, bequeath something to, set apart for), the construct being ad- +‎ spargō (strew, scatter; sprinkle; moisten).  The origin lay in the phrase Asperges me, Domine, hyssopo et mundabor (Thou shalt sprinkle me, O Lord, with hyssop, and I shall be cleansed), from the 51st Psalm (in the Vulgate), sung during the rite of sprinkling a congregation with holy water.  Hyssop (any of a number of aromatic bushy herbs) was from the Latin hȳsōpum, from the Ancient Greek ὕσσωπος (hússōpos), of Semitic origin, and the idea was that one would be cleansed of one’s sins.  In the Old English the loan-translation of the Latin aspergere was onstregdan.

The three most recent popes demonstrate their aspergillum (also spelled aspergill) technique while performing the sprinkling rite.  In the more elaborate rituals, it's often used in conjunction with a container called an aspersorium (holy water bucket).  Benedict XVI (1927–2022; pope 2005-2013, pope emeritus 2013-2022, left), Francis (1936-2025; pope 2013-2025, centre) and Leo XIV (b 1955; pope since 2025, right).

In the Christian liturgy, an aspergillum was used to sprinkle holy water, and the borrowing, adaptation and re-purposing of ceremonies, feast days and such from paganism was widely practiced by the early Church.  In the Bible (notably chapter 14 in the Old Testament’s Book of Leviticus) there are descriptions of purification rituals involving the use of cedar wood, hyssop, and scarlet wool to create an instrument for sprinkling blood or water and historians sometimes cite this as a “proto-aspergillum”.  While it seems the earliest known use in English of “aspergillum” dates from 1649, the documentary evidence is clear the practice in the Christian liturgy was ancient and common since at least the tenth century.  Exactly when the ritualistic practice began isn’t known but because water is so obviously something used “to cleanse”, it’s likely it was a part of religious rituals for millennia before Christianity.

The use of the “asperger” in neo-paganism & witchcraft was a continuation of the concept and is well documented in the remarkably prolific literature (some book shops have dedicated sections) devoted to modern witchcraft; the construction of the objects (a bundle of fresh herbs or a perforated object for sprinkling water) is a lineal descendant of the aspergillum of the Medieval church and that makes sense, both institutions devoted to the process of cleansing although the targets may have differed.  According to Ancient Pathways Witchcraft (which sounds an authoritative source), although it’s the fluid which does the cleansing, the asperger is significant because it symbolizes “the transformative and cleansing properties of water…”, rinsing away “…spiritual debris that might interfere with the sanctity of rituals.”  In both neo-paganism and witchcraft, the herbs used may vary and while, pragmatically, sometimes this was dictated by seasonal or geographical availability, priests and witches would also choose the composition based on some “unique essences” being better suited to “enhance the sacred water's effectiveness”.  Nor were herbs always used for, as in the rituals of the church, “an asperger might be a metal or wooden rod designed with perforations or an attached mesh”, something like a small brush or a dedicated holy water sprinkler “akin to those seen in Christian liturgy”.  Again, it was the sprinkling of the water which was the critical element in the process, the devices really delivery systems which, regardless of form, existed to transform simple water into “a divine medium of purity and transformation”.  That said, their history of use did vest them with tradition, especially when certain herbs were central to a spell.

Dr Hans Asperger at work, Children's Clinic, University of Vienna, circa 1935.

The term “Asperger’s syndrome” first appeared in a 1981 paper by English psychiatrist Lorna Wing (1928–2014), from which point it entered the medical mainstream.  Dr Wing (who in 1962 was one of the founders of the charitable organization the National Autistic Society) named it after Austrian pediatrician Hans Asperger (1906–1980) who first described the condition in 1944, calling it autistischen Psychopathen (autistic psychopathy).  The German autistischen was an inflection of autistisch (autistic), the construct being Autist (autistic) +‎ -isch (an adjectival suffix).

The English word autism was from the German Autismus, used in 1913 by Swiss psychiatrist and eugenicist Eugen Bleuler (1857-1939), the first known instance dating from 1907, coined as an alternative to the earlier “auto-erotism” of Swiss psychiatrist & psychotherapist Carl Jung (1875-1961), although in his book Dementia Praecox, oder Gruppe der Schizophrenien (Precocious Dementia, or Group of Schizophrenias, 1911) Bleuler differentiated the terms.  The construct of the word was the Ancient Greek αὐτός (autos) (self) + -ισμός (-ismós) (a suffix used to form abstract nouns of action, state or condition equivalent to “-ism”).  Being a time of rapid advances in the relatively new discipline of psychiatry, it was a time also of linguistic innovation, Dr Bleuler in a Berlin lecture in 1908 using the term “schizophrenia”, something he’d been using in Switzerland for a year to replace “dementia praecox”, coined by German psychiatrist Emil Kraepelin (1856-1926).  What Dr Bleuler in 1913 meant by “autistic” was very different from the modern understanding in that to him it was a symptom of schizophrenia, not an identifiably separate condition.  In the UK, the profession picked this up and it was used to describe “a tendency to turn inward and become absorbed in one's own mental and emotional life, often at the expense of connection to the external world” while “autistic thinking” referred to those who were “self-absorbed, fantasy-driven, and detached from reality”; thinking patterns commonly seen in those suffering schizophrenia.

Looking Up was the monthly newsletter of the International Autism Association and in Volume 4, Number 4 (2006), it was reported Lindsay Lohan’s car had blocked the drop-off point for Smashbox Cares, a charity devoted to teaching surfing to autistic youngsters.  Arriving at the designated spot at Malibu’s Carbon Beach, the volunteers were delayed in their attempt to disembark their charges, something of significance because routine and predictability are important to autistic people.  To make up for it, Ms Lohan staged an impromptu three-hour beach party for the children, appearing as a bikini-clad DJ.  Apparently, it was enjoyed by all.

The modern sense of “autistic” began to emerge in the 1940s, among the first to contribute being the Austrian-American psychiatrist Leo Kanner (1894–1981) who in 1943 published a paper using the phrase “early infantile autism” to describe a distinct syndrome (which now would be understood as autism spectrum disorder).  The following year, in Vienna, Dr Asperger wrote (seemingly influenced by earlier work in Russia) of his observational studies of children, listing the behaviors he associated with the disorder and unlike some working in the field during the 1940s, Dr Asperger wasn’t wholly pessimistic about his young patients, writing in Autistic Psychopathy in Childhood (1944): “The example of autism shows particularly well how even abnormal personalities can be capable of development and adjustment. Possibilities of social integration which one would never have dreamt of may arise in the course of development.”  Many of the documents associated with Dr Asperger’s work were lost (or possibly taken to the Soviet Union) in the chaotic last weeks of World War II (1939-1945) and it wasn’t until Dr Wing in the 1970s reviewed some material from the archives that his contributions began to be appreciated, although not until 1992 did “Asperger’s Syndrome” become a standard diagnosis.

DSM-IV (1994).  Not all in the profession approved of the reclassification of Asperger’s syndrome under the broader Autism Spectrum Disorder, believing it reduced the depth of diagnostic evaluation, flattened complexity and was disconnected from clinical reality.  There was also regret about structural changes, DSM-5 eliminating the multiaxial system (Axes I–V), which some clinicians found useful for organizing information about the patient, especially Axis II (personality disorders) and Axis V (Global Assessment of Functioning).

Asperger’s Syndrome first appeared in the American Psychiatric Association's (APA) classification system when it was added to the fourth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV, 1994) and the utility for clinicians was that it created a sub-group of patients with autism but without a learning disability (ie characterized by deficits in social interaction and restricted interests, in the absence of significant language delay or cognitive impairment), something with obvious implications for treatment.  In the DSM-5 (2013), Autism Spectrum Disorder (ASD) was re-defined as a broader category which combined Asperger syndrome, Autistic Disorder & PDD-NOS (Pervasive Developmental Disorder Not Otherwise Specified) into a single ASD diagnosis, the editors explaining the change as a reflection of an enhanced understanding of the condition, the emphasis now on it being something with varying degrees of severity and presentation rather than distinct types.

However, although after 2013 the term no longer appeared in the DSM, it has remained in popular use, the British military historian Sir Antony Beevor (b 1946) in Ardennes 1944 (2015, an account of the so-called "Battle of the Bulge") speculating of Field Marshal Bernard Montgomery (First Viscount Montgomery of Alamein, 1887–1976) that "one might almost wonder whether [he] suffered from what today would be called high-functioning Asperger syndrome."  The eleventh release of the World Health Organization’s (WHO) International Classification of Diseases (ICD-11) aligned with the DSM-5 and regards what once would have been diagnosed as Asperger’s Syndrome as a relatively mild manifestation of ASD.  The diagnostic criteria for ASD focus on deficits in social communication and interaction, as well as repetitive behaviors and interests.  Although no longer current, the DSM-IV’s criteria for Asperger's Disorder remain of interest because while the label is no longer used, clinicians still need to distinguish between those on the spectrum suffering some degree of learning disability and those not so affected:

DSM-IV diagnostic criteria for Asperger’s Disorder (299.80).

A. Qualitative impairment in social interaction, as manifested by at least two of the following:

(1) marked impairments in the use of multiple nonverbal behaviors such as eye-to-eye gaze, facial expression, body postures, and gestures to regulate social interaction.

(2) failure to develop peer relationships appropriate to developmental level.

(3) a lack of spontaneous seeking to share enjoyment, interests, or achievements with other people (eg by a lack of showing, bringing, or pointing out objects of interest to other people).

(4) lack of social or emotional reciprocity.

B. Restricted repetitive and stereotyped patterns of behavior, interests, and activities, as manifested by at least one of the following:

(1) encompassing preoccupation with one or more stereotyped and restricted patterns of interest that is abnormal either in intensity or focus.

(2) apparently inflexible adherence to specific, non-functional routines or rituals.

(3) stereotyped and repetitive motor mannerisms (eg hand or finger flapping or twisting, or complex whole-body movements).

(4) persistent preoccupation with parts of objects.

C. The disturbance causes clinically significant impairment in social, occupational, or other important areas of functioning.

D. There is no clinically significant general delay in language (eg single words used by age 2 years, communicative phrases used by age 3 years).

E. There is no clinically significant delay in cognitive development or in the development of age-appropriate self-help skills, adaptive behavior (other than social interaction), and curiosity about the environment in childhood.

F. Criteria are not met for another specific Pervasive Developmental Disorder or Schizophrenia.

The term in the twenty-first century became controversial after revelations of some of Dr Asperger's activities during the Third Reich (Austria annexed by Germany in 1938) which included his clinic in Vienna sending selected children to be victims of Aktion T4 (a mass-murder programme of involuntary euthanasia targeting those with disabilities), an operation which ran at times in parallel with the programmes designed to exterminate the Jews, Gypsies, homosexuals and others.  While there is no surviving documentary evidence directly linking Dr Asperger to the selection process which decided which children were to be killed, researchers have concluded the records suggest his construction of what came later to be called “Asperger’s syndrome” was actually that very process with an academic gloss.  Because those Dr Asperger so categorized were the autistic children without learning difficulties, they were deemed capable of being “cured” and thus spared from the T4 lists, unlike the “uneducable” who would never be able to be made into useful German citizens.  While the surviving material makes clear Dr Asperger was at least a “fellow traveller” with the Nazi regime, in professional, artistic and academic circles there was nothing unusual or even necessarily sinister about that because in a totalitarian state, people have few other choices if they wish to avoid unpleasantness.  However, it does appear Dr Asperger may have been unusually co-operative with the regime and his pre-1945 publication record suggests sympathy with at least some aspects of the Nazis’ racial theories and eugenics.

Friday, May 30, 2025

Tatterdemalion

Tatterdemalion (pronounced tat-er-di-meyl-yuhn or tat-er-di-mal-yuhn)

(1) A person in tattered clothing; a shabby person.

(2) Ragged; unkempt or dilapidated.

(3) In fashion, (typically as “a tatterdemalion dress” etc), garments styled deliberately frayed or with constructed tears etc (also described as “distressed” or “destroyed”).

(4) A beggar (archaic).

1600–1610: The original spelling was tatter-de-mallian (the “demalion” rhymed with “Italian” in English pronunciation), the construct thus tatter + -demalion, of uncertain origin although the nineteenth century English lexicographer Ebenezer Cobham Brewer (1810-1897) (remembered still for his marvelous Dictionary of Phrase and Fable (1894)) suggested it might be from de maillot (shirt), which does seem compelling.  Rather than the source, tatter is thought to have been a back-formation from tattered, from the Middle English tatered & tatird, from the Old Norse tǫturr.  Originally, it was derived from the noun, but it was later re-analysed as a past participle (the construct being tatter + -ed) and from this came the verb.  As a noun a tatter was "a shred of torn cloth or an individual item of torn and ragged clothing" while the verb implied both (as a transitive) "to destroy an article of clothing by shredding" & (as an intransitive) "to fall into tatters".  Tatterdemalion is a noun & adjective and tatterdemalionism is a noun; the noun plural is tatterdemalions.

In parallel, there was also "tat", borrowed under the Raj from the Hindi टाट (ṭāṭ) (thick canvas) and in English it assumed a variety of meanings including as a clipping of tattoo, as an onomatopoeia referencing the sound made by dice when rolled on a table (which came to be used especially of a loaded die) and as an expression of disapprobation meaning “cheap and vulgar”, either in the context of low-quality goods or sleazy conduct.  The link with "tatty" in the sense of “shabby or ragged clothing” however apparently comes from tat as a clipping of the tatty, a woven mat or screen of gunny cloth made from the fibre of the Corchorus olitorius (jute plant) and noted for its loose, scruffy-looking weave.  The historic synonyms were shoddy, battered, broken, dilapidated, frayed, frazzled, moth-eaten, ragged, raggedy, ripped, ramshackle, rugged, scraggy, seedy, shabby, shaggy, threadbare, torn & unkempt and in the context of the modern fashion industry, distressed & destroyed.  An individual could also be described as a tramp, a ragamuffin, a vagabond, a vagrant, a gypsy or even a slum, some of those terms reflecting class and ethnic prejudice or stereotypes.  Historically, tatterdemalion was also a name for a beggar.

A similar word in Yiddish was שמאַטע‎ (shmate or shmatte and spelled variously as schmatte, schmata, schmatta, schmate, schmutter & shmatta), from the Polish szmata, of uncertain origin but possibly from szmat (a fair amount).  In the Yiddish (and as adopted in Yinglish) it meant (1) a rag, (2) a piece of old clothing & (3) in the slang of the clothing trade, any item of clothing.  That was much more specific than the Polish szmata which meant literally "rag or old, ripped piece of cloth" but was used also figuratively to mean "publication of low journalistic standard" (ie analogous the English slang use of "rag") and in slang to refer to a woman of loose virtue (used as skank, slut etc might be used in English), a sense which transferred to colloquial use in sport to mean "simple shot", "easy goal" etc.

Designer distress: Lindsay Lohan illustrates the look.

Tatterdemalion is certainly a spectrum condition (the comparative “more tatterdemalion”; the superlative “most tatterdemalion”) and this is well illustrated by the adoption of the concept by fashionistas, modern capitalism soon there to supply demand.  In the fashion business, tatterdemalion needs to walk a fine line because tattiness was historically associated with poverty while designers need to provide garments which convey a message of wealth.  The general term for such garments is “distressed” although “destroyed” is (rather misleadingly) also used.

Highly qualified porn star Busty Buffy (b 1996) in “cut-off” denim shorts with leather braces while beltless.

The ancestor of designer tatterdemalion was a pair of “cut off” denim shorts, improvised not as a fashion statement but as a form of economy, gaining a little more life from a pair of jeans which had deteriorated beyond the point where mending was viable.  Until the counter-culture movements of the 1960s (which really began the previous decade but didn’t until the 1960s assume an expression in mass-market fashion trends), wearing cut-off jeans or clothing obviously patched and repaired generally was a marker of poverty although common in rural areas and among the industrial working class where it was just part of life.  It was only in the 1960s, when an anti-consumerist, anti-materialist vibe attracted the large cohort of youth created by the post-war “baby boom”, that obviously frayed or torn clothing came to be an expression of disregard or even disdain for the prevailing standards of neatness (although paradoxically they were the richest “young generation” ever).  It was the punk movement in the 1970s which took this to whatever extremes seemed possible, the distinctive look of garments with rips and tears secured with safety pins so emblematic of (often confected) rebellion that in certain circles it remains to this day part of the “uniform”.  The fashion industry of course noted the trend and what would later be called “distressed” denim appeared in the lines of many mainstream manufacturers as early as the 1980s, often paired with the acid-washing and stone-washing which previously had been used to make a pair of jeans appear “older”, sometimes a desired look.

Dolce & Gabbana Distressed Jeans (part number FTCGGDG8ET8S9001), US$1150.

That it started with denim makes sense because it's the ultimate "classless" fabric in that it's worn by both rich and poor and while that has advantages for manufacturers, it does mean some are compelled to find ways to ensure buyers are able (blatantly or with some subtlety) to advertise that what they are wearing is expensive; while no fashion house seems yet to have put the RRP (recommended retail price) on a leather patch, it may be only a matter of time.  The marketing of jeans which even when new gave the appearance of having been “broken in” by the wearer was by the 1970s a defined niche, the quasi-vintage look of “fade & age” achieved with processes such as stone washing, enzyme washing, acid washing, sandblasting, emerizing and micro-sanding but this was just to create an effect, the fabrics not ripped or torn.  Distressed jeans represented the next step in the normal process of wear, fraying hems and seams, irregular fading and rips & tears now part of the aesthetic.  As an industrial process that’s not difficult to do but if done in the wrong way it won’t resemble exactly a pair of jeans subject to gradual degradation because different legs would have worn the denim at different places.  In the 2010s, the look spread to T-shirts and (predictably) hoodies, some manufacturers going beyond mere verisimilitude to (sort of) genuine authenticity, achieving the desired decorative effect by shooting shirts with bullets, managing a look which presumably the usual tricks of “nibbling & slashing” couldn’t quite emulate.  Warming to the idea, the Japanese label Zoo released jeans made from material torn by lions and tigers, the company anxious to mention the big cats in Tokyo Zoo seemed to "enjoy the fun" and to anyone who has seen a kitten with a skein of wool, that will sound plausible.
Others emulated the working-class look, the “caked-on muddy coating” and “oil and grease smears” another variant although one apparently short-lived; appearing dirty apparently never a fashionable choice.  All these looks had of course been seen for centuries, worn mostly by the poor with little choice but to eke a little more wear from their shabby clothes but in the late twentieth century, as wealth overtook Western society, the look was adopted by many with disposable income; firstly the bohemians, hippies and other anti-materialists, then the punk movement which needed motifs with some capacity to shock, something harder to achieve than had once been the case.

Distressed top and bottom.  Gigi Hadid (b 1995) in distressed T-shirt and "boyfriend" jeans.

For poets and punks, improvising the look from the stocks of thrift shops, that was fine but for designer labels selling scruffy-looking jeans for four-figure sums, it was more of a challenge, especially as the social media generation had discovered that above all they liked authenticity and faux authenticity would not do, nobody wanting to look like they were trying too hard.  That might have seemed a problem, given the look was inherently fake, but the aesthetic didn’t matter for its own sake; all that had to be denoted was “conspicuous consumption” (the excessive spending on wasteful goods as proof of wealth) and the juxtaposition of thousand-dollar distressed jeans with the odd expensive accessory achieved that and more, the discontinuities offering irony as a look.  The labels, the prominence of which remained a focus, were enough for the message to work although one does wonder if any of the majors have been tempted to print a QR code on the back pocket, linked to the RRP because what people are really trying to say is “My jeans cost US$1200”.

1962 AC Shelby American Cobra (CSX 2000), interior detail, 2016.

The value of selective scruffiness is well known in other fields.  When selling a car, usually a tatty interior greatly will depress the price (sometimes by even more than the cost of rectification).  However, if the tattiness is of some historic significance, it can add to a car’s value, the best example being if the deterioration is part of a vehicle's provenance and proof of originality, a prized attribute to the segment of the collector market known as the “originality police”.  In 2016, what is recognized as the very first Shelby American AC Cobra (CSX 2000) sold for US$13.75 million, becoming the highest price realized at auction for what is classified as an "American car".  Built in 1962, it was an AC Ace shipped to California without an engine (and apparently not AC's original "proof-of-concept" test bed which was fitted with one of the short-lived 221 cubic inch (3.6 litre) versions of Ford's new "thin-wall" Windsor V8) where the Shelby operation installed a 260 cubic inch (4.2 litre) Windsor and the rest is history.  The tatterdemalion state of the interior was advertised as one of the features of the car, confirming its status as “an untouched survivor”.  Among Cobra collectors, patina caused by Carroll Shelby's (1923–2012) butt is a most valuable tatterdemalion.

Patina plus and beyond buffing out: Juan Manuel Fangio, Mercedes-Benz W196R Stromlinienwagen (Streamliner), British Grand Prix, Silverstone, 17 July 1954.

Also recommended to be repaired before sale are dents, anything battered unlikely to attract a premium.  However, if a dent was put there by a Formula One (F1) world champion, it becomes a historic artefact.  In 1954, Mercedes-Benz astounded all when their new grand prix car (the W196R) appeared with all-enveloping bodywork, allowed because of a since closed loophole in the rule-book.  The sensuous shape made the rest of the field look antiquated although underneath it was a curious mix of old and new, the fuel-injection and desmodromic valve train representing cutting edge technology while the swing axles and drum brakes spoke to the past and present, the engineers’ beloved straight-eight configuration (its last appearance in F1) definitely the end of an era.  On fast tracks like Monza, the aerodynamic bodywork delivered great speed and stability but the limitations were exposed when the team ran the Stromlinienwagen at tighter circuits and in the 1954 British Grand Prix at Silverstone, Juan Manuel Fangio (1911–1995; winner of five F1 world-championship driver's titles) managed to clout a couple of oil-drums (those and bales of hay being how track safety was then done) because it was so much harder to determine the extremities without being able to see the front wheels.  Quickly, the factory concocted a functional (though visually unremarkable) open-wheel version and the sleek original was thereafter used only on the circuits where the highest speeds were achieved.  In 1954, the factory was unconcerned with the historic potential of the dents and repaired the tatterdemalion W196R so an artefact of the era was lost.  That apart, as used cars the W196s have held their value well, an open-wheel version selling at auction in 2013 for US$29.7 million while in 2025 a Stromlinienwagen realized US$53.9 million.

1966 Ferrari 330 GTC (1966-1968) restored by Bell Sport & Classic.  Many restored Ferraris of the pre-1973 era are finished to a much higher standard than when they left the showroom.  Despite this, genuine, original "survivors" (warts and all) are much-sought in some circles.

In the collector car industry, tatterdemalion definitely is a spectrum condition and for decades the matter of patina versus perfection has been debated.  There was once the idea that in Europe the preference was for a vehicle to appear naturally aged (well-maintained but showing the wear of decades of use) while the US market leaned towards cars restored to the point of being as good (or better) than they were on the showroom floor.  Social anthropologists might have some fun exploring that perception of difference and it was certainly never a universal rule but the debate continues, as does the argument about “improving” on the original.  Some of the most fancied machinery of the 1950s and 1960s (notably Jaguars, Ferraris and Maseratis) is now a staple of the restoration business but, although when new the machines looked gorgeous, it wasn’t necessary to dig too deep to find often shoddy standards of finish, the practice at the time something like sweeping the dirt “under the rug”.  When "restored", many of these cars are re-built to a higher standard, what was often left rough because it sat unseen somewhere now smoothed to perfection.  That’s what some customers want and the best restoration shops can do either though there are questions about whether what might be described as “fake patina” is quite the done thing.  Mechanics and engineers who were part of building Ferraris in the 1960s, upon looking at some immaculately “restored” cars have been known wryly to remark: “that wasn't how we built them then.”

Gucci offered Distressed Tights at US$190 (for a pair so quite good value).  Rapidly, they sold-out.

The fake patina business however goes back quite a way.  Among antique dealers, it’s now a definite niche but from the point at which the industrial revolution began to create a new moneyed class of mine and factory owners, there was a subset of the new money (and there are cynics who suggest it was mostly at the prodding of their wives) who wished to seem more like old money and a trend began to seek out “aged” furniture with which a man might deck out his (newly acquired) house to look as if things had been in the family for generations.  The notoriously snobbish (and amusing) diarist Alan Clark (1928–1999) once referred to someone as looking like “they had to buy their own chairs”, prompting one aristocrat to respond: “That’s a bit much from someone whose father (the art historian and life peer Kenneth Clark (1903–1983)) had to buy his own castle.”  The old money were of course snooty about such folk and David Lloyd George (1863–1945; UK prime-minister 1916-1922) would lament many of the “jumped-up grocers” in his Liberal Party were more troublesome and less sympathetic to the troubles of the downtrodden than the "backwoodsmen" gentry in their inherited country houses.

Monday, May 12, 2025

Sunroof

Sunroof (pronounced suhn-roof)

(1) A section of an automobile roof (sometimes translucent and historically called a moonroof) which can be slid or lifted open.

(2) In obstetrics, a slang term used by surgeons to describe the Caesarean section.

1952: A compound word, the construct being sun + roof.  Sun was from the Middle English sonne & sunne, from the Old English sunne, from the Proto-West Germanic sunnā, from the Proto-Germanic sunnǭ, from the primitive Indo-European shwen-, oblique of sóhw (sun).  The other forms from the Germanic included the Saterland Frisian Sunne, the West Frisian sinne, the German Low German Sünn, the Dutch zon, the German Sonne and the Icelandic sunna.  The forms which emerged without Germanic influence included the Welsh huan, the Sanskrit स्वर् (svar) and the Avestan xᵛə̄ṇg.  The related forms were sol, Sol, Surya and Helios.  Roof was from the Middle English rof, from the Old English hrōf (roof, ceiling; top, summit; heaven, sky), from the Proto-Germanic hrōfą (roof).  Throughout the English-speaking world, roofs is now the standard plural form of roof.  Rooves does have some history but has long been thought archaic and the idea there would be something to be gained from maintaining rooves as the plural to avoid confusion with roof’s the possessive never received much support.  Despite all that, rooves does seem to appear more than might be expected, presumably because there’s much more tolerance extended to the irregular plural hooves but the lexicographers are unimpressed and insist the model to follow is poof (an onomatopoeia describing a very small explosion, accompanied usually by a puff of smoke), more than one poof correctly being “poofs”.  In use, a poof was understood as a small event but that's obviously a spectrum and some poofs would have been larger than others so it would have been a matter of judgement when something ceased to be a “big poof” and was classed an explosion proper.  Sunroof is a noun (sometimes hyphenated); the noun plural is sunroofs.

1973 Lincoln Continental Mark IV with moonroof.

Sunroofs existed long before 1952 but that was the year the word seems first to have been adopted by manufacturers in Detroit.  The early sunroofs were folding fabric but metal units, increasingly electrically operated, were more prevalent by the early 1970s.  Ford, in 1973, introduced the word moonroof (which was used also as moon roof & moon-roof) to describe the sliding pane of one-way glass mounted in the roof panel over the passenger compartment of the Lincoln Continental Mark IV (1972-1976).  Moonroof soon came to describe any translucent roof panel, fixed or sliding though the term faded from use and all such things tend now to be thought sunroofs.

Open (left) and shut (centre) case: 1976 Lincoln Continental Mark IV (right) with Moonroof.

According to Ford in 1973, a “sunroof” was an opening in the roof with a sliding hatch made from a non-translucent material (metal or vinyl) while a “moonroof” included a hatch made from a transparent or semi-transparent substance (typically then glass).  The advantage the moonroof offered was additional natural light could be enjoyed even if the weather (rain, temperature etc) precluded opening the hatch.  A secondary, internal, sliding hatch (really an extension of the roof lining) enabled the sun to be blocked out if desired and in that configuration the cabin’s ambiance would be the same whether equipped with sunroof, moonroof or no sliding mechanism of any kind.  Advances in materials mean many of what now commonly are called “sunroofs” are (by Ford’s 1973 definition) really moonroofs but use of the latter term is now rare.

Lindsay Lohan standing through a sunroof: Promotional photo-shoot for Herbie Fully Loaded (2005).

Unlike many manufacturers, for many years Volkswagen maintained specific “Sunroof” models in the Beetle (Type 1) range.  When in 1945 the British military occupation forces assumed control of the Volkswagen factory and commenced production of civilian models (those made since 1938 delivered almost exclusively to the German armed forces or Nazi Party functionaries), one of the first organizational changes was to replace Herr Professor Ferdinand Porsche’s (1875–1951) internal type designations with a new set and these included the 115 (Standard Beetle Sunroof Sedan (LHD (left-hand drive))), 116 (Standard Beetle Sunroof Sedan (RHD (right-hand drive))), 117 (Export Deluxe Beetle Sunroof Sedan (LHD)) & 118 (Export Deluxe Beetle Sunroof Sedan (RHD)).  The original sunroof was a folding, fabric apparatus and this remained in use until 1963, a steel, sliding (manually hand-cranked) unit fitted from the release of the 1964 range.  The Beetle used in the original film (The Love Bug (1968)) was a 1963 Sunroof Beetle; at the time they were readily available at low cost but by 2004-2005 when Herbie: Fully Loaded was in production, they were less numerous and some of those used in the filming were actually 1961 models modified (to the extent required in movies) for purposes of continuity.  Interestingly, the one which appears in most scenes seems to be a 1964 model, which implies a folding sunroof was at some point added, not difficult because the kits have long been available.

Caesarean section post-operative scar: C-section scar revision is now a commonly performed procedure.

Manufacturers in the 1970s allocated resources to refine the sunroof because, at the time, the industry’s assumption was the implications of the US NHTSA's (National Highway Traffic Safety Administration) FMVSS (Federal Motor Vehicle Safety Standards) 208 (roll-over protection, published 1970) fully would be realized, outlawing both convertibles and hardtops (certainly the four-door versions).  FMVSS 208 was slated to take effect in late 1975 (when production began of passenger vehicles for the 1976 season) with FMVSS 216 (roof-crush standards) added in 1971 and applying to 1974-onwards models.  There was a “transitional” exemption for convertibles but it ran only until August 1977 (a date agreed with the industry because by then Detroit’s existing convertible lines were scheduled to have reached their EoL (end of life)) at which point the roll-over and roof-crush standards universally would be applied to passenger vehicles, meaning the only way a “convertible” could be registered for use on public roads was if it was some interpretation of the “targa” concept (Porsche 911, Chevrolet Corvette etc), included what was, in effect, a roll-cage (Triumph Stag) or (then more speculatively) some sort of device which in the event of a roll-over would automatically be activated to afford occupants the mandated level of protection and Mercedes-Benz later would include such a device on the R129 SL roadster (1989-2001).  Although in 1988 there were not yet “pop-ups” on the internet to annoy us, quickly the press dubbed the R129’s innovative safety feature a “pop-up roll bar”; the factory called the apparatus automatischer Überrollbügel (automatic rollover bar).  It was spring loaded and pyrotechnically activated, designed fully to deploy in less than a half-second if sensors detected an impending rollover although the safety-conscious could at any time raise it by pressing one of the R129’s many buttons.


Alternative approaches (partial toplessness): 1973 Triumph Stag in Magenta (left) and 1972 Porsche 911 Targa in silver (right).  The lovely but flawed Stag (1970-1977) actually needed its built-in roll cage for structural rigidity because its underpinnings substantially were unchanged from the Triumph 2000 sedan (1963-1977) on which it was based.

Despite the myths which grew to surround the temporary extinction of convertibles from Detroit’s production lines, at the time, the industry was at best indifferent about their demise and happily would have offered immediately to kill the breed as a trade-off for a relaxation or abandonment of other looming safety standards.  As motoring conditions changed and the cost of installing air-conditioning fell, convertible sales had since the mid-1960s been in decline and the availability of the style had been pruned from many lines.  Because of the additional engineering required (strengthening the platform, elaborate folding roofs with electric motors), keeping them in the range was justifiable only if volumes were high and it was obvious to all the trend was downwards, thus the industry being sanguine about the species loss.  That attitude didn’t however extend to a number of British and European manufacturers which had since the early post-war years found the US market a place both receptive and lucrative for their roadsters and cabriolets; for some, their presence in the US was sustained only by drop-top sales.  By the 1970s, the very existence of the charming (if antiquated) MG & Triumph roadsters was predicated upon US sales.


High tech approach (prophylactic toplessness): Mercedes-Benz advertising for the R129 roadster (in the factory's Sicherheitsorange (safety orange) used for test vehicles).

The play on words uses the German wunderbar (“wonderful” and pronounced vuhn-dah-baah) with a placement and context so an English speaking audience would read the word as “wonder bar”; it made for better advertising copy than the heading: Automatischer Überrollbügel.  This was a time when the corporate tag-line “Engineered like no other car” was still a reasonable assertion.  It had been the spectre of US legislation which accounted for Mercedes-Benz not including a cabriolet when the S-Class (W116) was released in 1972, leaving the SL (R107; 1971-1989) roadster as the company’s only open car and it wasn’t until 1990 a four-seat cabriolet returned with the debut of the A124.

Chrysler was already in the courts attempting to have a number of the upcoming regulations overturned (focusing on those for which compliance would be most costly, particularly barrier crash and passive safety requirements) so instead of filing their own suit, a consortium of foreign manufacturers (including British Leyland & Fiat) sought to “append themselves” to the case, lodging a petition seeking judicial review of roll-over and roof-crush standards, arguing that in their present form (ie FMVSS 208 & 216), their application unfairly would render unlawful the convertible category (on which the profitability of their US operations depended).  A federal appeals court late in 1972 agreed and referred the matter back to the NHTSA for revision, ordering the agency to ensure the standard “…does not in fact serve to eliminate convertibles and sports cars from the United States new car market.”  The court’s edict was the basis for the NHTSA making convertibles permanently exempt from roll-over & roof crush regulations.  That ensured the foreign roadsters & cabriolets lived on but although the ruling would have enabled Detroit to remain in the market, it regarded the segment as one in apparently terminal decline and had no interest in allocating resources to develop new models, happily letting existing lines expire.

The “last American convertible” ceremony, Cadillac Clark Street Assembly Plant, Detroit, Michigan, 21 April 1976.

One potential “special case” may have been the Cadillac Eldorado which by 1975 was the only one of the few remaining big US convertibles still selling in reasonable numbers but the platform was in its final years and with no guarantee a version based on the new, smaller Eldorado (to debut in 1978) would enjoy similar success, General Motors (GM) decided it wasn’t worth the trouble but, sensing a “market opportunity”, promoted the 1976 model as the “Last American convertible”.  Sales spiked, some to buyers who purchased the things as investments, assuming in years to come they’d have a collectable and book a tidy profit on-selling to those who wanted a (no longer available) big drop-top.  Not only did GM use the phrase as a marketing hook; when the last of the 1976 run rolled off the Detroit production line on 21 April, the PR department, having recognized a photo opportunity, conducted a ceremony, complete with a “THE END OF AN ERA 1916-1976” banner and a “LAST” Michigan license plate.  The final 200 Fleetwood Eldorado convertibles were “white on white on white”, identically finished in white with white soft-tops, white leather seat trim with red piping, white wheel covers, red carpeting & a red instrument panel; red and blue hood (bonnet) accent stripes marked the nation’s bicentennial year.


Of course in 1984 a convertible returned to the Cadillac catalogue so some of those who had stashed away their 1976 models under wraps in climate controlled garages weren’t best pleased and litigation ensued, a class action filed against GM alleging the use of the (now clearly incorrect) phrase “Last American Convertible” had been “deceptive or misleading” in that it induced the plaintiffs to enter a contract which they’d not otherwise have undertaken.  The suit was dismissed on the basis of there being insufficient legal grounds to support the claim, the court ruling the phrase was a “non-actionable opinion” rather than a “factual claim”, supporting GM's contention it had been a creative expression rather than a strict statement of fact and thus did not fulfil the criteria for a “deceptive advertising” violation.  Additionally, the court found there was no actual harm caused to the class of plaintiffs as they failed to show they had suffered economic loss or that the advertisement had led them to make a purchase they would not otherwise have made.  That aspect of the judgment has since been criticized with dark hints it was one of those “what’s good for General Motors is good for the country” moments but the documentary evidence did suggest GM at the time genuinely believed the statement to be true and no action was possible against the government on several grounds, including the doctrines of remoteness and unforeseeability.

Ronald Reagan (1911-2004; US president 1981-1989) in riding boots & spurs with 1938 LaSalle Series 50 Convertible Coupe (one of 819 produced that year), Warner Brothers Studios, Burbank, California, 1941.

LaSalle was the lower-priced (although marketed more as "sporty") "companion marque" to Cadillac and a survivor of GM's (Great Depression-induced) 1931 cull of brand-names, the last LaSalle produced in 1940.  Mr Reagan remained fond of Cadillacs and when president was instrumental in shifting the White House's presidential fleet to them from Lincolns.  Although doubtlessly Mr Reagan had fond memories of top-down motoring in sunny California (climate change not yet making things too hot, too often for them to be enjoyed in summer) and was a champion (for better and worse) of de-regulation, it's an urban myth he lobbied to ensure convertibles weren't banned in the US.

Following Lindsay Lohan's example: President Xi standing through a sunroof, reviewing military parade in Hongqi L5 state limousine, Beijing, 2019.

The highlight of the ceremonies marking the 70th anniversary of the founding of the People's Republic of China (PRC) was the military parade, held in Beijing on 1 October 2019.  Claimed to be the largest military parade and mass pageant in China's 4,000-odd year history (and the last mass gathering in China prior to the outbreak in Wuhan of what became the COVID-19 pandemic), the formations were reviewed by the ruling Chinese Communist Party’s (CCP) General Secretary Xi Jinping (b 1953; paramount leader of the PRC since 2012).  The assembled crowd was said without exception to be “enthusiastic and happy” and the general secretary's conspicuously well-cut Mao suit was a nice nostalgic touch.

Two generals of the Belarus army take the salute standing, in Hongqi L5 Parade Convertibles, Minsk, Belarus, June 2017.

Independence Day in Belarus is celebrated annually on 3 July and there is always a significant military component.  Other than the PRC, Belarus is the only known operator of the Hongqi and the four-door convertible parade cars were apparently a "gift" (as opposed to foreign aid) from the Chinese government but the aspect of this photograph which attracted some comment was whether the hats worn by generals in Belarus were bigger than the famously imposing headwear of the army of the DPRK (Democratic People’s Republic of Korea (North Korea)); analysts of military millinery appeared to conclude the dimensions were similar.  Purists traditionally describe this style of coach-work as "four-door cabriolet" and it was "Cabriolet D" in the Daimler-Benz system but the "parade convertible" is a distinct breed and often includes features such as grab bars for those standing, microphones and loud-speakers so the “enthusiastic and happy” crowd miss not one word.

Hongqi L5 state limousine.

The car carrying President Xi was the Hongqi L5, the state limousine of the PRC, the coachwork styling a deliberately retro homage to the Hongqi CA770, the last in the line (dating from 1958) of large cars built almost exclusively for use by the upper echelons of the CCP.  Most of the earlier cars were built on the large platforms US manufacturers used in the 1960s and were powered by a variety of US-sourced V8 engines but the L5 was wholly an indigenous product, built with both a 6.0 litre (365 cubic inch) V12 and 4.0 litre (245 cubic inch) V8 although neither configuration is intended for high performance.  Interestingly, although Hongqi has produced a version of the L5 with four-door convertible coachwork as a formal parade car and they have been used both in the PRC and in Belarus, the general secretary conducted his review in a closed vehicle with a sunroof.

US President Richard Nixon (1913-1994; US president 1969-1974) with Anwar Sadat (1918–1981; President of Egypt 1970-1981) in a 1967 Cadillac convertible, Alexandria, Egypt, June 1974.  On that day, the motorcade was 180-strong and unlike the reception his appearances in the US by then induced, the Egyptian crowd really did seem genuinely enthusiastic and happy.  Within two months, in disgrace because of his part in the Watergate Affair, Nixon would resign.

The CCP didn’t comment on the choice of a car with a sunroof and it may have been made on technical grounds, the provision of a microphone array presumably easier with the roof available as a mounting point and given the motorcade travelled at a higher speed than a traditional parade, it would also have provided a more stable platform for the general secretary.  It’s not thought there was any concern about security, Xi Jinping (for a variety of reasons) safer in his capital than many leaders although heads of state and government became notably more reticent about travelling in open-topped vehicles after John Kennedy (1917–1963; US president 1961-1963) was assassinated in 1963.  Some, perhaps encouraged by Richard Nixon being greeted by cheering crowds in 1974 when driven through the streets of Alexandria (a potent reminder of how things have changed) in a Cadillac convertible, persisted but after the attempt on the life of John Paul II (1920–2005; pope 1978-2005) in 1981, there’s been a trend to roofs all the way, sometimes molded in translucent materials of increasing chemical complexity to afford some protection from assassins.

Military parade marking the 70th anniversary of the founding of the PRC, Beijing, China, 1 October 2019.  Great set-piece military parades like those conducted by the PRC and DPRK (recalling the spectacles staged by both Nazi Germany (1933-1945) and the Soviet Union (1922-1991)) are now packaged for television and distribution on streaming platforms and it may be Donald Trump (b 1946; US president 2017-2021 and since 2025) was hoping the "Grand Military Parade" he scheduled in 2025 for his 79th birthday (ostensibly to celebrate 250 years since the formation of the US Army) would display the same impressive precision in choreography.

Covering all possibilities in the 24-hour cycle: US advertisement (1974) for the Renault 17 Gordini Coupe Convertible, the Gordini tag adopted as a "re-brand" of the top-of-the-range R17 (1971-1979).  Gordini was a French sports car producer and tuning house, absorbed by Renault in 1968, the name used from time to time for high-performance variants of various Renault models.

Renault over the decades made occasional forays into the tempting US market but all ended badly in one way or another, its products, whatever their sometimes real virtues, tending not to suit US driving habits and conditions.  Sunroofs had long been popular in Europe and, noting (1) what was assumed to be the demise of the convertible and (2) Lincoln's coining of "moon roof", Renault decided Americans deserved a sunroof, moonroof & starroof, all in one.  Actually, they got even more because there was also a removable fibreglass hardtop for the winter months, Renault correctly concluding there would be little demand for a rainroof.  Physically large as it had to be, the 17's panel was intended, unlike a targa top but like other hardtops, to be stored in a garage until the warmer months.  One quirk of the R17's nomenclature was that in Italy, in deference to the national heptadecaphobia, the car was sold as the R177, but the Italians showed little more interest than the Americans.

Porsche, sunroofs, weight distribution and centres of gravity 

Porsche 917K, Le Mans, 1970.

Porsche in the early 1970s enjoyed great success in sports car racing with its extraordinary 917 but innovation and speed greatly disturb the clipboard-carriers at the Fédération Internationale de l'Automobile (the FIA; the International Automobile Federation), international sport's dopiest regulatory body.  Inclined instinctively to ban anything interesting, the FIA outlawed the 917 in sports car racing so Porsche turned its gaze to the Can-Am (Canadian-American Challenge Cup) for unlimited-displacement (Group 7) sports cars, then dominated by the McLarens powered by big-displacement Chevrolet V8s.  Unable to enlarge the 917's flat-12 to match the power of the V8s and finding its prototype flat-16 too bulky, Porsche resorted to forced aspiration and created what came to be known as the "TurboPanzer", a 917 which in qualifying trim took to the tracks with some 1,500 horsepower (hp).  There's since been nothing quite like it and for two years it dominated the Can-Am until the first oil shock in 1973 put an end to the fun.  However, the lessons learned about turbocharging, the factory would soon put to good use.

The widow-maker: 1979 Porsche 930 Turbo (RoW (rest of the world, ie non-NA (North American) market) model) in the “so 1980s” Guards Red with the “sunroof delete” option.

Although an RoW car, this one has been "federalized" for registration in the US, including the then-required sealed-beam headlights fitted inside the "sugar-scoop" housings.  Curiously, although the term “sunroof delete option” is often applied to the relatively few 930s with solid metal roofs, there was at the time no such 930 option code and, the sunroof being listed as “standard equipment” on 930s, if a customer requested one not be fitted, what the factory did was omit option 9474 (electric sunroof) from the build sheet.  Later, the companion option codes 650 (Sunroof) and 652 (Delete Sunroof) became part of the list for all models.  Rare though it may be in some Porsches, to some the “sunroof delete” thing is surprisingly desirable and in the aftermarket it's possible to purchase “sunroof delete” panels which convert a sunroof-equipped car into one with a solid metal roof.  They are bought usually by those converting road-going cars for track use, the removal of the 29 lb-odd (13 kg) assembly not only saving weight but also lowering the centre of gravity.
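The arithmetic behind the claim is the standard weighted-average rule for a centre of gravity: remove a mass m sitting at height h from a car of mass M with CoG at height H and the new CoG is (MH − mh) / (M − m).  A minimal sketch, in which only the 13 kg panel mass comes from the text above; the kerb weight, CoG height and roof height are illustrative assumptions, not factory figures:

```python
# Weighted-average centre-of-gravity rule: removing a mass at a given
# height shifts the remaining CoG away from that height.

def cog_height_after_removal(total_mass_kg, cog_height_m,
                             part_mass_kg, part_height_m):
    """New CoG height after removing a part: (M*H - m*h) / (M - m)."""
    return (total_mass_kg * cog_height_m - part_mass_kg * part_height_m) / (
        total_mass_kg - part_mass_kg)

# Assumed values: ~1,200 kg kerb weight, CoG ~0.50 m above the road,
# sunroof assembly (13 kg, per the text) ~1.30 m up in the roof.
new_h = cog_height_after_removal(1200, 0.50, 13, 1.30)
print(f"CoG drops by about {(0.50 - new_h) * 1000:.1f} mm")
```

Under those assumptions the drop is well under a centimetre, which squares with the text's "slightly lowering": real but marginal, and of interest mainly at track-day limits of adhesion.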

1977 Porsche 930 “Sunroof Coupé” in Talbot Yellow.

Introduced in 1975, the 911 Turbo (930 being the internal designation) had been intended purely as a homologation exercise (à la the earlier 911 Carrera RS) so the engine could be used in competition but so popular did it prove that it was added to the list as a regular production model and one has remained in the catalogue almost continuously since.  The additional power and its sometimes sudden arrival meant the early versions were famously twitchy at the limit (and such was the power that those limits were easily reached if not long explored), gaining the machine the nickname “widow-maker”.  There was plenty of advice available for drivers, the most useful probably the instruction not to use the same cornering technique as one might in a front-engined car and a caution that even having owned a Volkswagen Beetle as a student might not be preparation enough for a Porsche Turbo.  When stresses are extreme, the physics mean the location of even small amounts of weight becomes subject to a multiplier effect and the advice was that those wishing to explore a 930's limits of adhesion should get one of the rare “sunroof delete” cars, the lack of the additional weight up there slightly lowering the centre of gravity.  However, even that precaution may only have delayed the inevitable and possibly made the consequences worse, one travelling a little faster before the tail-heavy beast misbehaved.

Porsche 911 Carrera S, Pacific Coast Highway in Santa Monica, Los Angeles, June 2012.

Although it seems improbable, when in 2012 Lindsay Lohan crashed a sunroof-equipped Porsche 911 Carrera, it's not impossible the unfortunate event was related to the slight change in the car's centre of gravity a sunroof brings.  She in any case had some bad luck when driving black German cars but clearly Ms Lohan should avoid Porsches with sunroofs.

The interaction of the weight of a 911’s roof (and thus the centre of gravity) and the rearward bias of the weight distribution was not a thing of urban myth or computer simulations.  In the February 1972 edition of the US magazine Car and Driver (C&D), a comparison test was run of the three flavours of the revised 911 (911T, 911E & 911S), using one of each of the available bodies: coupé, targa & sunroof coupé, the latter with the most additional weight in the roof.  What the testers noted in the targa & sunroof-equipped 911s was a greater tendency to twitchiness in corners, something no doubt exacerbated in the sunroof coupé because the sliding panel’s electric motor was installed in the engine bay.  C&D’s conclusion was: “If handling is your goal, it's best to stick with the plain coupe.”  

The Porsche 911 E series and the Ölklappe affair

1971 Porsche 911S; note the flap for the oil filler cap behind the passenger-side door (a US market model and thus left-hand drive (LHD)).  The factory confirmed this car was built in July 1971, despite many references to E series production beginning in August.

Although in C&D's 1972 comparison test there was much focus on the rearward weight bias, the three 911s supplied actually had a slightly less tail-heavy weight distribution than either that season's predecessor or successor.  Porsche in 1971 began the build of its E series update (produced between July 1971-July 1972 and generally known as the “1972 models”) of the then almost decade-old 911 and, in addition to the increase in the flat-six’s displacement from 2.2 litres (134 cubic inch) to 2.3 (143) (although always referred to as the “2.4”), there were myriad changes, some in response to US safety & emissions legislation while others were part of normal product development.  One of the latter was the placing of a hinged flap over the oil filler cap behind the right-side door, something necessitated by the dry sump oil tank having been re-located from behind the right rear wheel to in front of it, one of a number of design changes undertaken to shift the weight distribution forward and improve the handling of the rear-engined machine’s inherently tail-heavy configuration.  In Germany, the addition was known variously as the Ölklappe (oil flap) or the Vierte Tür (fourth door, the fuel filler flap being the third).  Weight reduction (then becoming difficult in the increasingly strict regulatory environment), especially at the rear, was also a design imperative and the early-build E series cars were fitted with an aluminium engine lid and license-plate panel although these components were soon switched to steel because of production difficulties and durability concerns.

Where the troubles began:  The fuel filler flap on the left-front fender (left) and the oil filler flap on the right-rear fender (right).  Apparently, not even the “◀ Oil” sticker in red was sufficient warning.

For the E series 911s, Porsche recommended a multigrade mineral oil (SAE 20W-50 or SAE 15W-40, depending on climate) but was aware those using their cars in competition sometimes preferred a high-viscosity SAE 50 monograde.  With the car’s 10 litre (10.6 US quart, 8.8 Imperial quart) oil tank, the fluid’s weight would be between 8.5-9.1 kg (18.7-20.0 lb) and the physics of motion mean the more rearward the placement of that mass, the greater the effect on the 911’s handling characteristics.  Moving it forward was thus a useful contribution to what would prove a decades-long quest to tame the behaviour of what, in the early versions, was regarded (not wholly unfairly) as handling like “a very fast Volkswagen Beetle”.  Ultimately the engineers succeeded, it being only at speeds which should be confined to race tracks that the 911s of the 2020s sometimes reveal the implications of being rear-engined.
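The mass range quoted above follows directly from the tank volume and the density of mineral oil.  A back-of-envelope sketch; the 10 litre capacity is from the text, while the densities (roughly 0.85-0.91 kg per litre at room temperature) are typical values for mineral oils, not factory specifications:

```python
# Oil mass from tank volume and fluid density: mass = density * volume.

TANK_LITRES = 10.0          # E series 911 dry sump tank capacity
KG_PER_LB = 0.45359237      # exact kg-per-pound conversion factor

def oil_mass_kg(density_kg_per_litre, litres=TANK_LITRES):
    """Mass of a full tank for a given oil density."""
    return density_kg_per_litre * litres

# Assumed illustrative densities spanning light multigrade to heavy monograde.
for name, density in (("light multigrade", 0.85), ("heavy monograde", 0.91)):
    mass = oil_mass_kg(density)
    print(f"{name}: {mass:.1f} kg ({mass / KG_PER_LB:.1f} lb)")
```

That reproduces the 8.5-9.1 kg (roughly 18.7-20 lb) range in the text: the weight of a couple of bowling balls, which is why where the tank sat mattered to the handling.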

VDO instruments in 1971 Porsche 911S.  In home market cars, the oil pressure gauge (to the left of the centrally mounted tachometer) was labelled DRUCK.

However, when in August 1972 the revised F series entered production, the oil tank was back behind the rear wheel and the filler under the engine lid, the retrogressive move taken because there had been instances of gas (petrol) station attendants (they really used to exist) assuming the oil filler flap was the access point for the gas cap and, to be fair, it sat in a location used for fuel on many front-engined cars (a majority of the passenger-car fleet in most markets where Porsche had a presence).  Quite how often this happened isn’t known but it must have been frequent enough for the story to become part of the 911 legend; the consequences of gasoline in the oil tank could be severe and rectification expensive.  The factory paid much attention to oil and ensured drivers could monitor the status of the critical fluid; all air-cooled 911s ran hot and the more highly tuned the model (in 1971-1972 the 911T, E & S in increasing potency), the hotter it ran.  As well as being a lubricant, engine oil functions also as a coolant and the VDO instrumentation included gauges for oil level, oil temperature and oil pressure; for all three to appear in a road car was unusual but, the 911 being air-cooled and thus without a conventional liquid coolant, the oil's dynamics mattered most.