
Wednesday, July 16, 2025

Pavlova

Pavlova (pronounced pav-luh-vuh, pahv-loh-vuh or pah-vluh-vuh (Russian)).

A meringue cake, topped typically with whipped cream and fruit or confections.

Circa 1930: Named after the Russian ballerina Anna (pronounced ah-nuh) Pavlova (1885-1931).  Pavlova is a transliteration of the Russian surname Па́влова (Pávlova), the feminine variant of Па́влов (Pávlov).  Pavlova is a noun and Pavlovian is an adjective; the noun plural is pavlovas.  The standard short form (of the cake) is "pav" and if used as a proper noun, there's an initial capital.


Julia
from Pampered Menial (1975) by Pavlov’s Dog.

Although coined at much the same time, the adjective Pavlovian is unrelated to the Russian ballerina or meringue cakes.  It refers to the theories & experimental work of Russian physiologist Ivan Petrovich Pavlov (Ива́н Петро́вич Па́влов; 1849-1936), especially in connection with the conditioned salivary reflexes of dogs in response to the mental stimulus of the sound of a bell (in the West, his work was in 1911 originally referred to as the “Pavloff method” because of a misunderstanding by editors).  His work was a landmark in experimental behaviorism, inducing a dog associatively to link a biologically potent stimulus (food) with a previously neutral stimulus (a bell).  The phrase “Pavlov’s dog” entered English to describe a conditioned response (reacting to a situation on the basis of taught behavior rather than reflection).  One interesting aspect of comrade Pavlov’s career is he made no secret of his opposition to many aspects of communism in the Soviet state built by comrade Stalin (1878–1953; leader of the USSR 1922-1953), on occasion making his views plain even to the general secretary himself.  Despite that, no action appears ever to have been taken against him and after he died (at 86 of natural causes), he was granted a grand funeral.

Anna Pavlova with Jack.

Anna Pavlova was famous for her interpretation of The Dying Swan, a solo dance choreographed by Mikhail Fokine (1880-1942) to Camille Saint-Saëns's (1835-1921) Le Cygne (The Swan) from Le Carnaval des animaux (The Carnival of the Animals (1922)), commissioned as a pièce d'occasion (an artistic work produced for a special event) for the ballerina who performed it on some 4000 occasions.  It's a short, intense piece which follows the last moments of a swan and for years Ms Pavlova kept a pet swan called Jack.  That she lent her name to a light, meringue-based dessert with a crisp crust and soft, marshmallowy centre was a consequence of the impression she made on tours of Australia & New Zealand during the 1920s.  Such was her elegance, lightness, and grace on stage, the meringue’s airy texture was seen as the culinary expression of her ethereal dancing style, chefs seeking to create something which was at once a thing of swirling style yet also ephemerally fragile.

Rendered by Vovsoft as cartoon character: Lindsay Lohan with a pavlova she'd just whipped up.

New Zealand is a small country in the remote South Pacific which has over the years produced some notable figures such as (1) Lord Rutherford (1871–1937) who, although a physicist who regarded other branches of science as mere applications of engineering which worked within the laws of physics, was awarded the 1908 Nobel Prize in chemistry and is most remembered for his work which led to the atom being split in 1932, (2) Sir Edmund Hillary (1919–2008) who, with the Sherpa mountaineer Tenzing Norgay (1914–1986), was the first to ascend Mount Everest and (3) Sir David Low (1891–1963) who was among the most noted and prolific political cartoonists between the troubled 1930s and the early Cold War years.  The country has also for more than a century fielded what has been usually the world’s most successful rugby union side (the recent inconsistency of the All Blacks notwithstanding) and memories are long, the try disallowed by a Scottish referee in a 1905 test against Wales at Cardiff Arms Park (Wales 3, All Blacks 0) still a sore point.

Mango, passion fruit & limoncello pavlova.

Less bitter but no less contested than the matter of the disallowed try is the origin of the pavlova, the invention of which is claimed by both Australia and New Zealand.  What all agree is the cake is a mixture of egg whites and sugar, topped usually with cream and fresh fruit, named after the Russian ballerina Anna Pavlova who toured both countries during the 1920s.  Researchers on both sides of the Tasman Sea (referred to by locals as “the ditch”) have long trawled cook books and newspapers to find the earliest entry but according to the Oxford English Dictionary (OED), New Zealand appears to hold the evidential advantage, a recipe from there having been verified as published in 1927 while the oldest claimed entry from Australia dates from 1935.  That however resolves only the use of Ms Pavlova’s name as the description: pastry chefs were known to have been adding cream to meringue even in the nineteenth century, and the 1927 recipe in the book Davis Dainty Dishes, published by the Davis Gelatine company, was a multi-colored jelly concoction.  New Zealand’s historians of food concede the culinary point but cite recipes from 1928 & 1929 which are definitely of meringue, cream and fruit.  Strangely perhaps, the OED remained on the lexicographical fence, listing the origin as an ambiguous "Austral. and N.Z."

Espresso martini pavlova

Preparation: 1 hour

Cooking: 2 hours

Serves: 10-12

Ingredients

8 egg whites
Pinch of cream of tartar
1 tablespoon ground coffee powder
430 gm (2 cups) caster sugar
2 tablespoons of corn-flour
1 teaspoon white vinegar
600 ml (1 carton) thickened cream
125 ml (½ cup) coffee liqueur
2 teaspoons cocoa powder
Chocolate-coated coffee beans (to decorate)
Dark chocolate curls (to decorate)
Coffee vodka syrup
2 tablespoons vodka
2 teaspoons arrowroot
100 grams (½ cup, firmly packed) brown sugar
125 ml (½ cup) prepared espresso coffee

Instructions

(1) Preheat oven to 120C (250F), or 100C (210F) fan-forced.  Draw a 200 mm (8 inch) circle on 2 sheets of baking paper.  Place each sheet, marked side down, on a baking tray.

(2) Use an electric beater with a whisk attachment to whisk the egg whites and cream of tartar in a clean dry bowl until firm peaks form.  Gradually whisk in the coffee powder.  Add the sugar, 1 tablespoon at a time, whisking constantly until the sugar dissolves and the mixture is thick and glossy.  Beat in the corn-flour and vinegar.

(3) Divide meringue mixture among the 2 marked circles on the prepared trays. Use a palette knife to spread mixture into 2 evenly shaped discs.  Bake for 2 hours or until meringues are dry and crisp.  Turn off oven. Leave meringues in the oven, with the door slightly ajar, until cooled completely.

(4) Meanwhile, to make the coffee vodka syrup, combine the vodka and arrowroot in a small bowl.  Combine the sugar and coffee in a small saucepan.  Bring to the boil over high heat, stirring, until the sugar dissolves. Reduce heat and simmer for 3 minutes or until the syrup has thickened slightly.  Stir in the vodka mixture and return to the boil, boiling for 1 minute or until thickened.  Remove from heat and transfer to a small bowl and set aside to cool.  Place in the fridge until required.

(5) Use electric beaters to beat the cream in a bowl until soft peaks form. Beat in the coffee liqueur and cocoa until firm peaks form.

(6) Place 1 pavlova disc on a serving plate. Top with half the cream mixture. Drizzle with a little coffee vodka syrup. Scatter with coffee beans and chocolate curls.  Repeat with the remaining disc, cream mixture, syrup, coffee beans and chocolate curls.  Serve.
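For anyone working between temperature scales, the figures in step (1) are simply the standard Celsius-to-Fahrenheit conversion rounded to the nearest 10°F; a minimal sketch (the function name is illustrative):

```python
def c_to_f(celsius):
    """Convert a Celsius temperature to Fahrenheit."""
    return celsius * 9 / 5 + 32

# Conventional oven: 120 C -> 248 F (the recipe rounds this to 250 F)
# Fan-forced oven:   100 C -> 212 F (the recipe rounds this to 210 F)
print(c_to_f(120))  # 248.0
print(c_to_f(100))  # 212.0
```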

A century-odd on, an issue still: Auckland Airport, New Zealand, December 2023.

Monday, June 30, 2025

Bunker

Bunker (pronounced buhng-ker)

(1) A large bin or receptacle; a fixed chest or box.

(2) In military use, historically a fortification set mostly below the surface of the ground with overhead protection provided by logs and earth or by concrete and fitted with above-ground embrasures through which guns may be fired.

(3) A fortification set mostly below the surface of the ground and used for a variety of purposes.

(4) In golf, an obstacle, classically a sand trap but sometimes a mound of dirt, constituting a hazard.

(5) In nautical use, to provide fuel for a vessel.

(6) In nautical use, to convey bulk cargo (except grain) from a vessel to an adjacent storehouse.

(7) In golf, to hit a ball into a bunker.

(8) To equip with or as if with bunkers.

(9) In military use, to place personnel or materiel in a bunker or bunkers (sometimes as “bunker down”).

1755–1760: From the Scottish bonkar (box, chest and also “seat” in the sense of “bench”), of obscure origin, but etymologists conclude the use related to furniture hints at a relationship with banker (bench).  Alternatively, it may be from a Scandinavian source such as the Old Swedish bunke (boards used to protect the cargo of a ship).  The meaning “receptacle for coal aboard a ship” was in use by at least 1839 (coal-burning steamships coming into general use in the 1820s).  The use to describe the obstacles on golf courses is documented from 1824 (probably from the extended sense “earthen seat” which dates from 1805) but perhaps surprisingly, the familiar sense from military use (dug-out fortification) seems not to have appeared before World War I (1914-1918) although the structures so described had for millennia existed.  “Bunkermate” was army slang for the individual with whom one shares a bunker while the now obsolete “bunkerman” (“bunkermen” the plural) referred to someone (often the man in charge) who worked at an industrial coal storage bunker.  Bunker & bunkerage are nouns, bunkering is a noun & verb, bunkered is a verb and bunkerish, bunkeresque, bunkerless & bunkerlike are adjectives; the noun plural is bunkers.

Just as ships called “coalers” were used to transport coal to and from shore-based “coal stations”, it was “oilers” which took oil to storage tanks or out to sea to refuel ships (a common naval procedure) and these STS (ship-to-ship) transfers were called “bunkering” as the black stuff was pumped, bunker-to-bunker.  That the coal used by steamships was stored on-board in compartments called “coal bunkers” led ultimately to another derived term: “bunker oil”.  When in the late nineteenth century ships began the transition from being fuelled by coal to burning oil, the receptacles of course became “oil bunkers” (among sailors nearly always clipped to “bunker”) and as refining processes evolved, the fuel specifically produced for oceangoing ships came to be called “bunker oil”.

Bunker oil is “dirty stuff”, a highly viscous, heavy fuel oil which is essentially the residue of crude oil refining; it’s that which remains after the more refined and volatile products (gasoline (petrol), kerosene, diesel etc) have been extracted.  Until late in the twentieth century, the orthodox view of economists was its use in big ships was a good thing because it was a product for which industry had little other use and, as essentially a by-product, it was relatively cheap.  It came in three flavours: (1) Bunker A: Light fuel oil (similar to a heavy diesel), (2) Bunker B: An oil of intermediate viscosity used in engines larger than marine diesels but smaller than those used in the big ships and (3) Bunker C: Heavy fuel oil used in container ships and such which use VLD (very large displacement), slow running engines with a huge reciprocating mass.  Because of its composition, Bunker C especially produced much pollution and although much of this happened at sea (unseen by most but with obvious implications), when ships reached harbor to dock, all the smoke and soot became obvious.  Over the years, the worst of the pollution from the burning of bunker oil greatly has been reduced (the work underway even before the Greta Thunberg (b 2003) era), sometimes by the simple expedient of spraying a mist of water through the smoke.

Floor-plans of the upper (Vorbunker) and lower (Führerbunker) levels of the structure now commonly referred to collectively as the Führerbunker.

History’s most infamous bunker remains the Berlin Führerbunker in which Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) spent much of the last few months of his life.  In the architectural sense there were a number of Führerbunkers built, one at each of the semi-permanent Führerhauptquartiere (Führer Headquarters) created for the German military campaigns and several others built where required but it’s the one in Berlin which is remembered as “the Führerbunker”.  Before 1944, when the air raids by the RAF (Royal Air Force) and USAAF (US Army Air Force) intensified, the term Führerbunker seems rarely to have been used other than by the architects and others involved in their construction and it wasn’t a designation like Führerhauptquartiere, which the military and other institutions of state shifted between locations (rather as “Air Force One” is attached not to a specific airframe but whatever aircraft in which the US president is travelling).  In subsequent historical writing, the term Führerbunker tends often to be applied to the whole, two-level complex in Berlin and although it was only the lower layer which officially was designated as that, for most purposes the distinction is not significant.  In military documents, after January, 1945 the Führerbunker was referred to as Führerhauptquartiere.

Führerbunker tourist information board, Berlin, Germany.

Only an information board at the intersection of den Ministergärten and Gertrud-Kolmar-Straße, erected by the German Government in 2006 prior to that year's FIFA (Fédération Internationale de Football Association (International Federation of Association Football)) World Cup now marks the place on Berlin's Wilhelmstrasse 77 where once the Führerbunker was located.  The Soviet occupation forces razed the new Reich Chancellery and demolished all the bunker's above-ground structures but the subsequent GDR (Deutsche Demokratische Republik (German Democratic Republic; the old East Germany) 1949-1990) abandoned attempts completely to destroy what lay beneath.  Until after the fall of the Berlin Wall (1961-1989) the site remained unused and neglected, “re-discovered” only during excavations by property developers, the government insisting on the destruction of whatever was uncovered and, sensitive still to the spectre of “Neo-Nazi shrines”, for years the bunker’s location was never divulged, even as unremarkable buildings (an unfortunate aspect of post-unification Berlin) began to appear on the site.  Most of what would have covered the Führerbunker’s footprint is now a supermarket car park.

The first part of the complex to be built was the Vorbunker (upper bunker or forward bunker), an underground facility of reinforced concrete intended only as a temporary air-raid shelter for Hitler and his entourage in the old Reich Chancellery.  Substantially completed during 1936-1937, it was until 1943 listed in documents as the Luftschutzbunker der Reichskanzlei (Reich Chancellery Air-Raid Shelter), the Vorbunker label applied only in 1944 when the lower level (the Führerbunker proper) was appended.  In mid January, 1945, Hitler moved into the Führerbunker and, as the military situation deteriorated, his appearances above ground became less frequent until by late March he rarely saw the sky.  Finally, on 30 April, he committed suicide.

Bunker Busters

Northrop Grumman publicity shot of B2-Spirit from below, showing the twin bomb-bay doors through which the GBU-57 are released.

Awful as they are, there's an undeniable beauty in the engineering of some weapons and it's unfortunate humankind never collectively has resolved exclusively to devote such ingenuity to stuff other than us blowing up each other.  That’s not a new sentiment, being one philosophers and others have for millennia expressed in various ways although since the advent of nuclear weapons, concerns understandably became heightened.  Like every form of military technology ever deployed, once the “genie is out of the bottle” the problem is there to be managed and at the dawn of the atomic age, delivering a lecture in 1936, the British chemist and physicist Francis Aston (1877–1945) (who created the mass spectrograph, winning the 1922 Nobel Prize in Chemistry for his use of it to discover and identify the isotopes in many non-radioactive elements and for his enunciation of the whole number rule) observed:

There are those about us who say that such research should be stopped by law, alleging that man's destructive powers are already large enough.  So, no doubt, the more elderly and ape-like of our ancestors objected to the innovation of cooked food and pointed out the great dangers attending the use of the newly discovered agency, fire.  Personally, I think there is no doubt that sub-atomic energy is available all around us and that one day man will release and control its almost infinite power.  We cannot prevent him from doing so and can only hope that he will not use it exclusively in blowing up his next door neighbor.

The use in June 2025 by the USAF (US Air Force) of fourteen of its Boeing GBU-57 (Guided Bomb Unit-57) Massive Ordnance Penetrators (MOP) bombs against underground targets in Iran (twelve on the Fordow Uranium Enrichment Plant and two on the Natanz nuclear facility) meant “Bunker Buster” hit the headlines.  Carried by the Northrop B-2 Spirit heavy bomber (built between 1989-2000), the GBU-57 is a 14,000 kg (30,000 lb) bomb with a casing designed to withstand the stress of penetrating layers of reinforced concrete or thick rock.  “Bunker buster” bombs have been around for a while, the ancestors of today’s devices first built for the German military early in World War II (1939-1945) and the principle remains unchanged to this day: up-scaled armor-piercing shells.  The initial purpose was to produce a weapon with a casing strong enough to withstand the forces imposed when impacting reinforced concrete structures, the idea simple in that what was needed was a delivery system which could “bust through” whatever protective layers surrounded a target, allowing the explosive charge to do damage where needed rather than wastefully being expended on an outer skin.  The German weapons proved effective but inevitably triggered an “arms race” in that as the war progressed, the concrete layers became thicker, walls over 2 metres (6.6 feet) thick and ceilings of 5 metres (16 feet) being constructed by 1943.  Technological development continued and the idea extended to rocket propelled bombs optimized both for armor-piercing and aerodynamic efficiency, velocity a significant “mass multiplier” which made the weapons still more effective.

USAF test-flight footage of Northrop B2-Spirit dropping two GBU-57 "Bunker Buster" bombs.

Concurrent with this, the British developed the first true “bunker busters”, building on the idea of the naval torpedo which, by exploding a short distance from its target, could be highly damaging because it took advantage of one of the properties of water (quite strange stuff according to those who study it): it doesn’t compress.  What that meant was it was often the “shock wave” of the water rather than the blast itself which could breach a hull, the same principle used for the famous “bouncing bombs” used for the RAF’s “Dambuster” (Operation Chastise, 17 May 1943) raids on German dams.  Because of the way water behaved, it wasn’t necessary to score the “direct hit” which had been the ideal in the early days of aerial warfare.

RAF Bomber Command archive photograph of Avro Lancaster (built between 1941-1946) in flight with Grand Slam mounted (left) and a comparison of the Tallboy & Grand Slam (right), illustrating how the latter was in most respects a scaled-up version of the former.  To carry the big Grand Slams, 32 “B1 Special” Lancasters were in 1945 built with up-rated Rolls-Royce Merlin V12 engines, the removal of the bomb doors (the Grand Slam carried externally, its dimensions exceeding internal capacity), deleted front and mid-upper gun turrets, no radar equipment and a strengthened undercarriage.  Such was the concern with weight (especially for take-off) that just about anything non-essential was removed from the B1 Specials, even three of the four fire axes and its crew door ladder.  In the US, Boeing went through a similar exercise to produce the run of “Silverplate” B-29 Superfortresses able to carry the first A-bombs used in August, 1945. 

Best known of the British devices were the so-called “earthquake bombs”, the Tallboy (12,000 lb; 5.4 ton) & Grand Slam (22,000 lb, 10 ton) which, despite the impressive bulk, were classified by the War Office as “medium capacity”.  The terms “Medium Capacity” (MC) & “High Capacity” (HC) referenced not the gross weight or physical dimensions but the ratio of explosive filler to the total weight of the construction (ie how much was explosive compared to the casing and ancillary components).  Because both had thick casings to ensure penetration deep into hardened targets (bunkers and other structures encased in rock or reinforced concrete) before exploding, the internal dimensions accordingly were reduced compared with the ratio typical of contemporary ordnance.  An HC bomb (a typical “general-purpose” bomb) had a thinner casing and a much higher proportion of explosive (sometimes over 70% of total weight).  These were intended for area bombing (known also as “carpet bombing”) and caused wide blast damage whereas the Tallboy & Grand Slam were penetrative with casings optimized for aerodynamic efficiency, their supersonic travel working as a mass-multiplier.  The Tallboy’s 5,200 lb (2.3 ton) explosive load was some 43% of its gross weight while the Grand Slam’s 9,100 lb (4 ton) absorbed 41%; this may be compared with the “big” 4,000 lb (1.8 ton) HC “Blockbuster” which allocated 75% of the gross weight to its 3,000 lb (1.4 ton) charge.  Like many things in engineering (not just in military matters) the ratio represented a trade-off, the MC design prioritizing penetrative power and structural destruction over blast radius.
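The MC/HC distinction reduces to simple arithmetic; a minimal sketch using only the figures quoted above (all weights in pounds):

```python
# Charge-to-gross-weight ratios for the bombs discussed above (weights in lb).
bombs = {
    "Tallboy (MC)":     (5_200, 12_000),
    "Grand Slam (MC)":  (9_100, 22_000),
    "Blockbuster (HC)": (3_000, 4_000),
}

for name, (charge, gross) in bombs.items():
    # The War Office classification keyed on this ratio, not absolute size.
    print(f"{name}: {charge / gross:.0%} of gross weight is explosive filler")
```

Run as written, this reproduces the percentages in the paragraph above: roughly 43% for the Tallboy, 41% for the Grand Slam and 75% for the Blockbuster.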
The novelty of the Tallboy & Grand Slam was that as earthquake bombs, their destructive potential was able to be unleashed not necessarily by achieving a direct hit on a target but by entering the ground nearby, the explosion (1) creating an underground cavity (a camouflet) and (2) transmitting a shock-wave through the target’s foundations, leading to the structure collapsing into the newly created lacuna. 

The etymology of camouflet has an interesting history in both French and military mining.  Originally it meant “a whiff of smoke in the face” (from a fire or pipe) and in figurative use it was a reference to a snub or slight insult (something unpleasant delivered directly to someone); although the origin is murky, it may be related to the earlier French verb camoufler (to disguise; to mask) which evolved also into “camouflage”.  In the specialized military jargon of siege warfare or mining (sapping), over the seventeenth to nineteenth centuries “camouflet” referred to “an underground explosion that does not break the surface, but collapses enemy tunnels or fortifications by creating a subterranean void or shockwave”.  The use of this tactic is best remembered from the Western Front in World War I, some of the huge craters now tourist attractions.

Under watchful eyes: Grand Ayatollah Ali Khamenei (b 1939; Supreme Leader, Islamic Republic of Iran since 1989) delivering a speech, sitting in front of the official portrait of the republic’s ever-unsmiling founder, Grand Ayatollah Ruhollah Khomeini (1900-1989; Supreme Leader, Islamic Republic of Iran, 1979-1989).  Ayatollah Khamenei seemed in 1989 an improbable choice as Supreme Leader because others were better credentialed but though cautious and uncharismatic, he has proved a great survivor in a troubled region.

Since aerial bombing began to be used as a strategic weapon, of great interest has been the debate over the BDA (battle damage assessment) and this issue emerged almost as soon as the bunker buster attack on Iran was announced, focused on the extent to which the MOPs had damaged the targets, the deepest of which were concealed deep inside a mountain.  BDA is a constantly evolving science and while satellites have made analysis of surface damage highly refined, it’s more difficult to understand what has happened deep underground.  Indeed, it wasn’t until the USSBS (United States Strategic Bombing Survey) teams toured Germany and Japan in 1945-1946, conducting interviews, economic analysis and site surveys that a useful (and substantially accurate) understanding emerged of the effectiveness of bombing although what technological advances have allowed for those with the resources is the so-called “panacea targets” (ie critical infrastructure and such once dismissed by planners because the required precision was for many reasons rarely attainable) can now accurately be targeted, the USAF able to drop a bomb within a few feet of the aiming point.  As the phrase is used by the military, the Fordow Uranium Enrichment Plant is a classic “panacea target” but whether even a technically successful strike will achieve the desired political outcome remains to be seen.

Mr Trump, in a moment of exasperation, posted on Truth Social of Iran & Israel: “We basically have two countries that have been fighting so long and so hard that they don't know what the fuck they're doing."  Actually, both know exactly WTF they're doing; it's just Mr Trump (and many others) would prefer they didn't do it.

Donald Trump (b 1946; US president 2017-2021 and since 2025) claimed “total obliteration” of the targets while Grand Ayatollah Khamenei admitted only there had been “some damage”; which of the two is closer to the truth may one day be revealed.  Even modelling of the effects has probably been inconclusive because the deeper one goes underground, the greater the number of variables in the natural structure and the nature of the internal built environment will also influence blast behaviour.  All experts seem to agree much damage will have been done but what can’t yet be determined is what has been suffered by the facilities which sit as deep as 80 m (260 feet) inside the mountain although, as the name implies, “bunker busters” are designed for buried targets and it’s not always required for blast directly to reach target.  Because the shock-wave can travel through earth & rock, the effect is something like that of an earthquake and if the structure sufficiently is affected, it may be the area can be rendered geologically too unstable again to be used for its original purpose.

Within minutes of the bombing having been announced, legal academics were being interviewed (though not by Fox News) to explain why the attacks were unlawful under international law and in a sign of the times, the White House didn't bother to discuss fine legal points like the distinction between "preventive & pre-emptive strikes", preferring (like Fox News) to focus on the damage done.  However, whatever the murkiness surrounding the BDA, many analysts have concluded that even if before the attacks the Iranian authorities had not approved the creation of a nuclear weapon, this attack will have persuaded them one is essential for “regime survival”, thus the interest in both Tel Aviv and (despite denials) Washington DC in “regime change”.  The consensus seems to be Grand Ayatollah Khamenei had, prior to the strike, not ordered the creation of a nuclear weapon but that all energies were directed towards completing the preliminary steps, thus the enriching of uranium to ten times the level required for use in power generation; the ayatollah liked to keep his options open.  So, the fear of some is the attacks, even if they have (by weeks, months or years) delayed the Islamic Republic’s work on nuclear development, may prove counter-productive in that they convince the ayatollah to concur with the reasoning of every state which since 1945 has adopted an independent nuclear deterrent (IND).  That reasoning was not complex and hasn’t changed since first a prehistoric man picked up a stout stick to wave as a pre-lingual message to potential adversaries, warning them there would be consequences for aggression.  Although a theocracy, those who command power in the Islamic Republic are part of an opaque political institution and in the struggle which has for some time been conducted in anticipation of the death of the aged (and reportedly ailing) Supreme Leader, the matter of “an Iranian IND” is one of the central dynamics.  
Many will be following what unfolds in Tehran and the observers will not be only in Tel Aviv and Washington DC because in the region and beyond, few things focus the mind like the thought of ayatollahs with A-Bombs.

Of the word "bust"

The Great Bust: The Depression of the Thirties (1962) by Jack Lang (left), highly qualified porn star Busty Buffy (b 1996, who has never been accused of misleading advertising, centre) and The people's champion, Mr Lang, bust of Jack Lang, painted cast plaster by an unknown artist, circa 1927, National Portrait Gallery, Canberra, Australia (right).  Remembered for a few things, Jack Lang (1876–1975; premier of the Australian state of New South Wales (NSW) 1925-1927 & 1930-1932) remains best known for having in 1932 been the first head of government in the British Empire to have been sacked by the Crown since William IV (1765–1837; King of the UK 1830-1837) in 1834 dismissed Lord Melbourne (1779–1848; prime minister of the UK 1834 & 1835-1841).

Those learning English must think it at least careless that things can both be (1) “razed to the ground” (totally to destroy something (typically a structure), usually by demolition or incineration) and (2) “raised to the sky” (physically lifted upwards).  The etymologies of “raze” and “raise” differ but they’re pronounced the same so it’s fortunate the spellings vary but in other troublesome examples of unrelated meanings, spelling and pronunciation can align, as in “bust”.  When used in ways most directly related to human anatomy: (1) “a sculptural portrayal of a person's head and shoulders” & (2) “the circumference of a woman's chest around her breasts” there is an etymological link but these uses wholly are unconnected with bust’s other senses.

Bust of Lindsay Lohan in white marble by Stable Diffusion.  Sculptures of just the neck and head came also to be called “busts”, the emphasis on the technique rather than the original definition.

Bust in the sense of “a sculpture of upper torso and head” dates from the 1690s and was from the sixteenth century French buste, from the Italian busto (upper body; torso), from the Latin bustum (funeral monument, tomb (although the original sense was “funeral pyre, place where corpses are burned”)) and it may have emerged (as a shortened form) from ambustum, neuter of ambustus (burned around), past participle of amburere (burn around, scorch), the construct being ambi- (around) + urere (to burn).  The alternative etymology traces a link to the Old Latin boro, the early form of the Classical Latin uro (to burn) and it’s thought the development in Italian was influenced by the Etruscan custom of keeping the ashes of the dead in an urn shaped like the person when alive.  Thus the use, common by the 1720s, of bust (a clipping from the French buste) for “a carving of the trunk of the human body from the chest up”.  From this came the meaning “dimension of the bosom; the measurement around a woman's body at the level of her breasts” and that evolved on the basis of a comparison with the sculptures, the base of which was described as the “bust-line”, the term still used in dress-making (and for other comparative purposes as one of the three “vital statistics” by which women are judged (bust, waist, hips), each circumference having an “ideal range”).  It’s not known when “bust” and “bust-line” came into oral use among dress-makers and related professions but it’s documented since the 1880s.  Derived forms (sometimes hyphenated) include busty (tending to bustiness, thus Busty Buffy's choice of stage-name), overbust & underbust (technical terms in women's fashion referencing specific measurements) and bustier (a tight-fitting women's top which covers (most or all of) the bust).

Benito Mussolini (1883-1945; Duce (leader) & Prime-Minister of Italy 1922-1943) standing beside his “portrait bust” (1926).

The bust was carved by Swiss sculptor Ernest Durig (1894–1962) who gained posthumous notoriety when his career as a forger was revealed with the publication of his drawings which he’d represented as being from the hand of the French sculptor Auguste Rodin (1840-1917) under whom he claimed to have studied.  Mussolini appears here in one of the subsequently much caricatured poses which were a part of his personality cult.  More than one of the Duce's counterparts in other nations was known to have made fun of some of the more outré poses and affectations, the outstretched chin, right hand braced against the hip and straddle-legged stance among the popular motifs. 

“Portrait bust” in marble (circa 1895) of Otto von Bismarck (1815-1898; chancellor of the German Empire (the "Second Reich") 1871-1890) by the German sculptor Reinhold Begas (1831-1911).

In sculpture, what had been known as the “portrait statue” came after the 1690s to be known as the “portrait bust” although both terms meant “sculpture of upper torso and head”.  These proved a popular choice for military figures because the aspect enabled the inclusion of bling such as epaulettes, medals and other decorations and, being depictions of the human figure, busts came to be vested with special significance by the superstitious.  In early 1939, during construction of the new Reich Chancellery in Berlin, workmen dropped one of the busts of Otto von Bismarck by Reinhold Begas, breaking it at the neck.  For decades, the bust had sat in the old Chancellery and the building’s project manager, Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945), knowing Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) believed the Reich Eagle toppling from the post-office building right at the beginning of World War I had been a harbinger of doom for the nation, kept the accident secret, hurriedly issuing a commission to the German sculptor Arno Breker (1900–1991) who carved an exact copy.  To give the fake the necessary patina, it was soaked for a time in strong, black tea, the porous quality of marble enabling the fluid to induce some accelerated aging.  Interestingly, in his (sometimes reliable) memoir (Erinnerungen (Memories or Reminiscences), published in English as Inside the Third Reich (1969)), even the technocratic Speer admitted of the accident: “I felt this as an evil omen”.

The other senses of bust (as a noun, verb & adjective) are diverse (and sometimes diametric opposites) and include: “to break or fail”; “to be caught doing something unlawful / illicit / disgusting etc”; “to debunk”; “dramatically or unexpectedly to succeed”; “to go broke”; “to break in (horses, girlfriends etc)”; “to assault”; the downward portion of an economic cycle (ie “boom & bust”); “the act of effecting an arrest” and “someone (especially in professional sport) who failed to perform to expectation”.  That’s quite a range and that has meant the creation of dozens of idiomatic forms, the best known of which include: “boom & bust”, “busted flush”, “dambuster”, “bunker buster”, “busted arse country”, “drug bust”, “cloud bust”, “belly-busting”, “bust one's ass (or butt)”, “bust a gut”, “bust a move”, “bust a nut”, “bust-down”, “bust loose”, “bust off”, “bust one's balls”, “bust-out”, “sod buster”, “bust the dust”, “myth-busting” and “trend-busting”.  In the sense of “breaking through”, bust was from the Middle English busten, a variant of bursten & bresten (to burst) and may be compared with the Low German basten & barsten (to burst).  Bust in the sense of “break”, “smash”, “fail”, “arrest” etc was a creation of mid-nineteenth century US English and is of uncertain inspiration but most etymologists seem to concur it was likely a modification of “burst” effected with a phonetic alteration, although it’s not impossible it came directly as an imperfect echoic of Germanic speech.  The apparent contradiction of bust meaning both “fail” and “dramatically succeed” happened because the former was an allusion to “being busted” (ie broken) while the latter meaning used the notion of “busting through”.

Monday, June 9, 2025

Glaucus

Glaucus (pronounced gloh-kus)

(1) Bluish-green, grayish-blue, sea-colored (ie of certain seas) or a gleaming pale blue.

(2) Any member of the genus Glaucus of nudibranchiate mollusks, found in the warmer latitudes, swimming in the open sea, strikingly colored with blue and silvery white.  They’re known also as the sea swallow, blue angel, blue glaucus, blue dragon, blue sea slug and blue ocean slug.  If offered the choice, the organisms presumably would prefer to be called swallows, angels or dragons rather than slugs.

(3) A desert lime (Citrus glauca), a thorny shrub species endemic to semi-arid regions of Australia.

From the Ancient Greek γλαυκός (glaukós) (the γλαῦκος (glaûkos) was an edible grey fish although the species is uncertain (perhaps the derbio)) and was taken up by the Medieval Latin as glaucus (“bright, sparkling, gleaming” and “bluish-green”).  There may be an Indo-European root but no link has ever been found and, despite the similarity to other words used to denote gleaming or shimmering light and colors (glow, gleam etc), there’s no known etymological link; it may have been a substratum word from Pre-Greek.  The eighth century BC poet Homer used the Greek glaukos to describe the sea as “gleaming, silvery”, apparently without any suggestion of a specific color but later writers adopted it with a sense of “greenish” (of olive leaves) and “blue; gray” (of eyes).  In English, the adjective glaucous dates from the 1670s and was used to refer to shades of bluish-green or gray; it’s a popular form in botany and ornithology, describing surfaces with a powdery or waxy coating that gives a pale blue-gray appearance.  In fashion, the vagueness of glaucus (especially the adjective glaucous) makes it handy because it can be used to describe eyes or fabrics neither quite blue nor green yet really not suited to being called turquoise, teal, aqua etc.  Glaucus is a noun & adjective; the noun plural is glaucuses.

Translators seem to believe Homer's glauk-opis Athene (Athena Glaukopis) meant “bright-eyed” rather than “gray-eyed” goddess; it was an epithet emphasizing her intelligence and wisdom, the construct being glau(kos) (gleaming, silvery; bluish-green; grey) + opsis (eye; face).  The word γλαύξ (glaux) (little owl) may have been related and linked to the bird’s distinctive, penetrating stare but it may also be from a pre-Greek source.  Owls do however sometimes appear with the goddess in Greek art and, like her, became a symbol of wisdom and intelligence.  The other epithets applied to Athena included Ophthalmitis and Oxyderkous, both references to her sharp, penetrating gaze.  As a descriptor of color, glaucus was applied widely including to eyes, the sea, the sky or fabrics and was used of shining surfaces.  The descendants include the Catalan glauc, the English glaucous, the French glauque, the Italian glauco, the Portuguese glauco, the Romanian glauc and the Spanish glauco.  The Middle English glauk (bluish-green, gray) was in use as late as the early fifteenth century.

Renaissance-era engraving of Athena, the Ancient Greek goddess of wisdom, warfare, and craft, depicted in Corinthian helmet with spear and clothed in a long πέπλος (péplos); her aegis (shield or breastplate), bearing the Gorgon's head, rests nearby.  Athena’s sacred bird, the Athene noctua (little owl) is perched atop a pile of books, symbolizing knowledge & wisdom while the creature at her feet is the chthonic serpent Erichthonius which she raised, used often to stand for the triumph of reason over chaos, thus appearing also as the sacred serpent which protected the Acropolis.  The Greek inscription on the banner reads: ΜΟΧΘΕΙΝ ΑΝΑΓΚΗ ΤΟΥΣ ΘΕΛΟΝΤΑΣ ΕΥ ΠΡΑΤΤΕΙΝ (Those who wish to do well must undergo toil), a classical aphorism often suggested as being a paraphrasing of lines from Pindar or Isocrates, extolling effort and virtue.

In the myths of Antiquity there were many tales of Glaucus and in that the character was not unusual, the figures in the stories sometimes differing in details like parentage, where they lived, the lives they led and even whether they were gods or mortals; sometimes the lives depicted bore little similarity to those in other tales.  The myths in ancient Greece were not a fixed canon in the modern Western literary tradition; they were for centuries passed down orally before being written and in different regions a poet or dramatist was likely to tell a tale differently.  That was not just artistic licence because the stories could be a product people would pay to hear and content providers needed new product.  Additionally, as is a well-documented phenomenon when information is passed on orally over generations, the “Chinese whispers problem” occurs and things, organically, can change.

Lindsay Lohan in glaucous (in the Medieval Latin sense of gleaming as well as the color) John Galliano satin gown, worn with Santoni stilettos, Irish Wish (Netflix, 2024) premiere, Paris Theater, New York City, March, 2024.

Nor was there the modern conception of IP (intellectual property) or copyright in the characters, the myths “belonging” literally to all as a shared public cultural heritage.  Were a poet (Ovid, Homer, Hesiod etc) to “re-imagine” an old myth or use well known characters to populate a new plot, that wasn’t plagiarization but simply a creative act in interpretation or reshaping.  There were social and political determinants in all this: We now refer casually to “Ancient Greece” but it was not a unitary state (a la modern Greece) but an aggregation of city-states with their own distinct cults, local legends and literary traditions.  So, in one region Glaucus might have been depicted as a sea-god while somewhere to the south he was a warrior; a tragedian might make Glaucus tragic, a philosopher might use him as an allegorical device and a poet might map him onto a formulaic tale of jealousy, transformation and redemption.  The best comparison is probably the fictional characters which have entered public domain (as Mickey Mouse recently achieved) and thus become available for anyone to make of what they will.  To be generous, one might suggest what the AI (artificial intelligence) companies now wish to be made lawful (vacuuming up digitized copyright material to train their LLMs (large language models) for commercial gain while not having to pay the original creators or rights holders) is a return to the literary practices of antiquity.

Lindsay Lohan’s eyes naturally (left) are in the glaucus range but with modern contact lenses (right), much is possible.

So it wasn’t so much that writers felt free to adapt myths to suit their purposes but rather it would never have occurred to them there was anything strange in doing exactly that.  Significantly, any author was at any time free to create a wholly new cast for their story but just as movie producers know a film with “bankable” stars has a greater chance of success than one with talented unknowns, the temptation must have been to avoid risking market resistance and “stick to the classics”.  Additionally, what’s never been entirely certain is the extent to which the poets who wrote down what they heard were inclined to “improve” things.  The myths were in a sense entertainment but they were often also morality tales, psychological studies or statements of political ideology, a medium for exploring fate, identity, love, betrayal, divine justice and other vicissitudes of life.  The very modern notion of “authorship” would have been unfamiliar in Antiquity, a ποιητής (author; poet) being someone who “shaped” rather than “owned” them and Homer (who may not have been a single individual) was revered not because he “made up” the Trojan War, but because masterfully he recounted it, just as now historians who write vivid histories are valued. 

Some of the many lives of Glaucus (Γλαύκος)

(1) He was the son of Antenor who helped Paris abduct Helen and to punish him, his father drove him out.  He fought against the Greeks, and was said sometimes to have been slain by Agamemnon but the more common version is he was saved by Odysseus and Menelaus because, as the son of Antenor, he was bound to them by ties of friendship.

(2) He was the son of Hippolochus and grandson of Bellerophon and with his cousin Sarpedon, he commanded the Lycian contingent at Troy.  In the fighting around the city, he found himself face to face with Diomedes but both recalled their families were bound by ties of friendship so the two exchanged weapons, Diomedes of bronze and Glaucus of gold.  Later, when Sarpedon was wounded, he went to assist him, but was stopped by Teucer, wounded and forced to retire from the fray.  Apollo cured Glaucus in time to recover Sarpedon's body, though he was unable to stop the Greeks stripping the corpse of its arms.  Glaucus was killed during the fight for the body of Patroclus by Ajax and on Apollo's order his body was carried back to Lycia by the winds.

(3) He was the son of Sisyphus and succeeded his father to the throne of Ephyra, which later became Corinth.  Glaucus took part in the funeral games of Pelias but was beaten in the four-horse chariot race by Iolaus; after this his mares, maddened either by the water of a magic well or as a result of Aphrodite's anger, ate him alive (in order to make his mares run faster Glaucus had refused to let them breed, and so offended the goddess).  In another legend, this Glaucus drank from a fountain which conferred immortality.  No one would believe that he had become immortal, however, so he threw himself into the sea, where he became a sea-god and every sailor who cast a gaze upon him was assured an early death.

(4) He was a sea-deity.  Glaucus was a fisherman standing on the shore when he noticed if he laid his catch upon a certain herb-covered meadow, the fish miraculously were restored to life and jumped back into the sea. Curious, he tasted the herb himself and was seized by an irresistible urge to dive into the waters where the sea goddesses cleansed him of his remaining traces of mortality.  With that, he assumed a new form, his shoulders grew broader and his legs became a fish’s tail, his cheeks developed a thick beard (tinted green like the patina of bronze) and he became a part of the oceanic pantheon.  He also received the gift of prophecy to become a protector of sailors, often giving oracles and wisdom drawn from the sea.

Glaucus et Scylla (1726), oil on canvas by Jacques Dumont le Romain (1704-1781), (Musée des Beaux-Arts de Troyes). 

(5) Virgil made him the father of the Cumaean Sibyl and he appeared to Menelaus when the latter was returning from Troy; in some traditions he is said to have built the Argo and to have accompanied the ship on its voyage.  Glaucus courted Scylla unsuccessfully, and also tried to win the favours of Ariadne when Theseus abandoned her on Naxos. In that quest he failed but Dionysus included him in his train when the god took her away and made her his wife.

(6) He was the son of Minos and Pasiphae and while still a child he was chasing a mouse when he fell into a jar of honey and drowned.  When Minos finally found his son's corpse, the Curetés told him the child could be restored to life by the man who could best describe the colour of a certain cow among his herds which changed its colour three times a day.  It first became white, then red and finally became black.  Minos asked all the cleverest men in Crete to describe the colour of the cow and it was Polyidus who answered that the cow was mulberry-coloured, for the fruit is first white, turns red, and finally goes black when ripe. Minos felt that Polyidus had solved the problem and told him to bring Glaucus back to life, shutting him up with Glaucus' body.  Polyidus was at his wits' end, until he saw a snake make its way into the room and go over towards the body. He killed the serpent but soon a second came in and, seeing the first lying dead, went out before returning carrying in its mouth a herb with which it touched its companion.  Immediately, the snake was restored to life so Polyidus rubbed this herb on Glaucus, who revived at once.  Minos, however, was still not satisfied.  Before allowing Polyidus to return to his fatherland he demanded that the soothsayer should teach Glaucus his art.  This Polyidus did, but when he was finally allowed to go, he spat into his pupil's mouth, and Glaucus immediately lost all the knowledge he had just acquired.  In other versions of the legend, it was Asclepius, not Polyidus, who brought Glaucus back to life.

Tuesday, March 4, 2025

Decapitate

Decapitate (pronounced dih-kap-i-teyt)

(1) To cut off the head; to behead.

(2) Figuratively, to oust or destroy the leadership or ruling body of a government, military formation, criminal organization etc.

1605–1615: From the fourteenth century French décapiter, from the Late Latin dēcapitātus, past participle of dēcapitāre, the construct being dē- + capit- (stem of caput (head), genitive capitis), from the Proto-Italic kaput, from the Proto-Indo-European káput- (head) + -ātus.  The Latin prefix dē- (off) was from the preposition dē (of, from); the Old English æf- was a similar prefix.  The Latin suffix -ātus was from the Proto-Italic -ātos, from the primitive Indo-European -ehtos.  It’s regarded as a "pseudo-participle" and perhaps related to –tus, though similar formations in other Indo-European languages indicate it was distinct from it already in early Indo-European times.  It was cognate with the Proto-Slavic –atъ and the Proto-Germanic -ōdaz (the English form being -ed (having)).  The feminine form was –āta, the neuter –ātum and it was used to form adjectives from nouns indicating the possession of a thing or a quality.  The English suffix -ate was a word-forming element used in forming nouns from Latin words ending in -ātus, -āta, & -ātum (such as estate, primate & senate).  Those that came to English via French often began with -at, but an -e was added in the fifteenth century or later to indicate the long vowel.  It can also mark adjectives formed from Latin perfect passive participle suffixes of first conjugation verbs -ātus, -āta, & -ātum (such as desolate, moderate & separate).  Again, often they were adopted in Middle English with an –at suffix, the -e appended after circa 1400; a doublet of –ee.  Decapitate, decapitated & decapitating are verbs, decapitation & decapitator are nouns; the common noun plural is decapitations.

Lindsay Lohan gardening with a lopper in her gloved hands, decapitation a less demanding path to destruction than deracination, New York City, May, 2015.  She appears to be relishing the task.

As a military strategy, the idea of decapitation is as old as warfare and is based on the adage “cut the head off the snake”.  The technique of decapitation is to identify the leadership (command and control) of whatever structure or formation is hostile and focus available resources on that target.  Once the leadership has been eliminated, the effectiveness of the rest of the structure should be reduced and the idea is applied also in cyber warfare although in that field, target identification can be more difficult.  The military’s decapitation strategy is used by many, including law enforcement bodies, and can to some extent be applied in just about any form of interaction which involves conflicting interests.  The common English synonym is behead and that word may seem strange because it means “to take off the head” whereas the English word bejewel means “to put on the jewels”.  It’s because of the strange and shifting prefix "be-".  Behead was from the Middle English beheden, bihefden & biheveden, from the Old English behēafdian (to behead).  The prefix be- however evolved from its use in Old English.  In modern use it’s from the Middle English be- & bi-, from the Old English be- (off, away), from the Proto-Germanic bi- (be-), from the Proto-Germanic bi (near, by), the ultimate root being the primitive Indo-European hepi (at, near) and cognate with the be- in the Saterland Frisian, the West Frisian, the Dutch, the German & Low German and the Swedish.  When the ancestors of behead were formed, the prefix be- was appended to create the sense of “off; away” but over the centuries it’s also invested the meanings “around; about” (eg bestir), “about, regarding, concerning” (eg bemoan), “on, upon, at, to, in contact with something” (eg behold), “as an intensifier” (eg besotted), “forming verbs derived from nouns or adjectives, usually with the sense of "to make, become, or cause to be"” (eg befriend) & “adorned with something” (eg bejewel).

A less common synonym is decollate, from the Latin decollare (to behead) and there’s also the curious adjective decapitable which (literally “able or fit to be decapitated”) presumably describes anyone whose head has not yet been cut off, though not necessarily someone alive: some corpses during the French Revolution were carted off to be guillotined, the symbolism of the seemingly superfluous act said to have been greeted by the mob "with a cheer".  Just as pleasing though less bloody were the Citroën cabriolets crafted between 1958-1974 by French coachbuilder Henri Chapron (1886-1978).

1971 Citroën DS21 Décapotable Usine with non-standard interior including bespoke headrests in the style used on some Jensen Interceptors.

Produced between 1955-1975, the sleek Citroën DS must have seemed something from science fiction to those accustomed to what was plying the roads outside but although it soon came to be regarded as something quintessentially French, the DS was actually designed by an Italian.  In this it was similar to French fries (invented in Belgium) and Nicolas Sarközy (b 1955; President of France 2007-2012), who first appeared on the planet the same year as the shapely DS and he was actually from here and there.  It was offered as the DS and the lower priced ID, the names a play on words, DS in French pronounced déesse (goddess) and ID idée (idea).  The goddess nickname caught on though idea never did.

Citroën Cabriolet d'Usine production count, 1960-1971.

Henri Chapron had attended the Paris Auto Salon when the DS made its debut and while Citroën had planned to offer a cabriolet, little had been done beyond some conceptual drawings and development resources were instead devoted to higher-volume variants, the ID (a less powerful DS with simplified mechanicals and less elaborate interior appointments) which would be released in 1957 and the Break (a station wagon marketed variously as the Safari, Break, Familiale or Wagon), announced the next year.  Chapron claims it took him only a glance at the DS on display for him instantly to visualise the form his cabriolet would take but creating one proved difficult because such was the demand Citroën declined to supply a partially complete platform, compelling the coach-builder to secure a complete car from a dealer willing (on an undisclosed basis) to “bump” his name up the waiting list while he worked on the blueprints.  It wasn’t until 1958 Carrosserie Chapron presented their first DS cabriolet, dubbed La Croisette, named after the emblematic coastal boulevard of Cannes and while initially it wasn’t approved by the factory (compelling Chapron to continue buying complete cars from dealers), it was obvious to Citroën’s engineers that they’d been presented with a shortcut to production.  Accordingly, Chapron designed a DS cabriolet suited to series production (as opposed to his bespoke creations) and that meant using the longer wheelbase platform of the Break, chosen because it was structurally enhanced to cope with the loads station wagons carry.  Beginning in 1960, these (in ID & DS versions) were the approved Cabriolets d'Usine, distributed until 1971 through Citroën’s dealer network, complete with a factory warranty.

1964 Citroën DW19 Décapotable Usine.  (For statistical purposes the DWs are included in the DS production count.)

The DS and ID are well documented in the model's history but there was also the more obscure DW, built at Citroën's UK manufacturing plant in the Berkshire town Slough which sits in the Thames Valley, some 20 miles west of London.  The facility was opened in February 1926 as part of the Slough Trading Estate (opened just after World War I (1914-1918)) which was an early example of an industrial park, the place having the advantage of the required infrastructure, constructed by the government for wartime production and maintenance activities.  Citroën was one of the first companies to establish an operation on the site, overseas assembly prompted by the UK government's imposition of tariffs (33.3% on imported vehicles, excluding commercial vehicles) and the move had the added advantage of the right-hand-drive (RHD) cars being able to be exported throughout the British Empire under the “Commonwealth Preference” arrangements, a low-tariff scheme, elements of which would endure as a final relic of the chimera of imperial free trade until 1973 when the UK joined the EEC (European Economic Community).  Unlike similar operations, which in decades to come would appear world-wide, the Slough Citroëns were not assembled from CKD (completely knocked down) kits which needed only local labor to bolt them together but used a mix of imported parts and locally produced components.  The import tariff was avoided if the “local content” (labor and domestically produced parts (although those sourced from elsewhere in the empire could qualify)) reached a certain threshold (measured by the total P&L (parts & labor) value in local currency); it was an approach many governments would follow and it remains popular today as a means of encouraging (and protecting) local industries and creating employment.
People able to find jobs in places like Slough would have been pleased but for those whose background meant they were less concerned with something as tiresome as paid-employment, the noise and dirt of factories seemed just a scar upon the “green and pleasant land” of William Blake (1757–1827).  In his poem Slough (1937), Sir John Betjeman (1906–1984; Poet Laureate 1972-1984), perhaps recalling Stanley Baldwin's (1867–1947; UK prime-minister 1923-1924, 1924-1929 & 1935-1937) “The bomber will always get through” speech (1932) welcomed the thought, writing: “Come friendly bombs and fall on Slough!  It isn’t fit for humans now”.  Within half a decade, the Luftwaffe would grant his wish.

1964 Citroën DW19 Décapotable Usine.

During World War II (1939-1945), the Slough plant was requisitioned for military use and some 23,000 CMP (Canadian Military Pattern) trucks were built, civilian production resuming in 1946.  After 1955, Slough built both the ID and DS, the latter including the traditionally English leather trim and a wooden dashboard, a touch which some critics claimed was jarring among the otherwise modernist ambiance but the appeal was real because some French distributors imported the Slough dashboard parts for owners who liked the look.  The UK-built cars also used 12 volt Lucas electrics until 1963 and it was in that year the unique DW model was slotted in between the ID and DS.  Available only with a manual transmission and a simplified version of the timber veneer, the DW was configured with the ID's foot-operated clutch but used the more powerful DS engine, power steering and power brakes.  When exported, the DW was called DS19M and the "DW" label was applied simply because it was Citroën's internal code to distinguish (RHD) models built in the UK from the standard left-hand-drive (LHD) models produced in France.  Citroën assembly in Slough ended in February 1965 and although the company initially retained the plant as a marketing, service & distribution centre, in 1974 these operations were moved to other premises and the buildings were taken over by Mars Confectionery.  Today, no trace remains of the Citroën works in Slough.

1963 Citroën Le Dandy & 1964 Citroën Palm Beach by Carrosserie Chapron.

Citroën DS by Carrosserie Chapron production count 1958-1974

Demand was higher at a lower price-point, as Citroën's 1325 cabriolets indicate but Carrosserie Chapron until 1974 maintained output of his more exclusive and expensive lines although by the late 1960s, output, never prolific, had slowed to a trickle.  Chapron’s originals varied in detail and the most distinguishing difference between the flavors was in the rear coachwork, the more intricate being those with the "squared-off" (sometimes called "finned" or "fin-tailed") look, a trick Mercedes-Benz had in 1957 adopted to modernize the 300d (W189, 1957-1963, the so called "Adenauer Mercedes", named after Konrad Adenauer (1876–1967; chancellor of the FRG (Federal Republic of Germany, the old West Germany) 1949-1963) who used several of the W186 (300, 300b, 300c, 1951-1957) & 300s models as his official state cars).  Almost all Chapron's customized DS models were built to special order under the model names La Croisette, Le Paris, Le Caddy, Le Dandy, Concorde, Palm Beach, Le Léman, Majesty & Lorraine; all together, 287 of these were delivered and reputedly, no two were exactly alike.

Citroën Concorde coupés by Chapron: 1962 DS 19 (left) and 1965 DS 21 (right).  The DS 21 is one of six second series cars, distinguished by their “squared-off” rear wing treatment and includes almost all the luxury options Chapron had on their list including electric windows, leather trim, the Jaeger instrument cluster, a Radiomatic FM radio with automatic Hirschmann antenna, the Robergel wire wheel covers and the Marchal auxiliary headlights.

Alongside the higher-volume Cabriolets d'Usine, Carrosserie Chapron continued to produce much more expensive décapotables (the Le Caddy and Palm Beach cabriolets) as well as limousines (the Majesty) and coupés, the most numerous of the latter being Le Dandy, some 50 of which were completed between 1960-1968.  More exclusive still was another variation of the coupé coachwork, the Concorde with a more spacious cabin notably for the greater headroom it afforded the rear passengers.  Only 38 were built over five years and at the time they cost as much as the most expensive Cadillac 75 Limousine.

Bossaert's Citroën DS19-based GT 19 (1959-1964); the Marchal auxiliary headlights a later addition (top).

Others also built DS coupés & convertibles.  Between 1959-1964 Belgium-born Hector Bossaert produced more than a dozen DS coupés and what distinguished his was a platform shortened by 470 mm (18½ inches) and the use of a notchback roof-line.  Dubbed the Bossaert GT 19, the frontal styling was unchanged although curiously, the Citroën chevrons on the rear pillars were rotated by 90°; apart from the GT 19 Bossaert script on the boot lid (trunk lid), they are the vehicle’s only external identification.  Opinion remains divided about the aesthetics of the short wheelbase (SWB) DSs.  While it’s conceded the Chapron coupés & cabriolets do, in terms of design theory, look “unnaturally” elongated, the lines somehow suit the machines and the word most often used is “elegant” whereas the SWB cars do seem stubby and obviously truncated although, had the originals never existed, perhaps the SWB would look more "natural".  The consensus seems to be the GT 19 was the best implementation of the SWB idea, helped also by it being 70 mm (2¾ inches) lower than the donor DS and perhaps that would be expected given the design was by the Italian Pietro Frua (1913-1983).  Bossaert also increased the power.  Although the hydro-pneumatic suspension and slippery aerodynamics made the DS a fine high-speed cruiser, the 1.9 litre (117 cubic inch) four cylinder engine was ancient and inclined to be agricultural if pushed; acceleration was not sparkling.  Bossaert thus offered “tuning packages” which included the usual methods: bigger carburetors & valves, a more aggressive camshaft profile and a higher compression ratio, all of which transformed the performance from “mediocre” to “slightly above average”.

The one-off Bossaert GT 19 convertible (left) and the one-off 1966 Citroën DS21-based Bossaert cabriolet (right).

Demand was limited by the price; a GT 19 cost more than double that of a DS and the conversion alone cost more than a Jaguar, so one really had to be prepared to pay for the exclusivity.  Additionally, when the Citroën management discovered someone in a garage was “hotting-up” their engines, it was made clear that would invalidate any warranty.  Most sources say only 13 were built but there were also two convertibles, one based on the GT 19 (though fitted with faired-in headlights) and the other quite different, owing more to the Chapron Caddy; both remained one-offs.  Two of the GT 19 coupés and the later convertible survive.

Right-side clignotant (left) on 1974 Citroën DS23 Pallas (right).

On the DS & ID saloons, the clignotants (turn indicators; flashers) were mounted in a housing which was styled to appear as a continuation of the roof-gutter; it was touches like that which were a hint the lines of the DS were from the drawing board of an Italian, Flaminio Bertoni (1903–1964) who, before working in industrial design in pre-war Italy, had trained as a sculptor.  Citroën seems never to have claimed the placement was a safety feature and critics of automotive styling have concluded the flourish was added as part of the avant-garde vibe.  However, the way the location enhanced their visibility attracted the interest of those advocating things needed to be done to make automobiles safer; while there were innovations in “passive safety” (seat-belts, crumple zones etc), there was also the field of “active safety” (crash avoidance) and that included visibility.  At speed, reducing a driver’s reaction time by a fraction of a second can be the difference between life and death and researchers concluded having a “third brake light” at eye level did exactly that.  So compelling was the case that it was under the administration of Ronald Reagan (1911-2004; US president 1981-1989 and hardly friendly to new regulations) that in 1986 the US mandated the CHMSL (centre high mount stop lamp) but because the acronym lacked an effortless pronunciation the legislated term never caught on and the devices are known variously as “centre brake light”, “eye level brake light”, “third brake light”, “high-level brake light” & “safety brake light”.  Unintentionally, Citroën may have started something though it took thirty years to realize the implications.

Rudimentary seat-belts first appeared in production cars during the 1950s but the manufacturers must have thought the public indifferent because their few gestures were tentative, such as Ford’s “Lifeguard Design” package, a bundle of extra-cost safety features offered on the 1956 models (which, coincidently, debuted within months of the DS) and which included:

(1) Padded dashboards (to reduce head injuries).

(2) Recessed steering wheel hub (to minimize chest injuries).

(3) Seat belts (front lap belts only).

(4) Stronger door latches (preventing doors flying open in a crash).

(5) Shatter-resistant rear-view mirror (reducing injuries caused by broken glass).

The standard features included (1) the Safety-Swivel Rear View Mirror, (2) the Deep-Center Steering Wheel with recessed post and bend-away spokes and (3) Double-Grip Door Latches with interlocking striker plate overlaps.  Optional at additional cost were (4) Seat Belts (single kit, front or rear, color-keyed, nylon-rayon with quick one-handed adjust/release aluminium buckle) (US$5).  There were also "bundles", always popular in Detroit: Safety Package A consisted of a Padded Instrument Panel & Padded Sun Visors (US$18) while Safety Package B added to that Front-Seat Lap Seat Belts (US$27).  On the 1956 Thunderbird, which used a significantly different interior design, the options were (1) the Lifeguard Padded Instrument Panel (US$22.65), (2) Lifeguard Padded Sun Visors (US$9) and (3) Lifeguard Seat Belts (US$14).  Years later, internal documents would be discovered which revealed conflict within the corporation, the marketing department opposed to any mention of "safety features" because that reminded potential customers of car crashes; it would have preferred customers be reminded of new colors, higher power, sleek new lines and such.  So, little was done to promote the “Lifeguard Design”, public demand was subdued and soon the option was quietly deleted from the list.

The rising death-toll and complaints from the insurance industry however meant the issue of automotive safety re-surfaced in the 1960s, and the publication by lawyer Ralph Nader (b 1934) of the book Unsafe at Any Speed (1965) played a part in triggering what proved to be decades of legislation which not even the efforts and money of Detroit's lobbyists could stop, although some delays in implementation were achieved and there was the odd victory (such as the survival of the convertible and, ironically, that was a matter about which Detroit was at the time mostly indifferent).  One minor and temporary victory was the matter of the CHMSL, kicked down the road until 1986.  The executives in Detroit were (and remain) "slippery slope" (or "thin end of the wedge") theorists in that they thought if they agreed to some innocuous suggestion from government, that would encourage edicts both more onerous and expensive to implement.  History proved them correct in that but the intriguing thing was that more than a decade earlier, the industry had gone beyond the CHMSL and of its own volition offered DHMSLs (dual high mount stop lamps), one division of General Motors (GM) even making the fittings standard equipment on one model.

1970 Ford Thunderbird brochure (left) and 1972 Oldsmobile Toronado (right).

In 1969 Ford added “High-Level Taillamps, eye level warning to following drivers” to the option list for the 1970 Thunderbird.  What that described was two brake lights fitted on either side of the rear-window and, the car being an update of a model introduced for 1967, the devices were “bolt-ons” rather than being integrated into the structure.  As with the “Lifeguard Design” of 1956, demand was low, customers more prepared to pay for bigger engines and “dress up” options than safety features.  GM’s Oldsmobile solved the problem of low demand by making the DHMSLs standard equipment on the Toronado, their big PLC (personal luxury coupe).  The Toronado being a new body, the opportunity was taken to integrate the lamps into the structure and they sat below the rear window.

1987 Mercedes-Benz 560 SL (left), 1989 Mercedes-Benz 560 SL (centre) and 2001 Mercedes-Benz SL 600 (right).

When in 1971 the Mercedes-Benz 350 SL (R107, 1971-1989) was introduced, it occurred to no one it would still be in production in 1989, the unplanned longevity the product of an uncertainty about whether the US government would outlaw convertibles.  The by then 15 year old roadster thus had to have a CHMSL added when the legislation came into effect and it’s suspected the project was handed to the same team responsible for making the company’s headlights comply with US law.  What they did was “bolt on” to the trunk (boot) lid a lamp which seemed to suggest the design brief had been: “make it stick out like a sore thumb”.  If so, they succeeded and while the revised model (1988-1989) used a smaller unit, it was little more than a slightly smaller digit; frankly, Ford did a better job with the 1970 T-bird although, in fairness, the Germans didn’t have a fixed rear window with which to work.  When the R129 roadster (1989-2001) was developed, the opportunity was taken (à la the Oldsmobile Toronado) to integrate a CHMSL into the lid.

1989 Porsche 911 (930) Turbo Cabriolet (left) and 2004 Porsche 911 (996) Turbo Cabriolet.

In 1986, the Porsche 911 had been around longer even than the Mercedes-Benz R107.  First sold in 1964 and updated for 1974 with (US mandated) big bumpers, in 1986 it became another example of a “bolt on” solution for the CHMSL rule but unlike the one used on the R107, on the 911 there’s a charm to the lamp sitting atop a stalk, like that of some crustaceans, molluscs, insects and stalk-eyed imaginings from SF (science fiction).  All the “bolt-ons” existed because while there is nothing difficult about the engineering of a CHMSL, many would be surprised to learn just how expensive it would have been for a manufacturer to integrate such a thing into an existing structure; a prototype or mock-up would be quick and cheap but translating that into series production would have involved a number of steps and the costs would have been considerable.  That’s why there were so many “bolt-on” CHMSLs in the late 1980s.  Interestingly, when the next 911 (964, 1989-1994) was released, on the coupés the CHMSL was re-positioned at the top of the rear window while the cabriolets retained the stalk.  The factory persevered with this approach for a while and it was only later the unit became integrated into the rear bodywork (with many variations).  Some still prefer the look of the stalk.

For manufacturers and drivers alike, from the mid-1980s onward, CHMSLs became omnipresent yet despite their conspicuous visibility, were soon so unexceptional as to be in a sense unnoticed; they became just part of the orthodoxy of design language.  Researchers however remained interested in the brake light and as early as the 1970s some were advocating the introduction of front brake lights (FBL), obviously a concept difficult to test in the wild because such things were almost universally unlawful.  In test labs though their potential effectiveness could be studied by using simulators which compared a driver’s reaction time to the sight of a braking vehicle with and without FBLs and, unsurprisingly, where a warning light was present, reactions were faster and that can be of consequence in situations of potential impact at speed, a vehicle in a second travelling a considerable distance.  The sort of statistical modelling applied to try to quantify the potential benefits FBLs might deliver can be criticized but there has been more than one research project and while the details have differed, all suggested there would be a reduction in vehicle crashes and logically, that should translate into less damage and fewer deaths & injuries.  Paradoxically, were that to be realized, one direct effect would be a reduction in GDP (gross domestic product) because the economic activity generated (in industries such as medicine, car repair, funeral homes etc) wouldn’t happen although some or all of that should be off-set by the ongoing workforce participation by those not killed or hospitalized.
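The point about a vehicle covering a considerable distance in a second can be illustrated with some simple arithmetic (the figures below are illustrative only, not drawn from any of the studies mentioned; the 0.2 second saving is an assumed value):

```python
# Illustrative sketch: metres travelled while a driver is still reacting,
# and how much road a modest (assumed) 0.2 s reduction in reaction time saves.
def distance_during_reaction(speed_kmh: float, reaction_s: float) -> float:
    """Distance in metres covered at a constant speed during the reaction time."""
    return speed_kmh / 3.6 * reaction_s  # km/h -> m/s, then multiply by seconds

for speed in (60, 100, 130):
    full = distance_during_reaction(speed, 1.0)
    saved = full - distance_during_reaction(speed, 0.8)
    print(f"{speed} km/h: {full:.1f} m covered per second; 0.2 s quicker saves {saved:.1f} m")
```

At motorway speeds the car travels well over 30 metres in a single second, so even a fraction-of-a-second improvement in noticing a braking vehicle translates into several car-lengths of extra margin.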

What FBLs would do is make it easier for drivers to detect another vehicle’s braking from front and side angles, making them more likely to react if a potential drama is thus anticipated.  Obviously red lights at the front would be a bad idea (although some service vehicles are so equipped) and clear or amber lenses could be ambiguous, so the usual suggestion is green, previously used only by a small number of medical personnel.  There were concerns about the use of green because it was speculated there might (especially in conditions of low visibility) be potential for confusion with the green (“go”) of traffic signals used at intersections but to assess the veracity of that may require testing in real-world conditions.

1968 Citroën DS20 Break (left) and 1958 DeSoto Firesweep Explorer Station Wagon (right).

In 1958, a station wagon version of the DS & ID was released; because of historic regional variations in terminology, in different places it was marketed as the Break (France), Safari or Estate (UK), Station Wagon (North America) and Safari or Station Wagon (Australia) but between markets there were only detail differences.  Because of the top-hinged tailgate, to mount the clignotants in the high positions used on the saloons would have been difficult so they were integrated into a vertical stack of three in a conventional location.  In style the lens and the modest “fins” in which they sat recalled the arrangement DeSoto in the US had made their signature since late 1955 although it’s unlikely the US design had much influence on what was for Citroën a pragmatic solution for a vehicle then regarded as having most appeal as a Commerciale.  The French certainly weren’t drawn to fins as macropterous as some Detroit had encouraged theirs to grow to by 1958.

Finettes: Bossaert's tail lights from the parts bin of Fiat (left) and BMC (right). 

Convertibles of course lack a roof so the clignotants couldn’t continue in their eye-catching place with topless coachwork and their placement on the DS & ID varied in accordance with how the rear coachwork was handled.  Bossaert took a conventional approach and emulated a look familiar on many European roadsters & cabriolets.  For the GT 19 the taillights (made by the Italian lamp-maker Carello) came from the Fiat Pininfarina Coupé & Cabriolet (1959-1966), a vertical style which in the era appeared on a number of cars including Ferraris, Peugeots and Rovers.  For his other take on a convertible DS, Bossaert reached over the English Channel and from the BMC (British Motor Corporation) parts bin selected the units used by the Wolseley Hornet & Riley Elf (luxury versions of the Mini (1959-2000), built between 1961-1969 which, as well as the expected leather & burl walnut veneer trim, had an extended tail with distinctly brachypterous “finettes”).  The success of the Hornet & Elf in class-conscious England encouraged BMC in 1964 to go even more up-market and have their in-house coach-builder Vanden Plas produce a version of the Austin 1100 (ADO16, 1963-1974) and all the ADO16s until 1967 shared their taillights with the Hornet and Elf.  Although visually similar to those used between 1962-1970 on MG’s MGB (1962-1980) & MGC (1967-1969), they are different, the Hornet/Elf/ADO16 units being the Lucas L549 while the MGs used the L550.  Between 1961-1966, the MG Midget (1961-1980) used the L549 and between 1966-1970 the L550.

1970 Chapron Citroën DS20 Décapotable Usine (left), 1962 Chapron Citroën DS19 Concorde (with clignotants rouge, centre) and 1965 Chapron Citroën DS21 Le Caddy (with clignotants ambre, right).

Chapron’s approach to clignotant placement varied with the rear coachwork.  On the volume models officially supported by the factory, two small lenses were fitted within chrome housings, mounted on opposite sides at the base of the soft-top.  For his more exclusive Le Caddy & Concorde with squared-off rear quarters (à la the “modernizing” look Mercedes-Benz applied to the 300 “Adenauer” (W186, 1951-1957) to create the 300d (1957-1962)), Chapron re-purposed one of the existing taillights, using a still-lawful red lens on many although later models switched to amber.

1973 Citroën DS23 Pallas "landaulet" (in the style of that once used by the French president, left), 2010 Maybach 62 S Landaulet (top right), John Paul II (1920–2005; pope 1978-2005) in Papal 1965 Mercedes-Benz 300 SEL Landaulet (bottom left) and Pope Paul VI (1897-1978; pope 1963-1978) in 1966 Mercedes-Benz 600 Landaulet (bottom right).

From the moment it first was shown in 1955 the DS has intrigued and it’s the various convertibles which attract most attention.  To this day, the things remain a symbol which quintessentially is French and at least two have been converted into “full-roof” landaulets for tourists to be escorted around Paris.  The landaulet (a car with a removable roof which retains the side window frames) was a fixture on coach-building lists during the 1920s & 1930s but became rare in the post-war years; of late the only ones produced in any volume were the 59 Mercedes-Benz 600s (1963-1981) which came in “short” and “long” (though not full) roof versions although there was a revival, 22 Maybach 62 S Landaulets built between 2011-2012, one of which was even right-hand-drive (RHD).  Considering the price and specialized nature of the variant, that there were 22 made makes the Landaulet more a success than the unfortunate "standard" Maybachs which managed only some 3300 between 2002-2013.  The Papal Mercedes-Benz 300 SEL (W109) Landaulet was a gift from the factory but it was for years little used because the next year a very special 600 (W100) Pullman Landaulet was provided and this much more spacious limousine was preferred.  The papal 600 was unique in that it was one of the “high roof” state versions and fitted with longer rear doors, a “throne” in the rear compartment which, mounted on an elevated floor, could be raised or lowered as His Holiness percolated through crowded streets.  It was the latest in a long line of limousines and landaulets the factory provided for the Holy See and remains one of the best known; returned to the factory in 1985, it’s now on permanent display at the Mercedes-Benz museum in Stuttgart.  Use of the 600 became infrequent after the attempted assassination of John Paul II (1981).
As a stopgap, the 300 SEL quickly was armor-plated and used occasionally until the arrival of “Popemobiles” in which the pontiff sat in an elevated compartment with bullet-proof glass sides.  Despite that, Mercedes-Benz have since delivered two S-Class (a V126 & V140) landaulets to the Vatican.  Francis (b 1936; pope since 2013) has no taste for limousines or much else which is extravagant and prefers small, basic cars although to ensure security the bullet-proof Popemobiles remain essential and in 2024 Mercedes-Benz presented the Holy See with a fully-electric model, based on the new W465 G-Class.  The Vatican is planning to have transitioned to a zero-emission vehicle fleet by 2030.   

1974 Citroën DS23 Pallas: the one-off Australian “semi-phaeton”.

In Australia, someone created something really unique: a DS “semi-phaeton”.  While the definition became looser until eventually it became merely a model name which meant nothing beyond some implication of exclusivity & high price, the term “phaeton” (borrowed from the age of the horse-drawn buggy) referred to a vehicle with no top or side windows.  By the late 1930s, when last they were on the books as regular production models, the “phaetons” had gained folding tops and often removable side windows but they’d also lost market appeal and except for the odd few built for ceremonial purposes (the most memorable the three Chrysler Imperial Parade Phaetons built in 1952 and still occasionally used), there was no post-war revival.  The Australian creation was based on a 1974 DS23 Pallas and had no soft-top or rear-side windows but the front-side units remained operative.  The rear doors were changed to hinge from the rear (the so-called “suicide doors”; the external handles removed from all four), an indication the engineering was more intricate than many of the “four-door convertibles” made over the years by decapitating a sedan; the sales blurb did note the platform was “strengthened”, something essential when a structural component like a roof is removed.

The Citroën SM, a few of which were decapitated 

1972 Citroën SM (left) & 1971 Citroën SM Mylord by Carrosserie Chapron (right).  The wheels are the Michelin RR (roues en résine or résine renforcée (reinforced resin)) composites, cast using a patented technology invented by NASA for the original moon buggy.  The Michelin wheel was one-piece and barely a third the weight of the equivalent steel wheel but the idea never caught on, doubts existing about their long-term durability and susceptibility to extreme heat (the SM had inboard brakes).  

Upon release in 1971, immediately the Citroën SM was recognized as among the planet's most intricate and intriguing cars.  A descendant of the DS which in 1955 had been even more of a sensation, it took Citroën not only up-market but into a niche the SM had created, nothing quite like it previously existing, the combination of a large (in European terms), front-wheel-drive (FWD) luxury coupé with hydro-pneumatic suspension, self-centring (Vari-Power) steering, high-pressure braking and a four-cam V6 engine, a mix unique in the world.  The engine had been developed by Maserati, one of Citroën’s recent acquisitions and the name acknowledged the Italian debt, SM standing for Système Maserati.  Although, given the size and weight of the SM, the V6 was of modest displacement to attract lower taxes (initially 2.7 litres (163 cubic inch)) and power was limited (181 HP (133 kW)) compared to the competition, such was the slipperiness of the body's aerodynamics that in terms of top speed, it was at least a match for most.

1973 Citroën SM with reproduction RR wheels in aluminium.

However, lacking the high-performance pedigree enjoyed by some of that competition, a rallying campaign had been planned as a promotional tool.  Although obviously unsuited to circuit racing, the big, heavy SM didn’t immediately commend itself as a rally car; early tests indicated some potential but there was a need radically to reduce weight.  One obvious candidate was the steel wheels but attempts to use lightweight aluminum units proved abortive, cracking encountered when tested under rally conditions.  Michelin immediately offered to develop glass-fibre reinforced resin wheels, the company familiar with the material which had proved durable when tested under extreme loads.  Called the Michelin RR (roues résine (resin wheels)), the new wheels were molded in one piece, made entirely of resin except for some embedded steel reinforcements at the stud holes to distribute the stresses.  At around 9.4 lb (4¼ kg) apiece, they were less than half the weight of a steel wheel and in testing proved as strong and reliable as Michelin had promised.  Thus satisfied, Citroën went rallying.

Citroën SM, Morocco Rally, 1971.

The improbable rally car proved a success, winning first time out in the 1971 Morocco Rally and further success followed.  Strangely, the 1970s proved an era of heavy cruisers doing well in the sport, Mercedes-Benz winning long-distance events with their 450 SLC 5.0 which was both the first V8 and the first car with an automatic transmission to win a European rally.  Stranger still, Ford in Australia re-purposed one of the Falcon GTHO Phase IV race cars which had become redundant when the programme was cancelled in 1972; the thing proved surprisingly competitive during the brief periods it was mobile although the lack of suitable tyres meant the sidewalls would repeatedly fail and the car was written off after a serious crash.  The SM, GTHO & SLC proved a quixotic tilt and the sport went in a different direction.  On the SM however, the resin wheels had proved their durability, not one failing during the whole campaign and, encouraged by customer requests, Citroën in 1972 offered the wheels as a factory option although only in Europe; apparently the thought of asking the US federal safety regulators to approve plastic wheels (as they’d already been dubbed by the motoring press) seemed to the French so absurd they never bothered to submit an application.

1974 prototype Citroën SM with 4.0 V8.

Ambitious as it was, circumstances combined in a curious way that might have made the SM more remarkable still.  By 1973, sales of the SM, after an encouraging start, had for two years been in decline, a growing reputation for unreliability tarnishing its appeal, but the first oil shock dealt what appeared to be a fatal blow; from almost 5000 sold in 1971, by 1974 production numbered not even 300.  The market for fast, thirsty cars had shrunk and most of the trans-Atlantic hybrids (combining elegant European coachwork with large, powerful and cheap US V8s), which had for more than a decade done good business as alternatives to the highly strung British and Italian thoroughbreds, had been driven extinct.  Counter-intuitively, Citroën’s solution was to develop an even thirstier V8 SM and that actually made sense because, in an attempt to amortize costs, the SM’s platform had been used as the basis for the new Maserati Quattroporte but, bigger and heavier still, its performance was sub-standard and the theory was a V8 version would transform both and appeal to the US market, then the hope of many struggling European manufacturers.

Recreation of 1974 Citroën SM V8 prototype.

Citroën didn’t have a V8; Maserati did but it was big and heavy, a relic with origins in racing and while its (never wholly tamed) raucous qualities suited the character of the sports cars and saloons Maserati offered in the 1960s, it couldn’t be used in something like the SM.  However, the SM’s V6 was a 90° unit and thus inherently better suited to an eight-cylinder configuration.  In 1974 therefore, a four litre (244 cubic inch) V8 based on the V6 (by then 3.0 litres (181 cubic inch)) was quickly built and installed in an SM which was subjected to the usual battery of tests over a reported 20,000 km (12,000 miles) during which it was said to have performed faultlessly.  Bankruptcy (to which the SM, along with some of the company's other ventures, notably the GZ Wankel programme, contributed) however was the death knell for both the SM and the V8, the prototype car scrapped while the unique engine was removed and stored, later used to create a replica of the 1974 test mule.

Evidence does however suggest a V8 SM would likely have been a failure, just compounding the existing error on an even grander scale.  It’s true that Oldsmobile and Cadillac had offered big FWD coupés with great success since the mid 1960s (the Cadillac at one point fitted with a 500 cubic inch (8.2 litre) V8 rated at what sounds an alarming 400 HP (300 kW)) but they were very different machines to the SM and appealed to a different market.  Probably the first car to explore what demand might have existed for a V8 SM was the hardly successful 1986 Lancia Thema 8·32 which used the Ferrari 308's 2.9 litre (179 cubic inch) V8 in a FWD platform.  Although well-executed within the limitations the configuration imposed, it was about as daft an idea as it sounds although it did hint at what a success a V8 Fiat 130 saloon (1969-1976) & coupé (1971-1977) might have been if sold with a Lancia badge.  Even had the V8 SM been all-wheel-drive (AWD) it would probably still have been a failure but it would now be remembered as a revolution ahead of its time.  As it is, the whole SM story is just another cul-de-sac, albeit one which has become a (mostly) fondly-regarded cult.

State Citroëns by Carrosserie Chapron: 1968 Citroën DS state limousine (left) and 1972 Citroën SM Présidentielle (right).

In the summer of 1971, after years of slowing sales, Citroën announced the end of the décapotable usine and Chapron’s business model suffered, the market for specialized coach-building, in decline since the 1940s, now all but evaporated.  Chapron developed a convertible version of Citroën’s new SM called the Mylord but, very expensive, it was little more successful than the car on which it was based; although engineered to Chapron’s high standard, fewer than ten were built.  Government contracts did for a while seem to offer hope.  Charles De Gaulle (1890–1970; President of France 1958-1969) had been aghast at the notion the state car of France might be bought from Germany or the US (it’s not known which idea he thought most appalling and apparently nobody bothered to suggest buying British) so, at his instigation, Chapron (apparently without great enthusiasm) built a long wheelbase DS Presidential model.

Size matters: Citroën DS Le Presidentielle (left) and LBJ era stretched Lincoln Continental by Lehmann-Peterson of Chicago (right).

Begun in 1965, the project took three years, legend having it that de Gaulle himself stipulated little more than that it be longer than the stretched Lincoln Continentals then used by the White House (John Kennedy (JFK, 1917–1963; US president 1961-1963) was assassinated in the Lincoln Continental X-100, modified by Hess & Eisenhardt) and this was achieved, despite the requirement the turning circle had to be tight enough to enter the Elysée Palace’s courtyard from the Rue du Faubourg Saint-Honoré and then pull up at the steps in a single maneuver.  Although size mattered on the outside, De Gaulle’s sense of “grandeur de la France” didn’t extend to what lay under the hood, Le Presidentielle DS retaining the 2.1 litre (133 cubic inch) 4 cylinder engine but he’d probably have scorned the 7.5 litre (462 cubic inch) V8 by then in Lincolns as typical American vulgarity.  As it was, although delivered to the Élysée in time for the troubles of 1968, Chapron’s DS was barely used by De Gaulle because he disliked the partition separating him from the chauffeur and he preferred either the earlier limousines built in the 1950s by Franay and Chapron (both based on the earlier Citroën Traction Avant 15/6) or a DS landaulet (with full-length folding roof) in which he could stand up and look down on the (hopefully) cheering crowds lining the road.

However, the slinky lines must have been admired because in 1972 Chapron was commissioned to supply two really big four-door convertible Le Presidentielle SMs as the state limousines for Le Général’s successor, Georges Pompidou (1911–1974; President of France 1969-1974).  First used for the 1972 state visit of Elizabeth II (1926-2022; Queen of the UK and other places, 1952-2022), they remained in regular service until the inauguration of Jacques Chirac (1932–2019; President of France 1995-2007) in 1995, seen again on the Champs-Élysées in 2004 during Her Majesty’s three-day state visit marking the centenary of the Entente Cordiale.

1972 Citroën SM Opera by Carrosserie Chapron (left) & 1973 Maserati Quattroporte II (right).  This is the Quattroporte which was slated to receive the V8 tested in the SM.

Despite that, state contracts for the odd limousine, while individually lucrative, were not a model to sustain a coach-building business and a year after the Mylord was first displayed, Chapron inverted his traditional practice and developed from a coupé a four-door SM called the Opera.  On a longer wheelbase, stylistically it was well executed but was heavy and both performance and fuel consumption suffered, the additional bulk also meaning some agility was lost.  Citroën was never much devoted to the project because they had in the works what was essentially their own take on a four-door SM, sold as the Maserati Quattroporte II (the Italian house having earlier been absorbed) but as things transpired in those difficult years, neither proved a success, only eight Operas and a scarcely more impressive thirteen Quattroporte IIs ever built.  The French machine deserved more, the Italian knock-off, probably not.  In 1974, Citroën entered bankruptcy, dragged down in part by the debacle which the ambitious SM had proved to be although there had been other debacles worse still.

That other quintessential symbol of France, Brigitte Bardot (b 1934) in La Déesse with a lit Gitanes.

The combination of a car, a woman with JBF and a cigarette continued to draw photographers even after smoking ceased to be glamorous and became a social crime.  First sold in 1910, Gitanes production in France survived two world wars, the Great Depression and Nazi occupation but the regime of Jacques Chirac (1932–2019; President of France 1995-2007) proved too much and, following the assault on tobacco by Brussels and Paris, in 2005 the factory in Lille was shuttered.  Although Gitanes (and the sister cigarette Gauloise) remain available in France, they are now shipped from Spain and while in most of the Western world fewer now smoke, Gitanes Blondes retain a cult following.  Three years after the last SM left the factory, Henri Chapron died in Paris, his much down-sized company lingering on for some years under the direction of his industrious widow, the bulk of its work now customizing Citroën CXs.  Operations ceased in 1985 but the legacy is much admired and the décapotables remain a favorite of collectors and film-makers searching for something with which to evoke the verisimilitude of 1960s France.

Grave of Monsieur et Madame Arbelot, Cimetière du Père Lachaise, Paris.

In most circumstances, the sight of a husband staring at the decapitated head of his wife which he’s holding aloft before his eyes would be at least confronting and usually an indication he may have committed at least one offence but there is, carved in stone in a Parisian cemetery, one such decapitation which is romantic.  In the Cimetière du Père Lachaise (20th arrondissement, Paris) lie the graves of Fernand (Louis) Arbelot (1880-1942) and his wife Henriette Marie Louise Gicquel (1885-1967), the couple married in the city in August 1919.  It was at her funeral in 1967 that they finally were reunited and the bronze statue of a recumbent Monsieur Arbelot holding in his hands the face of his beloved is a monument to his one wish when dying: to forever gaze upon the face of his wife.  The epitaph on the grave reads: Ils furent émerveillés du beau voyage qui les mena jusqu’au bout de la vie (They were amazed by the beautiful journey that led them to the end of life).

Epitaph on grave of Monsieur et Madame Arbelot, Cimetière du Père Lachaise, Paris.

Established in 1803 by Napoleon Bonaparte (1769–1821; leader of the French Republic 1799-1804 & Emperor of the French 1804-1814 & 1815) and named after the Jesuit priest, Père François de la Chaise (1624–1709), who was confessor to Louis XIV (1638–1715; le Roi Soleil (the Sun King), King of France 1643-1715), at 40 hectares (100 acres), Père Lachaise remains Paris's largest cemetery and contains over a million interments.  Apart from the obvious matter of the many dead, it’s of interest to historians of town planning because as a piece of landscape architecture, it represented a change of approach from the old, over-crowded medieval churchyards in which corpses had for centuries been piled one atop the other.  Although not strictly true, the place has come to be regarded as the first “garden” or “landscape” cemetery and even the French admit there was influence from the eighteenth century country houses of the English aristocracy & landed gentry, the grounds of which were characterized by irregular, winding paths and picturesque gardens with a seemingly (and sometimes literally) random, naturalistic approach to plantings.  It was also one of Napoleon’s more far-sighted decisions because although initially unpopular due to its distance from the city, Paris quickly expanded to “meet” it and the vast space made possible the purchase of individual plots, once a privilege available only to the rich.

Grave of Monsieur et Madame Arbelot, Cimetière du Père Lachaise, Paris.

Attracting each year more than four million tourists, Cimetière du Père Lachaise is one of the world’s more-visited graveyards and it features frequently on Instagram & TikTok, the favoured dead celebrities including the Irish author Oscar Wilde (1854-1900), although his memorial is now less photogenic because the carving of a sleeping winged sphinx had to be placed behind plexiglass to prevent the “theft of certain private parts” and protect it from the lipstick-covered kisses of devotees (left by women as well as men, it’s said).  Other popular fan-graves include those of Polish composer Frédéric Chopin (1810–1849), US singer Jim Morrison (1943-1971), Italian artist of the Paris School Amedeo Modigliani (1884–1920), French singer Édith Piaf (1915–1963), French novelist Marcel Proust (1871–1922), US writer Gertrude Stein (1874–1946) and the French pioneer of sociology Auguste Comte (1798–1857).

Judith and the decapitation of Holofernes

In the Bible, the deuterocanonical books (literally “belonging to the second canon”) are those books and passages traditionally regarded as the canonical texts of the Old Testament, some of which long pre-date Christianity, some composed during the “century of overlap” before the separation between the Christian church and Judaism became institutionalized.  As the Hebrew canon evolved, the seven deuterocanonical books were excluded and on this basis were not included in the Protestant Old Testament, those denominations regarding them as apocrypha, and they’ve been characterized as such ever since.  Canonical or not, the relationship of the texts to the New Testament has long interested biblical scholars, none denying that links exist but there are wide differences in interpretation, some finding (admittedly while giving the definition of "allusion" wide latitude) a continuity of thread, others only fragmentary references and even then, some paraphrasing is dismissed as having merely a literary rather than historical or theological purpose.

Le Retour de Judith à Béthulie (The Return of Judith to Bethulia) (1470) by Sandro Botticelli (circa 1445-1510).

The Book of Judith thus exists in the Roman Catholic and Eastern Orthodox Old Testaments but is assigned (relegated, some of the hard-liners might say) by Protestants to the apocrypha.  It is the tale of Judith (יְהוּדִית in the Hebrew and the feminine of Judah), a legendarily beautiful Jewish widow who uses her charms to lure the Assyrian General Holofernes to his gruesome death (decapitated by her own hand) so her people may be saved.  As a text, the Book of Judith is interesting in that it’s a genuine literary innovation, a lengthy and structured thematic narrative evolving from the one idea, something different from the old episodic tradition of loosely linked stories.  That certainly reflects the influence of Hellenistic literary techniques and the Book of Judith may be thought a precursor of the historical novel: a framework of certain agreed facts upon a known geography on which an emblematic protagonist performs.  The atmosphere of crisis and undercurrent of belligerence lends the work a modern feel while theologically, it’s used to teach the importance of fidelity to the Lord and His commandments, a trust in God and how one must always be combative in defending His word.  It’s not a work of history, something made clear in the first paragraph; this is a parable.

Judit decapitando a Holofernes (Judith Beheading Holofernes) (circa 1600) by Caravaggio (Michelangelo Merisi da Caravaggio, 1571–1610).

The facts of the climactic moment in the decapitation of General Holofernes are not in dispute, Judith at the appropriate moment drawing the general’s own sword, beheading him as he lay recumbent, passed out from too much drink.  Deed done, the assassin dropped the separated head in a leather basket and stole away.  The dramatic tale for centuries has attracted painters and sculptors, the most famous works created during the high Renaissance and Baroque periods and artists have tended to depict either Judith acting alone or in the company of her aged maid, a difference incidental to the murder but of some significance in the interpretation of preceding events.

Judit si presenta a Holofernes (Judith Presenting Herself to Holofernes) (circa 1724) by Antonio Gionima (1697–1732).

All agree the picturesque widow was able to gain access to the tent of Holofernes because of the general’s carnal desires but in the early centuries of Christianity, there’s little hint that Judith resorted to the role of seductress, only that she lured him to temptation, plied him with drink and struck.  The sexualization of the moment came later and scarcely less controversial was the unavoidable juxtaposition of the masculine aggression of the blade-wielding killer with her feminine charms.  Given the premise of the tale and its moral imperative, the combination can hardly be avoided but it was for centuries disturbing to (male) theologians and priests, rarely at ease with bolshie women.  It was during the high Renaissance that artists began to vest Judith with an assertive sexuality (“from Mary to Eve” in the words of one critic), her features becoming blatantly beautiful, the clothing more revealing.  The Judith of the Renaissance and the Baroque appears one more likely to surrender her chastity to the cause where once she would have relied on guile and wine.

Judith (1928) by Franz von Stuck (1863–1928).

It was in the Baroque period that the representations more explicitly made possible the mixing of sex and violence in the minds of viewers, a combination that across media platforms remains today as popular as ever.  For centuries “Judith beheading Holofernes” was one of the set pieces of Western art and there were those who explored the idea with references to David & Goliath (another example of the apparently weak decapitating the strong) or alluding to Salome, showing Judith or her maid carrying off the head in a basket.  The inventiveness proved not merely artistic because, in the wake of the ruptures caused by the emergent Protestant heresies, in the counter-attack by the Counter-Reformation, the parable was re-imagined in commissions issued by the Holy See, Judith’s blade defeating not only Assyrian oppression but all unbelievers, heretical Protestants just the most recently vanquished.  Twentieth century artists too have used Judith as a platform, predictably perhaps sometimes to show her as the nemesis of toxic masculinity, and while some have obviously enjoyed the idea of an almost depraved sexuality, there have been some quite accomplished versions.