
Monday, June 30, 2025

Bunker

Bunker (pronounced buhng-ker)

(1) A large bin or receptacle; a fixed chest or box.

(2) In military use, historically a fortification set mostly below the surface of the ground with overhead protection provided by logs and earth or by concrete and fitted with above-ground embrasures through which guns may be fired.

(3) A fortification set mostly below the surface of the ground and used for a variety of purposes.

(4) In golf, an obstacle, classically a sand trap but sometimes a mound of dirt, constituting a hazard.

(5) In nautical use, to provide fuel for a vessel.

(6) In nautical use, to convey bulk cargo (except grain) from a vessel to an adjacent storehouse.

(7) In golf, to hit a ball into a bunker.

(8) To equip with or as if with bunkers.

(9) In military use, to place personnel or materiel in a bunker or bunkers (sometimes as “bunker down”).

1755–1760: From the Scottish bonkar (box, chest; also “seat” in the sense of “bench”), of obscure origin, but etymologists conclude the use related to furniture hints at a relationship with banker (bench).  Alternatively, it may be from a Scandinavian source such as the Old Swedish bunke (boards used to protect the cargo of a ship).  The meaning “receptacle for coal aboard a ship” was in use by at least 1839 (coal-burning steamships coming into general use in the 1820s).  The use to describe the obstacles on golf courses is documented from 1824 (probably from the extended sense “earthen seat” which dates from 1805) but perhaps surprisingly, the familiar sense from military use (dug-out fortification) seems not to have appeared before World War I (1914-1918) although the structures so described had existed for millennia.  “Bunkermate” was army slang for the individual with whom one shares a bunker while the now obsolete “bunkerman” (plural “bunkermen”) referred to someone (often the man in charge) who worked at an industrial coal storage bunker.  Bunker & bunkerage are nouns, bunkering is a noun & verb, bunkered is a verb and bunkerish, bunkeresque, bunkerless & bunkerlike are adjectives; the noun plural is bunkers.

Just as ships called “coalers” were used to transport coal to and from shore-based “coal stations”, it was “oilers” which took oil to storage tanks or out to sea to refuel ships (a common naval procedure) and these STS (ship-to-ship) transfers were called “bunkering” as the black stuff was pumped, bunker-to-bunker.  That the coal used by steamships was stored on-board in compartments called “coal bunkers” led ultimately to another derived term: “bunker oil”.  When in the late nineteenth century ships began the transition from being fuelled by coal to burning oil, the receptacles of course became “oil bunkers” (among sailors nearly always clipped to “bunker”) and as refining processes evolved, the fuel specifically produced for oceangoing ships came to be called “bunker oil”.

Bunker oil is “dirty stuff”, a highly viscous, heavy fuel oil which is essentially the residue of crude oil refining; it’s that which remains after the more refined and volatile products (gasoline (petrol), kerosene, diesel etc) have been extracted.  Until late in the twentieth century, the orthodox view of economists was its use in big ships was a good thing because it was a product for which industry had little other use and, as essentially a by-product, it was relatively cheap.  It came in three flavours: (1) Bunker A: Light fuel oil (similar to a heavy diesel), (2) Bunker B: An oil of intermediate viscosity used in engines larger than marine diesels but smaller than those used in the big ships and (3) Bunker C: Heavy fuel oil used in container ships and such which use VLD (very large displacement), slow running engines with a huge reciprocating mass.  Because of its composition, Bunker C especially produced much pollution and although much of this happened at sea (unseen by most but with obvious implications), when ships reached harbor to dock, all the smoke and soot became obvious.  Over the years, the worst of the pollution from the burning of bunker oil greatly has been reduced (the work underway even before the Greta Thunberg (b 2003) era), sometimes by the simple expedient of spraying a mist of water through the smoke.

Floor-plans of the upper (Vorbunker) and lower (Führerbunker) levels of the structure now commonly referred to collectively as the Führerbunker.

History’s most infamous bunker remains the Berlin Führerbunker in which Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) spent much of the last few months of his life.  In the architectural sense there were a number of Führerbunkers built, one at each of the semi-permanent Führerhauptquartiere (Führer Headquarters) created for the German military campaigns and several others built where required but it’s the one in Berlin which is remembered as “the Führerbunker”.  Before 1944, when the air raids by the RAF (Royal Air Force) and USAAF (US Army Air Force) intensified, the term Führerbunker seems rarely to have been used other than by the architects and others involved in their construction and it wasn’t a designation like Führerhauptquartiere which the military and other institutions of state shifted between locations (rather as “Air Force One” is attached not to a specific airframe but whatever aircraft in which the US president is travelling).  In subsequent historical writing, the term Führerbunker tends often to be applied to the whole, two-level complex in Berlin and although it was only the lower layer which officially was designated as that, for most purposes the distinction is not significant.  In military documents, after January, 1945 the Führerbunker was referred to as Führerhauptquartiere.

Führerbunker tourist information board, Berlin, Germany.

Only an information board at the intersection of den Ministergärten and Gertrud-Kolmar-Straße, erected by the German Government in 2006 prior to that year's FIFA (Fédération Internationale de Football Association (International Federation of Association Football)) World Cup now marks the place on Berlin's Wilhelmstrasse 77 where once the Führerbunker was located.  The Soviet occupation forces razed the new Reich Chancellery and demolished all the bunker's above-ground structures but the subsequent GDR (Deutsche Demokratische Republik (German Democratic Republic; the old East Germany) 1949-1990) abandoned attempts completely to destroy what lay beneath.  Until after the fall of the Berlin Wall (1961-1989) the site remained unused and neglected, “re-discovered” only during excavations by property developers, the government insisting on the destruction of whatever was uncovered and, sensitive still to the spectre of “Neo-Nazi shrines”, for years the bunker’s location was never divulged, even as unremarkable buildings (an unfortunate aspect of post-unification Berlin) began to appear on the site.  Most of what would have covered the Führerbunker’s footprint is now a supermarket car park.

The first part of the complex to be built was the Vorbunker (upper bunker or forward bunker), an underground facility of reinforced concrete intended only as a temporary air-raid shelter for Hitler and his entourage in the old Reich Chancellery.  Substantially completed during 1936-1937, it was until 1943 listed in documents as the Luftschutzbunker der Reichskanzlei (Reich Chancellery Air-Raid Shelter), the Vorbunker label applied only in 1944 when the lower level (the Führerbunker proper) was appended.  In mid January, 1945, Hitler moved into the Führerbunker and, as the military situation deteriorated, his appearances above ground became less frequent until by late March he rarely saw the sky.  Finally, on 30 April, he committed suicide.

Bunker Busters

Northrop Grumman publicity shot of B-2 Spirit from below, showing the twin bomb-bay doors through which the GBU-57s are released.

Awful as they are, there's an undeniable beauty in the engineering of some weapons and it's unfortunate humankind never collectively has resolved exclusively to devote such ingenuity to stuff other than us blowing up each other.  That’s not a new sentiment, being one philosophers and others have for millennia expressed in various ways although since the advent of nuclear weapons, concerns understandably have become heightened.  Like every form of military technology ever deployed, once the “genie is out of the bottle” the problem is there to be managed and at the dawn of the atomic age, delivering a lecture in 1936, the British chemist and physicist Francis Aston (1877–1945) (who created the mass spectrograph, winning the 1922 Nobel Prize in Chemistry for his use of it to discover and identify the isotopes in many non-radioactive elements and for his enunciation of the whole number rule) observed:

There are those about us who say that such research should be stopped by law, alleging that man's destructive powers are already large enough.  So, no doubt, the more elderly and ape-like of our ancestors objected to the innovation of cooked food and pointed out the great dangers attending the use of the newly discovered agency, fire.  Personally, I think there is no doubt that sub-atomic energy is available all around us and that one day man will release and control its almost infinite power.  We cannot prevent him from doing so and can only hope that he will not use it exclusively in blowing up his next door neighbor.

The use in June 2025 by the USAF (US Air Force) of fourteen of its Boeing GBU-57 (Guided Bomb Unit-57) Massive Ordnance Penetrator (MOP) bombs against underground targets in Iran (twelve on the Fordow Uranium Enrichment Plant and two on the Natanz nuclear facility) meant “Bunker Buster” hit the headlines.  Carried by the Northrop B-2 Spirit heavy bomber (built between 1989-2000), the GBU-57 is a 14,000 kg (30,000 lb) bomb with a casing designed to withstand the stress of penetrating layers of reinforced concrete or thick rock.  “Bunker buster” bombs have been around for a while, the ancestors of today’s devices first built for the German military early in World War II (1939-1945) and the principle remains unchanged to this day: up-scaled armor-piercing shells.  The initial purpose was to produce a weapon with a casing strong enough to withstand the forces imposed when impacting reinforced concrete structures, the idea simple in that what was needed was a delivery system which could “bust through” whatever protective layers surrounded a target, allowing the explosive charge to do damage where needed rather than wastefully being expended on an outer skin.  The German weapons proved effective but inevitably triggered an “arms race” in that as the war progressed, the concrete layers became thicker, walls over 2 metres (6.6 feet) and ceilings of 5 metres (16 feet) being constructed by 1943.  Technological development continued and the idea extended to rocket propelled bombs optimized both for armor-piercing and aerodynamic efficiency, velocity a significant “mass multiplier” which made the weapons still more effective.

USAF test-flight footage of Northrop B-2 Spirit dropping two GBU-57 "Bunker Buster" bombs.

Concurrent with this, the British developed the first true “bunker busters”, building on the idea of the naval torpedo, one aspect of which was that, exploding a short distance from its target, it was highly damaging because it exploited one of the properties of water (quite strange stuff, according to those who study it): it doesn’t compress.  What that meant was it was often the “shock wave” of the water rather than the blast itself which could breach a hull, the same principle used for the famous “bouncing bombs” used for the RAF’s “Dambuster” (Operation Chastise, 17 May 1943) raids on German dams.  Because of the way water behaved, it wasn’t necessary to score the “direct hit” which had been the ideal in the early days of aerial warfare.

RAF Bomber Command archive photograph of Avro Lancaster (built between 1941-1946) in flight with Grand Slam mounted (left) and a comparison of the Tallboy & Grand Slam (right), illustrating how the latter was in most respects a scaled-up version of the former.  To carry the big Grand Slams, 32 “B1 Special” Lancasters were in 1945 built with up-rated Rolls-Royce Merlin V12 engines, the removal of the bomb doors (the Grand Slam carried externally, its dimensions exceeding internal capacity), deleted front and mid-upper gun turrets, no radar equipment and a strengthened undercarriage.  Such was the concern with weight (especially for take-off) that just about anything non-essential was removed from the B1 Specials, even three of the four fire axes and its crew door ladder.  In the US, Boeing went through a similar exercise to produce the run of “Silverplate” B-29 Superfortresses able to carry the first A-bombs used in August, 1945. 

Best known of the British devices were the so-called “earthquake bombs”, the Tallboy (12,000 lb; 5.4 ton) & Grand Slam (22,000 lb, 10 ton) which, despite the impressive bulk, were classified by the War Office as “medium capacity”.  The terms “Medium Capacity” (MC) & “High Capacity” (HC) referenced not the gross weight or physical dimensions but the ratio of explosive filler to the total weight of the construction (ie how much was explosive compared to the casing and ancillary components).  Because both had thick casings to ensure penetration deep into hardened targets (bunkers and other structures encased in rock or reinforced concrete) before exploding, the internal dimensions accordingly were reduced compared with the ratio typical of contemporary ordnance.  A High Capacity (HC) bomb (a typical “general-purpose” bomb) had a thinner casing and a much higher proportion of explosive (sometimes over 70% of total weight).  These were intended for area bombing (known also as “carpet bombing”) and caused wide blast damage whereas the Tallboy & Grand Slam were penetrative with casings optimized for aerodynamic efficiency, their supersonic travel working as a mass-multiplier.  The Tallboy’s 5,200 lb (2.3 ton) explosive load was some 43% of its gross weight while the Grand Slam’s 9,100 lb (4 ton) absorbed 41%; this may be compared with the “big” 4,000 lb (1.8 ton) HC “Blockbuster” which allocated 75% of the gross weight to its 3,000 lb (1.4 ton) charge.  Like many things in engineering (not just in military matters) the ratio represented a trade-off, the MC design prioritizing penetrative power and structural destruction over blast radius.
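For those who like to check the arithmetic, the MC/HC distinction reduces to a simple charge-to-weight ratio; a minimal sketch (using only the figures quoted above, here arranged in an illustrative dictionary) confirms the percentages:

```python
# Charge-to-weight ratios for the WWII bombs discussed above.
# Figures (explosive charge, gross weight, both in lb) are those quoted in the text.
bombs = {
    "Tallboy (MC)": (5_200, 12_000),
    "Grand Slam (MC)": (9_100, 22_000),
    "Blockbuster (HC)": (3_000, 4_000),
}

for name, (charge_lb, gross_lb) in bombs.items():
    ratio = charge_lb / gross_lb
    print(f"{name}: {ratio:.0%} of gross weight is explosive")
# Tallboy (MC): 43% of gross weight is explosive
# Grand Slam (MC): 41% of gross weight is explosive
# Blockbuster (HC): 75% of gross weight is explosive
```

The rounding explains why the Tallboy and Grand Slam, though close in design philosophy, are quoted at 43% and 41% respectively.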
The novelty of the Tallboy & Grand Slam was that as earthquake bombs, their destructive potential was able to be unleashed not necessarily by achieving a direct hit on a target but by entering the ground nearby, the explosion (1) creating an underground cavity (a camouflet) and (2) transmitting a shock-wave through the target’s foundations, leading to the structure collapsing into the newly created lacuna. 

The etymology of camouflet has an interesting history in both French and military mining.  Originally it meant “a whiff of smoke in the face” (from a fire or pipe) and in figurative use it was a reference to a snub or slight insult (something unpleasant delivered directly to someone); the origin is murky but it may be related to the earlier French verb camoufler (to disguise; to mask) which evolved also into “camouflage”.  In the specialized military jargon of siege warfare or mining (sapping), over the seventeenth to nineteenth centuries “camouflet” referred to “an underground explosion that does not break the surface, but collapses enemy tunnels or fortifications by creating a subterranean void or shockwave”.  The use of this tactic is best remembered from the Western Front in World War I, some of the huge craters now tourist attractions.

Under watchful eyes: Grand Ayatollah Ali Khamenei (b 1939; Supreme Leader, Islamic Republic of Iran since 1989) delivering a speech, sitting in front of the official portrait of the republic’s ever-unsmiling founder, Grand Ayatollah Ruhollah Khomeini (1900-1989; Supreme Leader, Islamic Republic of Iran, 1979-1989).  Ayatollah Khamenei seemed in 1989 an improbable choice as Supreme Leader because others were better credentialed but though cautious and uncharismatic, he has proved a great survivor in a troubled region.

Since aerial bombing began to be used as a strategic weapon, of great interest has been the debate over the BDA (battle damage assessment) and this issue emerged almost as soon as the bunker buster attack on Iran was announced, focused on the extent to which the MOPs had damaged the targets, the deepest of which were concealed deep inside a mountain.  BDA is a constantly evolving science and while satellites have made analysis of surface damage highly refined, it’s more difficult to understand what has happened deep underground.  Indeed, it wasn’t until the USSBS (United States Strategic Bombing Survey) teams toured Germany and Japan in 1945-1946, conducting interviews, economic analysis and site surveys that a useful (and substantially accurate) understanding emerged of the effectiveness of bombing although what technological advances have allowed for those with the resources is that the so-called “panacea targets” (ie critical infrastructure and such once dismissed by planners because the required precision was for many reasons rarely attainable) can now accurately be targeted, the USAF able to drop a bomb within a few feet of the aiming point.  As the phrase is used by the military, the Fordow Uranium Enrichment Plant is a classic “panacea target” but whether even a technically successful strike will achieve the desired political outcome remains to be seen.

Mr Trump, in a moment of exasperation, posted on Truth Social of Iran & Israel: “We basically have two countries that have been fighting so long and so hard that they don't know what the fuck they're doing."  Actually, both know exactly WTF they're doing; it's just Mr Trump (and many others) would prefer they didn't do it.

Donald Trump (b 1946; US president 2017-2021 and since 2025) claimed “total obliteration” of the targets while Grand Ayatollah Khamenei admitted only there had been “some damage”; which of the two is closer to the truth may one day be revealed.  Even modelling of the effects has probably been inconclusive because the deeper one goes underground, the greater the number of variables in the natural structure and the nature of the internal built environment will also influence blast behaviour.  All experts seem to agree much damage will have been done but what can’t yet be determined is what has been suffered by the facilities which sit as deep as 80 m (260 feet) inside the mountain although, as the name implies, “bunker busters” are designed for buried targets and it’s not always required for blast directly to reach the target.  Because the shock-wave can travel through earth & rock, the effect is something like that of an earthquake and if the structure sufficiently is affected, it may be the area can be rendered geologically too unstable again to be used for its original purpose.

Within minutes of the bombing having been announced, legal academics were being interviewed (though not by Fox News) to explain why the attacks were unlawful under international law and in a sign of the times, the White House didn't bother to discuss fine legal points like the distinction between "preventive & pre-emptive strikes", preferring (like Fox News) to focus on the damage done.  However, whatever the murkiness surrounding the BDA, many analysts have concluded that even if before the attacks the Iranian authorities had not approved the creation of a nuclear weapon, this attack will have persuaded them one is essential for “regime survival”, thus the interest in both Tel Aviv and (despite denials) Washington DC in “regime change”.  The consensus seems to be Grand Ayatollah Khamenei had, prior to the strike, not ordered the creation of a nuclear weapon but that all energies were directed towards completing the preliminary steps, thus the enriching of uranium to ten times the level required for use in power generation; the ayatollah liked to keep his options open.  So, the fear of some is the attacks, even if they have (by weeks, months or years) delayed the Islamic Republic’s work on nuclear development, may prove counter-productive in that they convince the ayatollah to concur with the reasoning of every state which since 1945 has adopted an independent nuclear deterrent (IND).  That reasoning was not complex and hasn’t changed since first a prehistoric man picked up a stout stick to wave as a pre-lingual message to potential adversaries, warning them there would be consequences for aggression.  Although a theocracy, those who command power in the Islamic Republic are part of an opaque political institution and in the struggle which has for some time been conducted in anticipation of the death of the aged (and reportedly ailing) Supreme Leader, the matter of “an Iranian IND” is one of the central dynamics.
Many will be following what unfolds in Tehran and the observers will not be only in Tel Aviv and Washington DC because in the region and beyond, few things focus the mind like the thought of ayatollahs with A-Bombs.

Of the word "bust"

The Great Bust: The Depression of the Thirties (1962) by Jack Lang (left), highly qualified porn star Busty Buffy (b 1996, who has never been accused of misleading advertising, centre) and The people's champion, Mr Lang, bust of Jack Lang, painted cast plaster by an unknown artist, circa 1927, National Portrait Gallery, Canberra, Australia (right).  Remembered for a few things, Jack Lang (1876–1975; premier of the Australian state of New South Wales (NSW) 1925-1927 & 1930-1932) remains best known for having in 1932 been the first head of government in the British Empire to have been sacked by the Crown since William IV (1765–1837; King of the UK 1830-1837) in 1834 dismissed Lord Melbourne (1779–1848; prime minister of the UK 1834 & 1835-1841).

Those learning English must think it at least careless that things can both be (1) “razed to the ground” (totally to destroy something (typically a structure), usually by demolition or incineration) and (2) “raised to the sky” (physically lifted upwards).  The etymologies of “raze” and “raise” differ but they’re pronounced the same so it’s fortunate the spellings vary but in other troublesome examples of unrelated meanings, spelling and pronunciation can align, as in “bust”.  When used in ways most directly related to human anatomy: (1) “a sculptural portrayal of a person's head and shoulders” & (2) “the circumference of a woman's chest around her breasts” there is an etymological link but these uses wholly are unconnected with bust’s other senses.

Bust of Lindsay Lohan in white marble by Stable Diffusion.  Sculptures of just the neck and head came also to be called “busts”, the emphasis on the technique rather than the original definition.

Bust in the sense of “a sculpture of upper torso and head” dates from the 1690s and was from the sixteenth century French buste, from the Italian busto (upper body; torso), from the Latin bustum (funeral monument, tomb (although the original sense was “funeral pyre, place where corpses are burned”)) and it may have emerged (as a shortened form) from ambustum, neuter of ambustus (burned around), past participle of amburere (burn around, scorch), the construct being ambi- (around) + urere (to burn).  The alternative etymology traces a link to the Old Latin boro, the early form of the Classical Latin uro (to burn) and it’s thought the development in Italian was influenced by the Etruscan custom of keeping the ashes of the dead in an urn shaped like the person when alive.  Thus the use, common by the 1720s, of bust (a clipping of the French buste) to mean “a carving of the trunk of the human body from the chest up”.  From this came the meaning “dimension of the bosom; the measurement around a woman's body at the level of her breasts” and that evolved on the basis of a comparison with the sculptures, the base of which was described as the “bust-line”, the term still used in dress-making (and for other comparative purposes as one of the three “vital statistics” by which women are judged (bust, waist, hips), each circumference having an “ideal range”).  It’s not known when “bust” and “bust-line” came into oral use among dress-makers and related professions but it’s documented since the 1880s.  Derived forms (sometimes hyphenated) include busty (tending to bustiness, thus Busty Buffy's choice of stage-name), overbust & underbust (technical terms in women's fashion referencing specific measurements) and bustier (a tight-fitting women's top which covers (most or all of) the bust).

Benito Mussolini (1883-1945; Duce (leader) & Prime-Minister of Italy 1922-1943) standing beside his “portrait bust” (1926).

The bust was carved by Swiss sculptor Ernest Durig (1894–1962) who gained posthumous notoriety when his career as a forger was revealed with the publication of his drawings which he’d represented as being from the hand of the French sculptor Auguste Rodin (1840-1917) under whom he claimed to have studied.  Mussolini appears here in one of the subsequently much caricatured poses which were a part of his personality cult.  More than one of the Duce's counterparts in other nations was known to have made fun of some of the more outré poses and affectations, the outstretched chin, right hand braced against the hip and straddle-legged stance among the popular motifs. 

“Portrait bust” in marble (circa 1895) of Otto von Bismarck (1815-1898; chancellor of the German Empire (the "Second Reich") 1871-1890) by the German sculptor Reinhold Begas (1831-1911).

 In sculpture, what had been known as the “portrait statue” came after the 1690s to be known as the “portrait bust” although both terms meant “sculpture of upper torso and head” and these proved a popular choice for military figures because the aspect enabled the inclusion of bling such as epaulettes, medals and other decorations and being depictions of the human figure, busts came to be vested with special significance by the superstitious.  In early 1939, during construction of the new Reich Chancellery in Berlin, workmen dropped one of the busts of Otto von Bismarck by Reinhold Begas, breaking it at the neck.  For decades, the bust had sat in the old Chancellery and the building’s project manager, Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945), knowing Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) believed the Reich Eagle toppling from the post-office building right at the beginning of World War I had been a harbinger of doom for the nation, kept the accident secret, hurriedly issuing a commission to the German sculptor Arno Breker (1900–1991) who carved an exact copy.  To give the fake the necessary patina, it was soaked for a time in strong, black tea, the porous quality of marble enabling the fluid to induce some accelerated aging.  Interestingly, in his (sometimes reliable) memoir (Erinnerungen (Memories or Reminiscences) and published in English as Inside the Third Reich (1969)), even the technocratic Speer admitted of the accident: “I felt this as an evil omen”.

The other senses of bust (as a noun, verb & adjective) are diverse (and sometimes diametric opposites) and include: “to break or fail”; “to be caught doing something unlawful / illicit / disgusting etc”; “to debunk”; “dramatically or unexpectedly to succeed”; “to go broke”; “to break in (horses, girlfriends etc)”; “to assault”; the downward portion of an economic cycle (ie “boom & bust”); “the act of effecting an arrest” and “someone (especially in professional sport) who failed to perform to expectation”.  That’s quite a range and that has meant the creation of dozens of idiomatic forms, the best known of which include: “boom & bust”, “busted flush”, “dambuster”, “bunker buster”, “busted arse country”, “drug bust”, “cloud bust”, belly-busting, bust one's ass (or butt), bust a gut, bust a move, bust a nut, bust-down, bust loose, bust off, bust one's balls, bust-out, sod buster, bust the dust, myth-busting and trend-busting.  In the sense of “breaking through”, bust was from the Middle English busten, a variant of bursten & bresten (to burst) and may be compared with the Low German basten & barsten (to burst).  Bust in the sense of “break”, “smash”, “fail”, “arrest” etc was a creation of mid-nineteenth century US English and is of uncertain inspiration but most etymologists seem to concur it was likely a modification of “burst” effected with a phonetic alteration but it’s not impossible it came directly as an imperfect echoic of Germanic speech.  The apparent contradiction of bust meaning both “fail” and “dramatically succeed” happened because the former was an allusion to “being busted” (ie broken) while the latter meaning used the notion of “busting through”.

Tuesday, December 3, 2024

Sable

Sable (pronounced sey-buhl)

(1) An Old World, small, carnivorous, weasel-like mammal, Mustela zibellina, of cold regions in Eurasia and the North Pacific Islands, valued for its fur which exists in shades of brown.  They are solitary & arboreal, with a diet largely of small animals and eggs.

(2) A marten, especially the Mustela americana & Martes zibellina.

(3) The fur of the sable.

(4) A garment made from sable (as descriptor or modifier)

(5) An artist's brush made from the fur of the sable.

(6) A type of French biscuit of a sandy texture and made with butter, sugar, eggs & flour.

(7) The stage name of Rena Marlette-Lesnar (née Greek, formerly Mero; b 1968), a US model & actress, best known for her career (1996-1999 & 2003-2004) as a professional wrestler.

(8) The color black, especially when in heraldic use.

(9) The color of sable fur (a range from yellowish-brown to dark brown).

(10) A locality name in North America including (1) a cape in southern Florida (the southernmost point of the continental US) and (2) the southernmost point of Nova Scotia, Canada.

(11) In the plural (as sables), black garments worn in mourning.

(12) In literary use, dark-skinned; black (archaic when used of people but used still in other contexts).

(13) In figurative use, a “black” or “dark” mood; gloominess (now rare).

1275–1325: From the Middle English sable, saibel, sabil & sabille (a sable, pelt of a sable; (the color) black), from the Old French sable, martre sable & saibile (a sable, sable fur), from the Medieval Latin sabelum & sabellum (sable fur), from the Middle Low German sabel (the Middle Dutch was sabel and the late Old High German was zobel), from a Slavic or Baltic source and related to the Russian со́боль (sóbol), the Polish soból, the Czech sobol, the Lithuanian sàbalas and the Middle Persian smwl (samōr).  Sable is a noun & adjective; the noun plural is sables or sable.

The modern funeral: @edgylittlepieces’ take on the sable.  Their funeral dress included a mode in which it could be “tightened up to make it super modest for the funeral”, later to be “loosened back down for the after-party.”  The promotional clip attracted many comments, some of which indicated scepticism about whether funerals had “after-parties” but the wake is a long-established tradition.  Wake (in this context) was from the Middle English wake, from the Old English wacu (watch), from the Proto-Germanic wakō and wakes could be held before or after the funeral service, depending on local custom.  In James Joyce's (1882–1941) Finnegans Wake (1939), Tim Finnegan's wake occurs before the funeral service so the young lady would have “loosened” first before “tightening” into “super modest” mode for the ceremony.  “Modest” is of course a relative term and it's literature's loss Joyce never had the chance to write about this sable although how he'd have interpolated it into the narrative of Finnegans Wake is anyone's guess but fragments from the text such as “…woven of sighed sins and spun of the dulls of death…” and “…twisted and twined and turned among the crisscross, kisscross crooks and connivers, the curtaincloth of a crater let down, a sailor’s shroud of turfmantle round the pulpit...” lend a hint.

In Western culture black is of course the color of mourning so funeral garments came to be known as “sables” but the curious use of sable to mean “black” (in heraldry, for other purposes and in figurative use) when all known sables (as in the weasel-like mammal) have been shades of brown (albeit some a quite dark hue) attracted various theories including (1) the pelt of another animal with black fur might have been assumed to be a sable, (2) there may in some places at some time have been a practice of dyeing sable pelts black or (3) the origin of the word (as a color) may be from an unknown source.  It was used as an adjective from the late fourteenth century and in the same era came to be used as a term emblematic of mourning or grief, soon used collectively of black “mourning garments”.  In the late eighteenth century it was used of Africans and their descendants (ie “black”) although etymologists seem divided whether this was originally a “polite” form or one of “mock dignity”.

AdVintage's color chart (left) and a Crusader Fedora hat in True-Sable with 38mm wide, black-brown grosgrain ribbon, handcrafted from Portuguese felt (right).

The phrase “every cloud has a silver lining” was in general use by the early nineteenth century and is used to mean even situations which seem bad will have some positive aspect and thus a potential to improve.  That’s obviously not true and many are probably more persuaded by the derivative companion phrase coined by some unknown realist: “Every silver lining has a cloud” (ie every good situation has the potential to turn bad and likely will).  “Every cloud has a silver lining” dates from the seventeenth century and it entered popular use after the publication of John Milton’s (1608–1674) masque Comus (1634) in which the poet summoned the imagery of a dark & threatening cloud glowing at the edges with the moon’s reflected light, symbolizing hope in adversity:

I see ye visibly, and now believe
That he, the Supreme Good, to whom all things ill
Are but as slavish officers of vengeance,
Would send a glistering guardian, if need were
To keep my life and honor unassailed.
Was I deceived, or did a sable cloud
Turn forth her silver lining on the night?
I did not err; there does a sable cloud
Turn forth her silver lining on the night,
And casts a gleam over this tufted grove.


Who wore the sable-trimmed coat better?  The Luftwaffe's General Paul Conrath (1896–1979, left) with Hermann Göring (1893–1946; leading Nazi 1922-1945, Hitler's designated successor & Reichsmarschall 1940-1945, centre), Soviet Union, 1942 and Lindsay Lohan at New York Fashion Week, September 2024.

Given modern sensibilities, Ms Lohan's “sable” presumably was faux fur and appeared to be the coat's collar rather than a stole but the ensemble was anyway much admired.  Count Galeazzo Ciano (1903–1944; Italian foreign minister 1936-1943) wasn’t an impartial observer of anything German but he had a diarist’s eye and left a vivid description of the impression the Reichsmarschall made during his visit to Rome in 1942: “At the station, he wore a great sable coat, something between what motorists wore in 1906 and what a high grade prostitute wears to the opera.”  Ciano was the son-in-law of Benito Mussolini (1883-1945; Duce (leader) & prime-minister of Italy 1922-1943) who later ordered his execution, a power doubtlessly envied by many fathers-in-law.

1996 Mercury Sable.  The styling of the third generation Sable (and the Ford Taurus) was upon its release controversial and, unlike some other designs thought “ahead of their time”, few have warmed to it.  To many, when new, it looked like something which had been in an accident and was waiting to be repaired.

Over five generations (1986–1991; 1992–1995; 1996–1999; 2000–2005 & 2008–2009), the Ford Motor Company (FoMoCo) produced the Mercury Sable, a companion (and substantially “badge-engineered”) version of the Ford Taurus (discontinued in the US in 2016 but still available in certain overseas markets).  Dreary and boring the FWD (front wheel drive) Taurus & Sable may have been but they were well-developed and appropriate to the needs of the market so proved a great success.  The Mercury brand had been introduced in 1939 to enable the corporation better to service the “medium-priced” market, its approach until then constrained by the large gap (in pricing & perception) between Fords and Lincolns; at the time, General Motors’ (GM) “mid-range” offerings (ie LaSalle, Buick, Oldsmobile & Pontiac (which sat between Chevrolet & Cadillac)) collectively held almost a quarter of the US market.  Given the structure of the industry (limited product ranges per brand) at the time it was a logical approach and one which immediately was successful although almost simultaneously, Ford added the up-market “Ford De Luxe” while Lincoln introduced the “Lincoln Zephyr” at a price around a third what was charged for the traditional Lincoln range.  It was a harbinger of what was to come in later decades when product differentiation became difficult to maintain as Ford increasingly impinged on Mercury’s nominal territory.  After years of decline, Ford took the opportunity offered by the GFC (Global Financial Crisis, 2008-2011) and in 2010 closed-down the Mercury brand.

Midler v. Ford Motor Co., 849 F.2d 460 (9th Cir. 1988)

Apart from the odd highlight like the early Cougars (1967-1970), Mercury is now little remembered and the Sable definitely forgotten but it does live on as a footnote in legal history which, since the rise of AI (Artificial Intelligence), has been revisited because of the advertising campaign which accompanied the Sable’s launch in 1986.  The case in which the Sable featured dates from 1988 and was about the protectability (at law) of the voice of a public figure (however defined) and the right of an individual to prevent commercial exploitation of their “unique and distinctive sound” without consent.  FoMoCo and its advertising agency (Young & Rubicam Inc (Y&R)) in 1985 aired a series of 30 & 60 second television commercials (what the agency called “The Yuppie Campaign”), the rationale of which was to evoke in the minds of the target market (30-something urban professionals in a certain income bracket) memories of their hopefully happy (if often impoverished) days at university some fifteen years earlier.  To achieve the effect, a number of popular songs of the 1970s were used for the commercials and in some cases the original artists licensed the material but ten declined to be involved so Y&R hired “sound-alikes” who re-recorded the material.  One who rejected Y&R’s offer was the singer Bette Midler (b 1945).

Sable (the stage name of Rena Marlette-Lesnar (née Greek, formerly Mero; b 1968)); promotional photograph issued by WWE (World Wrestling Entertainment) to which she was contracted.

Y&R had from the copyright holder secured a licence to use the song Do You Want to Dance, which Ms Midler had interpreted on her debut album The Divine Miss M (1972), and neither her name nor an image of her appeared in the commercial.  Y&R’s use of the song itself was thus permitted under settled law; the case hung on whether Ms Midler had the right to protect her voice from commercial exploitation by means of imitation.  At trial, the district court described the defendants' conduct as that “...of the average thief...” (“If we can't buy it, we'll take it”) but held there was no precedent establishing a legal principle preventing imitation of Midler's voice and thus gave summary judgment for the defendants.  Ms Midler appealed.

Years before, a federal court had held the First Amendment (free speech) to the US Constitution operated with a wide latitude in protecting reproduction of likenesses or sounds, finding the “use of a person's identity” was central; if the purpose was found to be “informative or cultural”, then the use was immune from challenge but if it “serves no such function but merely exploits the individual portrayed, immunity will not be granted.”  Moreover, federal copyright law overlays such matters and the “...mere imitation of a recorded performance would not constitute a copyright infringement even where one performer deliberately sets out to simulate another's performance as exactly as possible.”  So Ms Midler’s claim was novel in that it was unrelated to the copyrighted material (the song), thus excluding consideration of federal copyright law.  At the time, it was understood a “voice is not copyrightable” and what she was seeking to protect was something more inherently personal than any work of authorship.  There had been vaguely similar cases but they had been about “unfair competition” in which people like voice-over artists were able to gain protection from others emulating in this commercial area a voice, the characteristics of which the plaintiffs claimed to have “invented” or “defined” (the courts never differentiated).

On appeal, the court reversed the original judgment, holding it was not necessary to “…go so far as to hold that every imitation of a voice to advertise merchandise is actionable.  We hold only that when a distinctive voice of a professional singer is widely known and is deliberately imitated in order to sell a product, the sellers have appropriated what is not theirs and have committed a tort in California.  Midler has made a showing, sufficient to defeat summary judgment, that the defendants here for their own profit in selling their product did appropriate part of her identity.”  What this established was an individual's voice can be as integral to their identity as their image or name and that is reflected in recent findings about AI-generated voices that mimic specific individuals; they too can infringe on similar rights if used without consent, particularly for commercial or deceptive purposes.  The “AI-generated voice” cases will for some time continue to appear in many jurisdictions and it’s not impossible some existing (and long-standing) contracts might be declared void for unconscionability, on the grounds that terms which once “signed away in perpetuity” the rights to use a voice will no longer be enforced because the technological possibilities now available could not have been envisaged.

Saturday, November 16, 2024

Parole

Parole (pronounced puh-rohl or pa-rawl (French))

(1) In penology, the (supervised) conditional release of an inmate from prison prior to the end of the maximum sentence imposed.

(2) Such a release or its duration.

(3) An official document authorizing such a release (archaic except as a modifier).

(4) In military use, the promise (usually in the form of a written certificate) of a prisoner of war, that if released they either will return to custody at a specified time or will not again take up arms against their captors.

(5) Any password given by authorized personnel in passing by a guard (archaic but still used in video gaming).

(6) In military use, a watchword or code phrase; a password given only to officers, distinguished from the countersign, given to all guards (archaic but still used in video gaming).

(7) A word of honor given or pledged (archaic).

(8) In US immigration legislation, the temporary admission of non-U.S. citizens into the US for emergency reasons or on grounds considered in the public interest, as authorized by and at the discretion of the attorney general.

(9) In structural linguistics, language as manifested in the individual speech acts of particular speakers (ie language in use, as opposed to language as a system).

(10) To place or release on parole.

(11) To admit a non-US citizen into the US as provided for in the parole clauses in statute.

(12) Of or relating to parole or parolees.

(13) A parole record (technical use only).

1610–1620: From the Middle French parole (word, formal promise) (short for parole d'honneur (word of honor)), from the Old French parole, from the Late Latin parabola (speech), from the Classical Latin parabola (comparison), from the Ancient Greek παραβολή (parabolḗ) (a comparison; parable (literally “a throwing beside”, hence “a juxtaposition”)).  The verb was derived from the noun and appeared early in the eighteenth century; originally, it described “what the prisoner did” (in the sense of a “pledge”) but this sense has long been obsolete.  The transitive meaning “put on parole, allow to go at liberty on parole” was in use by the early 1780s while the use to refer to “release (a prisoner) on his own recognizance” doesn’t appear for another century.  The adoption in English was by the military in the sense of a “word of honor”, specifically that given by a prisoner of war not to escape if allowed to go about at liberty, or not to take up arms again if allowed to return home, while the familiar modern sense of “a (supervised) conditional release of an inmate before their full term is served” was a part of criminal slang by at least 1910.  An earlier term for a similar thing was ticket of leave.  In law-related use, parol is the (now rare) alternative spelling.  Parole is a noun & verb, parolee is a noun, paroled & paroling are verbs and parolable, unparolable, unparoled & reparoled are adjectives (hyphenated use is common); the noun plural is paroles.

A parole board (or parole authority, parole panel etc) is a panel of people who decide whether a prisoner should be released on parole and if released, the parolee is placed for a period under the supervision of a parole officer (a law enforcement officer who supervises offenders who have been released from incarceration and, often, recommends sentencing in courts of law).  In some jurisdictions the appointment is styled as “probation officer”.  The archaic military slang pass-parole was an un-adapted borrowing from the French passe-parole (password) and described an order passed from the front to the rear by word of mouth.  Still sometimes used in diplomatic circles, the noun porte-parole (plural porte-paroles) describes “a spokesperson, one who speaks on another's behalf” and was an un-adapted borrowing from the mid sixteenth century French porte-parole, from the Middle French porteparolle.

The Parol Evidence Rule

In common law systems, the parol evidence rule is a legal principle in contract law which restricts the use of extrinsic (outside) evidence to interpret or alter the terms of a written contract.  The operation of the parol evidence rule means that if two or more parties enter into a written agreement intended to be a complete and final expression of their terms, any prior or contemporaneous oral or written statements that contradict or modify the terms of that written agreement cannot be used in court to challenge the contract’s provisions.  The rule applies only to properly constructed written contracts which can be regarded as “final and complete written agreements” and the general purpose is to protect the integrity of the document.  Where a contract is not “held to be final and complete”, parol evidence may be admissible, including cases of fraud, misrepresentation, mistake, illegality or where the written contract is ambiguous.  The most commonly used exceptions are (1) Ambiguity (if a court declares a contract term ambiguous, external evidence may be introduced to clarify the meaning), (2) Void or voidable contracts (if a contract was entered into under duress or due to fraud or illegality, parol evidence can be used to prove this; in cases of mistakes, the scope is limited but it can still be possible), (3) Incomplete contracts (if a court determines a written document doesn’t reflect the full agreement between the parties, parol evidence may be introduced to “complete it”) and (4) Subsequent agreements (modifications or agreements made after the written contract can generally be proven with parol evidence although in the narrow technical sense such additions may be found to constitute a “collateral contract”).

Parole & probation

Depending on the jurisdiction, “parole” & “probation” can mean much the same thing or things quite distinct, not helped by parolees in some places being supervised by “probation officers” and vice versa.

In the administration of criminal law, “parole” and “probation” are both forms of supervised release but between jurisdictions the terms can either mean the same thing or be applied in different situations.  As a general principle, parole is the conditional release of a prisoner before completing their full sentence and those paroled usually are supervised by a parole officer and must adhere to certain conditions such as regular meetings, drug testing and maintaining employment and certain residential requirements.  The purpose of parole is (1) a supervised reintegration of an inmate into society and (2) a reward for good behavior in prison.  Should a parolee violate the conditions of their release, they can be sent back to prison to serve the remainder of their sentence.  As the word typically is used, probation is a court-ordered period of supervision in the community instead of, or in addition to, a prison sentence.  A term of probation is often imposed at sentencing, either as an alternative to incarceration or as a portion of the sentence after release.  Like parolees, individuals on probation are monitored, often by a probation officer (although they may be styled a “parole officer”) and are expected to follow specific conditions.  Probation is in many cases the preferred sentencing option for first offenders, those convicted of less serious offences and those for whom a custodial sentence (with all its implications) would probably be counter-productive.  It has the advantage also of reducing overcrowding in prisons and is certainly cheaper for the state than incarceration.  Those who violate the terms of their probation face consequences such as an extended probation or being sent to jail.  The word “parole” in this context was very much a thing of US English until the post-war years when it spread first to the UK and later elsewhere in the English-speaking world.

Langue & parole

In structural linguistics, the terms “langue” & “parole” were introduced by the groundbreaking Swiss semiotician Ferdinand de Saussure (1857-1913) and remain two of the fundamental concepts in the framework of structuralism and are treated as important building blocks in what subsequently was developed as the science of human speech.  Within the profession, “langue” & “parole” continue to be regarded as “French words” because the sense in that language better describes things than the English translations (“language” & “speech” respectively) which are “approximate but inadequate”.  Langue denotes the system (or totality) of language shared by the “collective consciousness” so it encompasses all elements of a language as well as the rules & conventions for their combination (grammar, spelling, syntax etc).  Parole is the use individuals make of the resources of language, which the system produces and combines in speech, writing or other means of transmission.  As de Saussure explained it, the conjunction and interaction of the two create an “antinomy of the social and shared”, a further antinomy implied in the idea that langue is abstract and parole is concrete.

The construct of the noun antinomy was a learned borrowing from the Latin antinom(ia) + the English suffix “-y” (used to form abstract nouns denoting a condition, quality, or state).  The Latin antinomia was from the Ancient Greek ἀντινομία (antinomía), the construct being ἀντι- (anti- (the prefix meaning “against”), ultimately from the primitive Indo-European hent- (face; forehead; front)) + νόμος (nómos) (custom, usage; law, ordinance, from νέμω (némō) (to deal out, dispense, distribute), from the primitive Indo-European nem- (to distribute; to give; to take)) + -ίᾱ (-íā) (the suffix forming feminine abstract nouns).  The English word is best understood as anti- (in the sense of “against”) + -nomy (the suffix indicating a system of laws, rules, or knowledge about a particular field).  In law, it was once used to describe “a contradiction within a law, or between different laws or a contradiction between authorities” (a now archaic use) but by extension it has come to be used in philosophy, political science and linguistics to describe “any contradiction or paradox”.  A sophisticated deconstruction of the concept was provided by the German philosopher Immanuel Kant (1724–1804) who in Kritik der reinen Vernunft (Critique of Pure Reason (1781)) explained that apparent contradictions between valid conclusions (a paradox) could be resolved once it was understood the two positions came from distinct and exclusive sets, meaning no paradox existed, the perception of one being merely the inappropriate application of an idea from one set to another.

So langue is what people use when thinking and conceptualizing (abstract) while parole is what they use in speaking or writing (concrete), Saussure’s evaluative distinction explained as: “The proper object of linguistic study is the system which underlies any particular human signifying practice, not the individual utterance.”  The implication of that was that langue is of more importance than parole.  In the English-speaking world, it was the work of US Professor Noam Chomsky (b 1928) which made the concepts of langue & parole well-known through his use of the more accessible terms “competence” & “performance”.  Chomsky’s latter-day role as a public intellectual (though a barely broadcast one in his home country) commenting on matters such as US foreign policy or the contradictions of capitalism has meant his early career in linguistics is often neglected by those not in the profession (the highly technical nature of the material does mean it’s difficult for most to understand) but his early work truly was revolutionary.

Noam Chomsky agitprop by Shepard Fairey (b 1970) on Artsy.

Chomsky used “competence” to refer to a speaker's implicit knowledge of the rules and principles of a language, something which permits them to understand and generate grammatically correct sentences which can be understood by those with a shared competence.  Competence is the idealized, internalized system of linguistic rules that underlies a speaker's ability to produce and comprehend language; it reflects one’s mental grammar, independent of external factors like memory limitations or social context.  Performance refers to the actual use of language IRL (in real life), influenced by psychological and physical factors such as memory, attention, fatigue and social context.  Performance includes the errors, hesitations and corrections that occur in everyday speech and Chomsky made the important point these do not of necessity reveal a lack of competence.  Indeed, understood as “disfluencies” (the “ums & ahs” etc), these linguistic phenomena turned out to be elements it was essential to incorporate into the “natural language” models used to train AI (artificial intelligence) (ro)bots to create genuinely plausible “human analogues”.  Chomsky argued competence should be the primary domain of inquiry for theoretical linguistics and he focused on these abstract, universal principles in his early work, which provoked debates which continue to this day.  Performance, subject to errors and variability and influenced by non-linguistic factors, he declared better studied by those in fields like sociolinguistics and psycholinguistics.