Showing posts with label Architecture. Show all posts

Monday, July 28, 2025

Ginger

Ginger (pronounced jin-jer)

(1) Any of several zingiberaceous plants of the genus Zingiber (especially Zingiber officinale of the East Indies), native to South Asia but now cultivated in many tropical countries and noted for the pungent, spicy rhizome used in cooking and medicine (ginger is one of the oldest known “anti-seasickness” treatments).

(2) The underground stem of this plant, used fresh or powdered as a flavouring or crystallized as a sweetmeat.

(3) The rhizome of Zingiber officinale, ground, chopped etc, used as a flavoring.

(4) In informal use, piquancy; animation; liveliness; vigour.

(5) A reddish-brown or yellowish-brown colour.

(6) A female given name, form of Virginia or Regina (also used of red-headed men as a nickname).

(7) In zoology, a given name for animals having ginger- or orange-coloured fur or feathers.

(8) Flavored or made with ginger, the spicy rhizome of the Zingiber officinale plant.

(9) In informal use, someone with “red” hair (a range which includes the various shades of ginger).

(10) In cockney rhyming slang, a bit of a homosexual (based on “ginger beer” (ie “queer”)).

(11) In slang, Ginger ale, or can or bottle of such (especially if dry).

(12) In colloquial use in Scotland (prevalent especially in Glasgow), any fizzy soft drink, or can or bottle of such (especially the famous Irn-Bru).

(13) To treat or flavor with ginger, the spicy rhizome of the Zingiber officinale plant (to add ginger to).

(14) In informal use, to impart piquancy or spirit to; enliven (usually in the form “ginger up”).

(15) As a regionalism, very careful or cautious (also, delicate; sensitive).

Pre 1000: From the Middle English gingere, an alteration of gingivere, from the Old English ginȝifer & ginȝiber (gingifer & gingiber) (influenced by the Old French gingivre & gingembre), from the Medieval Latin gingiber & zingiber (the Latin zingiberi from the late Ancient Greek ζιγγίβερις (zingíberis)), from the Prakrit (Middle Indic) singabera, from the Sanskrit śr̄ngaveram, the construct being śr̄nga- (horn) + vera- (body), an allusion to the typical shape of the plant’s root when harvested which may be compared with the Old Tamil iñcivēr and the Tamil இஞ்சிவேர் (iñcivēr), the construct being இஞ்சி (iñci) (ginger) + வேர் (vēr) (root).  Not all etymologists agree with the orthodox derivation of śr̄ngaveram, suggesting it may be Sanskrit folk etymology and the word may be from an ancient Dravidian word that also produced the modern name for the spice used in Tamil.  The dissidents argue the Tamil iñci must at some point have had an initial “ś” and the Sanskrit śṛṅgabera was an imitation of the (supposititious) Tamil ciñcivēr with the European zingiber coming from the Tamil name.  Ginger is a noun, gingerness & gingerliness are nouns, gingering is a verb, gingered is a verb & adjective, gingerish, gingersome & gingerlike are adjectives and gingerly is an adjective & adverb; the noun plural is gingers.  The adjectives ginger-free & gingerless are non-standard but have appeared on menus and in the software of restaurant PoS (point-of-sale) systems.  The adjectives gingerer & gingerest do exist but are now so rare as to be archaic.

It’s believed the word re-entered Middle English under the influence of twelfth century Old French gingibre (which in Modern French endures as gingembre).  As a reference to coloring, the first recorded use was of fighting cocks, dating from 1785, extended to persons exactly a century later (although of hair alone it was used thus in the 1850s).  The sense of “spirit, spunk, temper” was a creation of mid nineteenth century US English. Ginger-ale was first advertised in the early 1820s, the term adopted by manufacturers to distinguish their product from ginger beer (on sale since 1809 and the central exhibit in Donoghue v Stevenson [1932] AC 562, a landmark case in tort law, heard before the House of Lords) which sometimes was fermented.  The ginger-snap was a hard cookie (biscuit in UK use) flavored with ginger, the product on sale by at least 1855.

Arnott’s Ginger Nuts.

In various forms and sold under several names (ginger-snap, ginger biscuit, ginger cookie, gingernut etc), ginger snaps are one of the planet’s most popular cookies (biscuits) and while ginger (usually powdered because it’s most suited to the industrial production of food) obviously is the common flavoring, other ingredients sometimes used include cinnamon, molasses and cloves.  The recipes vary although all tend to produce hard, brittle cookies and are much favoured by those who like to dunk the things in their tea or coffee (softening them) which does seem to defeat the purpose but dunking really is a thing.  Between countries ginger-snaps differ greatly but even within markets there are culinary regionalisms: The Griffin’s Gingernut is New Zealand’s biggest selling biscuit and the whole country is supplied using the same recipe but in Australia, Arnott’s Ginger Nuts vary in size, color, hardness and taste between states and that was not a deliberate corporate decision but the product of M&A (mergers & acquisitions) activities beginning in the 1960s when the Arnott’s Group was created, a number of previously independent local bakeries absorbed; fearing a revolt, it was decided to retain the long-established recipes.  All Ginger Nut biscuits are sold in 250g packages but while WA (Western Australia), SA (South Australia) and the NT (Northern Territory) share a common “sweet” mixture, those living in Victoria and Tasmania enjoy an even sweeter flavour (closer to similar biscuits sold overseas which are both larger and softer in texture).  In NSW (New South Wales) and the ACT (Australian Capital Territory) a “thick and hard” Ginger Nut is sold and Queensland (always different) enjoys a unique “thin, sweet and dark” product.  Arnott’s also revealed that, as well as differences in the mix, the baking time varies between varieties, accounting for the color and hardness.
For those wishing to make comparisons, there’s a choice of comparatives (“more ginger” or (the rare) “gingerer”) and superlatives (“most ginger” or (the rare) “gingerest”).

Lindsay Lohan (b 1986) and her sister Ali (b 1993) making gingerbread houses on the Drew Barrymore (b 1975) Show (CBS Media Ventures), November, 2022.

The noun gingerbread was from the late thirteenth century gingerbrar (preserved ginger), from the Old French ginginbrat (ginger preserve), from the Medieval Latin gingimbratus (gingered), from gingiber.  It was folk etymology which changed the ending to -brede (bread) and in that form the word was in use by the mid-1300s; by the fifteenth century it had come to mean “sweet cake spiced with ginger” although the still popular confection “gingerbread man” wasn’t known until circa 1850.  The figurative use (indicating anything thought fussy, showy or insubstantial) can be regarded as a sort of proto-bling and emerged around the turn of the seventeenth century; in domestic architecture or interior decorating it was used as a critique by at least the late 1750s, use possibly influenced by the earlier “gingerbread-work” which was sailors’ slang for the often elaborately carved timberwork on ships.  Bling not then being in use, the term “gingerbread” often was used of the increasingly rococoesque detailing being applied to US cars by the late 1950s and it was revived as the interiors became “fitted out” in the 1970s although stylists (they weren’t yet “designers”) preferred “gorp”.  Decades before becoming Detroit styling-studio slang, gorp was (as a verb) defined as meaning “greedily to eat” and it’s believed the alleged acronyms “good old raisins and peanuts” & “granola, oats, raisins, peanuts” are probably backronyms.  What the stylists were describing was the idea of “adding a bit of everything to the design”, the concept illustrated by creations such as the 1958 Buick, the design imperative of which was "combine as many as possible differently-shaped chrome bits & pieces".  Gorp intrinsically was "added on gingerbread" and shouldn't be confused with something like the 1958 Lincoln which was relatively unadorned (ie un-gorped) and gained its distinctiveness from the design imperative "combine as many as possible shapes, curves, lines & scallops".
Of course, the two approaches can appear in unison, witness the 1961 Plymouths.

Some of Detroit's guesswork about public taste: 1958 Buick Limited (gingerbread, left), 1958 (Lincoln) Continental Mark III (shapes, centre) and 1961 Plymouth Fury (everything, right).

The phrase “gin up” (enliven, make more exciting) is now often used as “gee-up” but the original was first recorded in 1887 (“ginning” (the act of removing seeds from cotton with a cotton gin) in use by at least 1825) and while it’s been speculated there may be some link with “gin” (in the sense of “engine”, the best known being the “cotton gin”) most etymologists think it improbable and think it more likely the origin lies in the characteristics of the root of the plant as used in food (spicy, pizzazz), the most compelling evidence being the entry for feague (used in its equine sense): “...to put ginger up a horse's fundament, and formerly, as it is said, a live eel, to make him lively and carry his tail well; it is said, a forfeit is incurred by any horse-dealer's servant, who shall shew a horse without first feaguing him.”  The figurative use of feague (encouraging or spiriting one up) has faded but “gee up” remains common.  So, for dressage or other equestrian competitions in which the judges liked to see a horse’s tail elegantly raised (à la the high ponytail perfected by the singer Ariana Grande (b 1993)), a stable-hand’s trick for achieving this was to insert an irritant (such as a piece of peeled raw ginger or a live eel) in its anus, an additional benefit being it “increased the liveliness of the beast”.  That means when modern young folk speak of “geeing up” or “a gee up”, they’re referring (figuratively) to shoving some ginger up someone’s rectum; presumably, most are unaware of the linguistic tradition.

Ariana Grande and ponytail.

According to Ariana Grande, the “snatched high ponytail” she made her signature look was better described as a “high extension ponytail” because extensions were used for added length and volume.  It’s a dramatic look but the health and beauty site Self cautioned wearing the style is not risk-free and for some wearers pain may be unavoidable.  Interviewed, dermatologist Dr Samantha Conrad explained hair follicles are the “little pockets of skin that surround the root of a hair” while the “nerves and blood vessels in the scalp feed those roots”.  When hair is pulled tightly back and elevated, it puts the hair “at a sharp angle”, placing “tension on the follicles” and causing “some strangulation of the unit”.  Because this tension is exerted on the nerve endings, there can be pain, something exacerbated if the hair is long and thick (or augmented with extensions) because the extent of the tension is so influenced by weight, physics dictating additional mass will induce greater “traction on the hair follicle”.  Pain obviously can be an issue but the consequences can be more serious, dermatologist Dr Joshua Zeichner explaining “chronic traction on the hair follicles can cause permanent thinning of the hair”, a phenomenon described as “traction alopecia”.

Ariana Grande, on stage, Coachella, November, 2018.

Ominous as all that sounds, the doctors say it’s not necessary entirely to abandon the high ponytail because the issue isn’t the style but the implementation, the critical factor being how tightly the hair is pulled from the scalp.  Traction alopecia can occur with any tightly-pulled ponytail, plait or braid so the trick is to avoid excessive tension, the recommended approach being to create a “high pony” and then gradually loosen the area in front of the elastic.  Obviously, the greater the mass of the hair, the less inclined physics dictates it will be to retain a shape tending from the vertical at the scalp, so those handling much volume will probably have to resort to some sort of at least semi-rigid tubular device through which the strands can pass to be supported.

Roland DG's 50 Shades of Ginger illustrates the extent to which the spectrum can spread (centre).  Natural redhead Lindsay Lohan (b 1986, left) in 2012 illustrates a classic implementation of what most probably think of as “ginger hair” while Jessica Gagen (b 1996; Miss England 2022, Miss World Europe 2023 & Miss United Kingdom 2024) appears (during a heatwave, right) with what would be classified by many as a “light copper” rather than some hue of “ginger”.  Interestingly, reflecting the often disparaging use of the word (in the context of hair), “ginger” appears only infrequently on manufacturers' hair dye color charts.

Ginger can be used to describe those with “red” hair (a term which covers quite a range including shades of ginger in the conventional sense that is used of color) and such may be jocular, in disparagement or neutral.  In slang, a “ginger minger” was “an unattractive woman with ginger hair” and their “ginger minge” was their pubic hair; the male equivalent was a “ginger knob”.  In the hierarchy of vulgar slang, fire-crotch (a person who has red pubic hair) probably is worse but it should not be confused with “lightning crotch” (in obstetrics, the condition (suffered late in pregnancy), of having intense pain shoot through the vaginal area, induced especially by the baby's head lowering and bumping into the pelvis).  While a “normal symptom of pregnancy” and not typically a cause for medical intervention, it can be unpleasant; what is happening is the fetus is applying pressure on the cervix or the nerves surrounding the cervix (the cervix the lowest part of the uterus where a fetus develops).

One with a preference for ginger-haired souls could be said to be a gingerphile while one with an aversion would be a gingerphobe.  The matter of gingerphobia was explored by the US television cartoon show South Park (on Paramount+'s Comedy Central since 1997) in the episode Ginger Kids (season 9, episode 11, November 2005) in which was introduced the noun gingervitis (a portmanteau word, the construct being ginger +‎ (ging)ivitis); in pathology, the condition gingivitis is an inflammation of the gums or gingivae.  What South Park’s writers did was provide the gingerphobic with something of a rationale, gingervitis treating red-headedness as if it were a disease or affliction.  Linguistically, it could have been worse: in German the synonym for gingivitis is the compound noun Zahnfleischentzündung and “zahnfleischentzündungvitis” sounds an even more distressing condition.  Neither gingerphobia nor gingervitis has ever appeared in the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (the DSM, in nine editions 1952-2022).

In Cockney rhyming slang (a cant used by Cockneys in which a word or phrase is replaced by a rhyming word or phrase, this word or phrase then often being abbreviated to its first syllable or syllables, or its first word with the word chosen as the rhyme sometimes sharing attributes with the word it replaces) “ginger” meant “a bit of a homosexual” (based on the “beer” in “ginger beer” (ie “queer”)).  If that didn’t please, there was also (1) “Brighton Pier” (queer from “pier”), (2) “iron” (poof from “iron hoof”), (3) “perry” (homo from “Como”) (this was purely phonetic; the popular singer Perry Como (1912-2001) was not gay) and (4) “haricot” (queen from “haricot bean”).  However, the guides caution “stoke” (bent from “Stoke-on-Trent”) references “bent” in the sense of both “gay” and “criminal” so it should be deployed with care.

The modest root of the plant (partially sliced, top left) and some of the packaged confectionery which are ginger-based.

For a variety of purposes (culinary, zoological, botanical, geological etc), dozens of derived forms have been created including: African ginger, aromatic ginger, baby ginger, black ginger, bleached ginger, blue ginger, butterfly ginger, Canada ginger, Chinese ginger, Cochin ginger, common ginger, dry ginger, Egyptian ginger, gingerade, ginger ale, ginger beer, gingerbread, ginger bug, ginger cordial, gingerette, ginger grass, ginger group, ginger-hackled, Ginger Island, gingerism, gingerlike, gingermint, ginger ninja, ginger nut, gingernut, gingerol, gingerous, gingerphobe, gingerphobia, ginger-pop, ginger root, gingersnap, gingersome, ginger wine, gingery, gingette, green ginger, Indian ginger, Jamaica ginger, Japanese ginger, kahili ginger, knock down ginger, knock-knock ginger, limed ginger, mango ginger, new ginger, pinecone ginger, pink ginger, race ginger, red ginger, sand ginger, sea ginger, shampoo ginger, shell ginger, Siamese ginger, spiral ginger, spring ginger, stem ginger, stone-ginger, Thai ginger, torch ginger, white ginger, wild ginger, yellow ginger & young ginger.

In De materia medica (On Medical Material), his five volume encyclopedic pharmacopeia on herbal medicine and related medicinal substances, the Ancient Greek physician Pedanius Dioscorides (circa 40-circa 90) included an entry for ζιγγίβερις (zingiberis) (ginger) as treatment for stomach and digestive ailments, in addition to its properties as “a warming spice”.  The historian Pliny the Elder (24-79) also discussed zingiber, noting its origin from Arabia and India and the use in medicine, especially for the stomach and digestion.  The use was picked up by physicians (officially recognized and not) in many places, both as a stimulant and a carminative (preventing the development of gas in the digestive tract) but despite the persistent myth, no document has ever been unearthed which suggests in Antiquity ginger was ever recommended as “sea-sickness medicine”.  Despite that, in the modern age, ginger is sometimes promoted as a cure (or at least an ameliorant) for nausea suffered at sea, in flight, while driving or motion-sickness in general and there appears to be some evidence to support the use.

Google ngram for Ginger group: Because of the way Google harvests data for their ngrams, they’re not literally a tracking of the use of a word in society but can be usefully indicative of certain trends (although one is never quite sure which trend(s)), especially over decades.  As a record of actual aggregate use, ngrams are not wholly reliable because: (1) the sub-set of texts Google uses is slanted towards the scientific & academic and (2) the technical limitations imposed by the use of OCR (optical character recognition) when handling older texts of sometimes dubious legibility (a process AI should improve).  Where numbers bounce around, this may reflect either: (1) peaks and troughs in use for some reason or (2) some quirk in the data harvested.

In another example of why English (in some ways simple and logical) must seem bafflingly inconsistent to those learning the tongue, while “ginger up” and “ginger group” are phrases related to “imparting piquancy or enlivening someone or something”, to speak of proceeding “gingerly” means “acting hesitantly; with great caution”.  The explanation is that the divergence is not the result of a word shifting meaning in two directions but of two different etymologies converging phonetically in modern English.  The figurative sense of “ginger up” (familiar to the young as “gee up”) meaning “add energy or enthusiasm” emerged in the nineteenth century and came from the equestrian practice of putting ginger (or some other irritant) in or near a horse’s anus so it would be more “spirited” (performing with greater verve or liveliness) and appear with its tail held high.  From this (the expression rather than stuff shoved in the rectum) came “ginger group” which described a (usually) small and energetic faction within a larger organization which aimed to stimulate or invigorate change or action.  The first known use of the term was in 1920s British politics.

Confusingly, “gingerly” is unrelated to “ginger” and has nothing to do with novel uses of spice in equine management.  Developing in parallel with but separately from Middle English, gingerly was from the Old French gensor & gencier (which endures in Modern French as gentil (delicate; dainty)), from the Latin gentilis.  Appending the suffix -ly turned adjective into adverb and by the sixteenth century gingerly came to mean “delicately, with grace or refinement”; by the early 1900s the idea of a “refined or dainty manner” had evolved into “cautiously; with care”.  Gingerly is thus a “false cognate” with ginger (the spice).  There the linguistic tangle should end but because of the development of modern slang, “ginger” has established an (informal) link with “gingerly” through “gingerness” which can be both (1) a synonym for “gingerliness” (a gingerly state, attitude or behaviour) and (2) in informal (sometimes derogatory) use: redheadedness.

Saturday, July 5, 2025

Futurism

Futurism (pronounced fyoo-chuh-riz-uhm)

(1) A movement in avant-garde art, developed originally by a group of Italian artists in 1909, in which forms (derived often from the then novel cubism) were used to represent rapid movement and dynamic motion (sometimes with initial capital letter).

(2) A style of art, literature, music, etc and a theory of art and life in which violence, power, speed, mechanization or machines, and hostility to the past or to traditional forms of expression were advocated or portrayed (often with initial capital letter).

(3) As futurology, a quasi-discipline practiced by (often self-described) futurologists who attempt to predict future events, movements, technologies etc.

(4) In the theology of Judaism, the Jewish expectation of the messiah in the future rather than recognizing him in the person of Christ.

(5) In the theology of Christianity, eschatological interpretations associating some Biblical prophecies with future events yet to be fulfilled, including the Second Coming.

1909: From the Italian futurismo (literally "futurism" and dating from circa 1909), the construct being futur(e) + -ism.  Future was from the Middle English future & futur, from the Old French futur (that which is to come; the time ahead), from the Latin futūrus (going to be; yet to be) which (as a noun) was the irregular suppletive future participle of esse (to be), from the primitive Indo-European bheue (to be, exist; grow).  It was cognate with the Old English bēo (I become, I will be, I am) and displaced the native Old English tōweard and the Middle English afterhede (future (literally “afterhood”)) in the given sense.  The technical use in grammar (of tense) dates from the 1520s.  The -ism suffix was from the Ancient Greek ισμός (ismós) & -isma noun suffixes, often directly, sometimes through the Latin -ismus & -isma (from where English picked up -ize) and sometimes through the French -isme or the German -ismus, all ultimately from the Ancient Greek (where it tended more specifically to express a finished act or thing done).  It appeared in loanwords from Greek, where it was used to form abstract nouns of action, state, condition or doctrine from verbs and on this model, was used as a productive suffix in the formation of nouns denoting action or practice, state or condition, principles, doctrines, a usage or characteristic, devotion or adherence (criticism; barbarism; Darwinism; despotism; plagiarism; realism; witticism etc).  Futurism & futurology are nouns, futurist is a noun & adjective and futuristic is an adjective; the noun plural is futurisms.

Lindsay Lohan in Maison Martin Margiela (b 1957) Futuristic Eyewear.

As a descriptor of the movement in art and literature, futurism (as the Italian futurismo) was adopted in 1909 by the Italian poet Filippo Tommaso Marinetti (1876-1944) and the first reference to futurist (a practitioner in the field of futurism) dates from 1911 although the word had been used as early as 1842 in Protestant theology in the sense of “one who holds that nearly the whole of the Book of Revelations refers principally to events yet to come”.  The secular world did begin to use futurist to describe "one who has (positive) feelings about the future" in 1846 but for the remainder of the century, use was apparently rare.  The (now probably extinct) noun futurity was from the early seventeenth century.  The noun futurology was introduced by Aldous Huxley (1894-1963) in his book Science, Liberty and Peace (1946) and has (for better or worse), created a minor industry of (often self-described) futurologists.  In theology, the adjective futuristic came into use in 1856 with reference to prophecy but use soon faded.  In concert with futurism, by 1915 it referred in art to “avant-garde; ultra-modern” while by 1921 it was separated from the exclusive attachment to art and meant also “pertaining to the future, predicted to be in the future”, the use in this context spiking rapidly after World War II (1939-1945) when technological developments in fields such as ballistics, jet aircraft, space exploration, electronics, nuclear physics etc stimulated interest in such progress.

Untouched: Crooked Hillary Clinton (b 1947; US secretary of state 2009-2013) & Bill Clinton (b 1946; US president 1993-2001) with cattle, 92nd Annual Hopkinton State Fair, Contoocook, New Hampshire, September 2007.

Futures, a financial instrument used in the trade of currencies and commodities, first appeared in 1880; they allow (1) speculators to bet on price movements and (2) producers and sellers to hedge against price movements and in both cases profits (and losses) can be booked against movement up or down.  Futures trading can be lucrative but is also risky, those who win gaining from those who lose, and those in the markets are usually professionals.  The story behind crooked Hillary Clinton's extraordinary profits in cattle futures (not a field in which she’d previously (or has subsequently) displayed interest or expertise) while “serving” as First Lady of Arkansas (1979–1981 & 1983–1992) remains murky but it can certainly be said that for an apparent “amateur” dabbling in a market played usually by experienced professionals, she was remarkably successful and while perhaps there was some luck involved, her trading record was such it’s a wonder she didn’t take it up as a career.  While many analysts have, based on what documents are available, commented on crooked Hillary’s somewhat improbable (and apparently sometimes “irregular”) foray into cattle futures, there was never an “official governmental investigation” by an independent authority and thus no adverse findings have ever been published.

The Arrival (1913), oil on canvas by Christopher Richard Wynne Nevinson (1889-1946), Tate Gallery.

Given what would unfold during the twentieth century, it’s probably difficult to appreciate quite how optimistic was the Western world in the years leading up to World War I (1914-1918).  Such had been the rapidity of the discovery of novelties and of progress in so many fields that expectations of the future were high and, beginning in Italy, futurism was a movement devoted to displaying the energy, dynamism and power of machines and the vitality and change they were bringing to society.  It’s also often forgotten that when the first futurist exhibition was staged in Paris in 1912, the critical establishment was unimpressed, the elaborate imagery with its opulence of color offending their sense of refinement, now so attuned to the sparseness of the cubists.

The Hospital Train (1915), oil on canvas by Gino Severini (1883-1966), Stedelijk Museum.

Futurism had debuted with some impact, the Paris newspaper Le Figaro in 1909 publishing the manifesto by the Italian poet Filippo Tommaso Marinetti which dismissed all that was old and celebrated change, originality and innovation in culture and society, something which should be depicted in art, music and literature.  Marinetti exalted the speed and power of the new technologies which were disrupting society: automobiles, aeroplanes and other clattering machines.  Whether he found beauty in the machines or in the violence and conflict they delivered was something he left his readers to decide and there were those seduced by both but his stated goal was the repudiation of traditional values and the destruction of cultural institutions such as museums and libraries.  Whether this was intended as a revolutionary roadmap or just a provocation to inspire anger and controversy is something historians have debated.  Assessment of Marinetti as a poet has always been colored by his reputation as a proto-fascist and some treat as "fake mysticism" his claim that his "visions" of the future and the path to follow to get there came to him in the moment of a violent car crash.

Futurismo: Uomo Nuovo (New Man, 1918), drawing by Mario Sironi (1885-1961).

As a technique, the futurist artists borrowed much from the cubists, deploying the same fragmented and intersecting plane surfaces and outlines to render a number of simultaneous, overlaid views of an object but whereas the cubists tended to still life, portraiture and other, usually static, studies of the human form, the futurists worshiped movement, their overlays a device to depict rhythmic spatial repetitions of an object’s outlines during movement.  People did appear in futurist works but usually they weren’t the focal point, instead appearing only in relation to some speeding or noisy machine.  Some of the most prolific of the futurist artists were killed in World War I and as a political movement it didn’t survive the conflict, the industrial war dulling the public appetite for the cult of the machine.  However, the influence of the compositional techniques continued in the 1920s and contributed to art deco which, in more elegant form, would integrate the new world of machines and mass-production into motifs still in use today.

Motociclista (Motorcyclist, circa 1924), oil on canvas by Mario Sironi.

By the early twentieth century when the Futurism movement emerged, machines and mechanisms were already hundreds of years old (indeed the precursor devices pre-date Christ) but what changed was that the new generations of machines had become sexy (at least in the eyes of men), associated as they were with something beyond mere functionalism: speed and style.  While planes, trains & automobiles all attracted the futurists, the motorcycle was a much-favored motif because it possessed an intimacy beyond other forms of transportation in that, literally, it was more an extension of the human body, the rider at speed conforming to the shape of the structure fashioned for aerodynamic efficiency with hands and feet all directly attached to the vital controls: machine as extension of man.

The Modern Boy No. 100, Vol 4, Week Ending 4 January, 1930.

The Modern Boy (1928-1939) was, as the name implies, a British magazine targeted at males aged 12-18 and the content reflected the state of mind in the society of the inter-war years, the 1930s a curious decade of progress, regression, hope and despair.  Although what filled much of the pages (guns, military conquest and other exploits, fast cars and motorcycles, stuff the British were doing in other peoples’ countries) would today see the editors cancelled or visited by one of the many organs of the British state concerned with the suppression of such things, it was what readers (presumably with the acquiescence of their parents) wanted.  Best remembered of the authors whose works appeared in The Modern Boy was Captain W.E. Johns (1893–1968), a World War I RFC (Royal Flying Corps) pilot who created the fictional air-adventurer Biggles.  The first Biggles tale appeared in 1928 in Popular Flying magazine (released also as Popular Aviation and still in publication as Flying) and his stories are still sometimes re-printed (although with the blatant racism edited out).  The first Biggles story had a very modern-sounding title: The White Fokker.  The Modern Boy was a successful weekly which in 1938 was re-launched as Modern Boy, the reason for the change not known although dropping superfluous words (and much else) was a feature of modernism.  In October 1939, a few weeks after the outbreak of World War II, publication ceased, Modern Boy like many titles a victim of restrictions by the Board of Trade on the supply of paper for civilian use.

Jockey Club Innovation Tower, Hong Kong (2013) by Zaha Hadid (1950-2016).

If the characteristics of futurism in art were identifiable (though not always admired), in architecture it can be hard to tell where modernism ends and futurism begins.  Aesthetics aside, the core purpose of modernism was of course its utilitarian value and that did tend to dictate the austerity, straight lines and crisp geometry that evolved into mid-century minimalism, so modernism, in its pure form, should probably be thought of as a style without an ulterior motive.  Futurist architecture however carried an agenda which in its earliest days borrowed from the futurist artists in that it was an assault on the past; later it moved on and in the twenty-first century the futurist architects seem interested above all in the possibilities offered by advances in structural engineering, functionality sacrificed if need be just to demonstrate that something new can be done.  That's doubtless of great interest at awards dinners where architects give prizes to each other for this and that but has produced an international consensus that it's better to draw something new than something elegant.  The critique is that while modernism once offered “less is more”, with neo-futurist architecture it's now “less is a bore”.  Art deco and mid-century modernism have aged well and it will be interesting to see how history judges the neo-futurists.

Monday, June 30, 2025

Bunker

Bunker (pronounced buhng-ker)

(1) A large bin or receptacle; a fixed chest or box.

(2) In military use, historically a fortification set mostly below the surface of the ground with overhead protection provided by logs and earth or by concrete and fitted with above-ground embrasures through which guns may be fired.

(3) A fortification set mostly below the surface of the ground and used for a variety of purposes.

(4) In golf, an obstacle, classically a sand trap but sometimes a mound of dirt, constituting a hazard.

(5) In nautical use, to provide fuel for a vessel.

(6) In nautical use, to convey bulk cargo (except grain) from a vessel to an adjacent storehouse.

(7) In golf, to hit a ball into a bunker.

(8) To equip with or as if with bunkers.

(9) In military use, to place personnel or materiel in a bunker or bunkers (sometimes as “bunker down”).

1755–1760: From the Scottish bonkar (box, chest (also “seat” in the sense of “bench”)), of obscure origin, but etymologists conclude the use related to furniture hints at a relationship with banker (bench).  Alternatively, it may be from a Scandinavian source such as the Old Swedish bunke (boards used to protect the cargo of a ship).  The meaning “receptacle for coal aboard a ship” was in use by at least 1839 (coal-burning steamships coming into general use in the 1820s).  The use to describe the obstacles on golf courses is documented from 1824 (probably from the extended sense “earthen seat” which dates from 1805) but perhaps surprisingly, the familiar sense from military use (dug-out fortification) seems not to have appeared before World War I (1914-1918) although the structures so described had for millennia existed.  “Bunkermate” was army slang for the individual with whom one shares a bunker while the now obsolete “bunkerman” (“bunkermen” the plural) referred to someone (often the man in charge) who worked at an industrial coal storage bunker.  Bunker & bunkerage are nouns, bunkering is a noun & verb, bunkered is a verb and bunkerish, bunkeresque, bunkerless & bunkerlike are adjectives; the noun plural is bunkers.

Just as ships called “coalers” were used to transport coal to and from shore-based “coal stations”, it was “oilers” which took oil to storage tanks or out to sea to refuel ships (a common naval procedure) and these STS (ship-to-ship) transfers were called “bunkering” as the black stuff was pumped, bunker-to-bunker.  That the coal used by steamships was stored on-board in compartments called “coal bunkers” led ultimately to another derived term: “bunker oil”.  When in the late nineteenth century ships began the transition from being fuelled by coal to burning oil, the receptacles of course became “oil bunkers” (among sailors nearly always clipped to “bunker”) and as refining processes evolved, the fuel specifically produced for oceangoing ships came to be called “bunker oil”.

Bunker oil is “dirty stuff”, a highly viscous, heavy fuel oil which is essentially the residue of crude oil refining; it’s that which remains after the more refined and volatile products (gasoline (petrol), kerosene, diesel etc) have been extracted.  Until late in the twentieth century, the orthodox view of economists was its use in big ships was a good thing because it was a product for which industry had little other use and, as essentially a by-product, it was relatively cheap.  It came in three flavours: (1) Bunker A: light fuel oil (similar to a heavy diesel), (2) Bunker B: an oil of intermediate viscosity used in engines larger than marine diesels but smaller than those used in the big ships and (3) Bunker C: heavy fuel oil used in container ships and such which use VLD (very large displacement), slow-running engines with a huge reciprocating mass.  Because of its composition, Bunker C especially produced much pollution and although much of this happened at sea (unseen by most but with obvious implications), when ships reached harbor to dock, all the smoke and soot became obvious.  Over the years, the worst of the pollution from the burning of bunker oil greatly has been reduced (the work underway even before the Greta Thunberg (b 2003) era), sometimes by the simple expedient of spraying a mist of water through the smoke.

Floor-plans of the upper (Vorbunker) and lower (Führerbunker) levels of the structure now commonly referred to collectively as the Führerbunker.

History’s most infamous bunker remains the Berlin Führerbunker in which Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) spent much of the last few months of his life.  In the architectural sense there were a number of Führerbunkers built, one at each of the semi-permanent Führerhauptquartiere (Führer Headquarters) created for the German military campaigns and several others built where required, but it’s the one in Berlin which is remembered as “the Führerbunker”.  Before 1944, when the air raids by the RAF (Royal Air Force) and USAAF (US Army Air Force) intensified, the term Führerbunker seems rarely to have been used other than by the architects and others involved in construction, and it wasn’t a designation like Führerhauptquartiere which the military and other institutions of state shifted between locations (rather as “Air Force One” is attached not to a specific airframe but to whatever aircraft in which the US president is travelling).  In subsequent historical writing, the term Führerbunker tends often to be applied to the whole two-level complex in Berlin and although it was only the lower layer which officially was so designated, for most purposes the distinction is not significant.  In military documents after January 1945, the Führerbunker was referred to as Führerhauptquartiere.

Führerbunker tourist information board, Berlin, Germany.

Only an information board at the intersection of den Ministergärten and Gertrud-Kolmar-Straße, erected by the German Government in 2006 prior to that year's FIFA (Fédération Internationale de Football Association (International Federation of Association Football)) World Cup, now marks the place on Berlin's Wilhelmstrasse 77 where once the Führerbunker was located.  The Soviet occupation forces razed the new Reich Chancellery and demolished all the bunker's above-ground structures but the subsequent GDR (Deutsche Demokratische Republik (German Democratic Republic; the old East Germany) 1949-1990) abandoned attempts completely to destroy what lay beneath.  Until after the fall of the Berlin Wall (1961-1989) the site remained unused and neglected, “re-discovered” only during excavations by property developers, the government insisting on the destruction of whatever was uncovered and, sensitive still to the spectre of “Neo-Nazi shrines”, for years the bunker’s location was never divulged, even as unremarkable buildings (an unfortunate aspect of post-unification Berlin) began to appear on the site.  Most of what would have covered the Führerbunker’s footprint is now a supermarket car park.

The first part of the complex to be built was the Vorbunker (upper bunker or forward bunker), an underground facility of reinforced concrete intended only as a temporary air-raid shelter for Hitler and his entourage in the old Reich Chancellery.  Substantially completed during 1936-1937, it was until 1943 listed in documents as the Luftschutzbunker der Reichskanzlei (Reich Chancellery Air-Raid Shelter), the Vorbunker label applied only in 1944 when the lower level (the Führerbunker proper) was appended.  In mid-January 1945, Hitler moved into the Führerbunker and, as the military situation deteriorated, his appearances above ground became less frequent until by late March he rarely saw the sky.  Finally, on 30 April, he committed suicide.

Bunker Busters

Northrop Grumman publicity shot of B2-Spirit from below, showing the twin bomb-bay doors through which the GBU-57 are released.

Awful as they are, there's an undeniable beauty in the engineering of some weapons and it's unfortunate humankind never collectively has resolved exclusively to devote such ingenuity to stuff other than blowing each other up.  That’s not a new sentiment, being one philosophers and others have for millennia expressed in various ways although since the advent of nuclear weapons, concerns understandably have become heightened.  Like every form of military technology ever deployed, once the “genie is out of the bottle” the problem is there to be managed and at the dawn of the atomic age, delivering a lecture in 1936, the British chemist and physicist Francis Aston (1877–1945) (who created the mass spectrograph, winning the 1922 Nobel Prize in Chemistry for his use of it to discover and identify the isotopes in many non-radioactive elements and for his enunciation of the whole number rule) observed:

There are those about us who say that such research should be stopped by law, alleging that man's destructive powers are already large enough.  So, no doubt, the more elderly and ape-like of our ancestors objected to the innovation of cooked food and pointed out the great dangers attending the use of the newly discovered agency, fire.  Personally, I think there is no doubt that sub-atomic energy is available all around us and that one day man will release and control its almost infinite power.  We cannot prevent him from doing so and can only hope that he will not use it exclusively in blowing up his next door neighbor.

The use in June 2025 by the USAF (US Air Force) of fourteen of its Boeing GBU-57 (Guided Bomb Unit-57) Massive Ordnance Penetrator (MOP) bombs against underground targets in Iran (twelve on the Fordow Uranium Enrichment Plant and two on the Natanz nuclear facility) meant “Bunker Buster” hit the headlines.  Carried by the Northrop B-2 Spirit heavy bomber (built between 1989-2000), the GBU-57 is a 14,000 kg (30,000 lb) bomb with a casing designed to withstand the stress of penetrating layers of reinforced concrete or thick rock.  “Bunker buster” bombs have been around for a while, the ancestors of today’s devices first built for the German military early in World War II (1939-1945) and the principle remains unchanged to this day: up-scaled armor-piercing shells.  The initial purpose was to produce a weapon with a casing strong enough to withstand the forces imposed when impacting reinforced concrete structures, the idea simple in that what was needed was a delivery system which could “bust through” whatever protective layers surrounded a target, allowing the explosive charge to do damage where needed rather than wastefully being expended on an outer skin.  The German weapons proved effective but inevitably triggered an “arms race” in that as the war progressed, the concrete layers became thicker, walls over 2 metres (6.6 feet) and ceilings of 5 m (16 feet) being constructed by 1943.  Technological development continued and the idea extended to rocket-propelled bombs optimized both for armor-piercing and aerodynamic efficiency, velocity a significant “mass multiplier” which made the weapons still more effective.

USAF test-flight footage of Northrop B2-Spirit dropping two GBU-57 "Bunker Buster" bombs.

Concurrent with this, the British developed the first true “bunker busters”, building on the idea of the naval torpedo, one aspect of which was that in exploding a short distance from its target it could still be highly damaging, able to take advantage of one of the properties of water (quite strange stuff according to those who study it): it doesn’t compress.  What that meant was it was often the “shock wave” transmitted through the water rather than the blast itself which breached a hull, the same principle used for the famous “bouncing bombs” of the RAF’s “Dambuster” raids (Operation Chastise, 17 May 1943) on German dams.  Because of the way water behaved, it wasn’t necessary to score the “direct hit” which had been the ideal in the early days of aerial warfare.

RAF Bomber Command archive photograph of Avro Lancaster (built between 1941-1946) in flight with Grand Slam mounted (left) and a comparison of the Tallboy & Grand Slam (right), illustrating how the latter was in most respects a scaled-up version of the former.  To carry the big Grand Slams, 32 “B1 Special” Lancasters were in 1945 built with up-rated Rolls-Royce Merlin V12 engines, the removal of the bomb doors (the Grand Slam carried externally, its dimensions exceeding internal capacity), deleted front and mid-upper gun turrets, no radar equipment and a strengthened undercarriage.  Such was the concern with weight (especially for take-off) that just about anything non-essential was removed from the B1 Specials, even three of the four fire axes and its crew door ladder.  In the US, Boeing went through a similar exercise to produce the run of “Silverplate” B-29 Superfortresses able to carry the first A-bombs used in August, 1945. 

Best known of the British devices were the so-called “earthquake bombs”, the Tallboy (12,000 lb; 5.4 ton) & Grand Slam (22,000 lb, 10 ton) which, despite the impressive bulk, were classified by the War Office as “medium capacity”.  The terms “Medium Capacity” (MC) & “High Capacity” (HC) referenced not the gross weight or physical dimensions but the ratio of explosive filler to the total weight of the construction (ie how much was explosive compared to the casing and ancillary components).  Because both had thick casings to ensure penetration deep into hardened targets (bunkers and other structures encased in rock or reinforced concrete) before exploding, the internal dimensions accordingly were reduced compared with the ratio typical of contemporary ordnance.  An HC bomb (a typical “general-purpose” bomb) had a thinner casing and a much higher proportion of explosive (sometimes over 70% of total weight).  These were intended for area bombing (known also as “carpet bombing”) and caused wide blast damage whereas the Tallboy & Grand Slam were penetrative with casings optimized for aerodynamic efficiency, their supersonic travel working as a mass-multiplier.  The Tallboy’s 5,200 lb (2.3 ton) explosive load was some 43% of its gross weight while the Grand Slam’s 9,100 lb (4 ton) absorbed 41%; this may be compared with the “big” 4,000 lb (1.8 ton) HC “Blockbuster” which allocated 75% of the gross weight to its 3,000 lb (1.4 ton) charge.  Like many things in engineering (not just in military matters) the ratio represented a trade-off, the MC design prioritizing penetrative power and structural destruction over blast radius.
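
The “capacity” percentages quoted above are simply the explosive filler expressed as a share of gross bomb weight; as a quick check of the arithmetic, here is a minimal sketch in Python (using the weights given in the text; the function is illustrative, not any official War Office formula):

```python
# Charge-to-weight ("capacity") ratios for the WWII bombs discussed above.
# Weights are those quoted in the text, in pounds.

def capacity(charge_lb: float, gross_lb: float) -> float:
    """Return explosive filler as a percentage of gross bomb weight."""
    return 100 * charge_lb / gross_lb

bombs = {
    "Tallboy (MC)":     (5_200, 12_000),
    "Grand Slam (MC)":  (9_100, 22_000),
    "Blockbuster (HC)": (3_000, 4_000),
}

for name, (charge, gross) in bombs.items():
    print(f"{name}: {capacity(charge, gross):.0f}% of gross weight is explosive")
```

Run, this reproduces the figures in the text: roughly 43% for the Tallboy, 41% for the Grand Slam and 75% for the Blockbuster, making plain why the thick-cased penetrators were rated only “medium capacity” despite their enormous gross weight.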
The novelty of the Tallboy & Grand Slam was that as earthquake bombs, their destructive potential was able to be unleashed not necessarily by achieving a direct hit on a target but by entering the ground nearby, the explosion (1) creating an underground cavity (a camouflet) and (2) transmitting a shock-wave through the target’s foundations, leading to the structure collapsing into the newly created lacuna. 

The etymology of camouflet has an interesting history in both French and military mining.  Originally it meant “a whiff of smoke in the face” (from a fire or pipe) and in figurative use it was a reference to a snub or slight insult (something unpleasant delivered directly to someone).  Although the origin is murky, it may have been related to the earlier French verb camoufler (to disguise; to mask) which evolved also into “camouflage”.  In the specialized military jargon of siege warfare and mining (sapping), over the seventeenth to nineteenth centuries “camouflet” came to mean “an underground explosion that does not break the surface, but collapses enemy tunnels or fortifications by creating a subterranean void or shockwave”.  The use of this tactic is best remembered from the Western Front in World War I, some of the huge craters now tourist attractions.

Under watchful eyes: Grand Ayatollah Ali Khamenei (b 1939; Supreme Leader, Islamic Republic of Iran since 1989) delivering a speech, sitting in front of the official portrait of the republic’s ever-unsmiling founder, Grand Ayatollah Ruhollah Khomeini (1900-1989; Supreme Leader, Islamic Republic of Iran, 1979-1989).  Ayatollah Khamenei seemed in 1989 an improbable choice as Supreme Leader because others were better credentialed but though cautious and uncharismatic, he has proved a great survivor in a troubled region.

Since aerial bombing began to be used as a strategic weapon, of great interest has been the debate over BDA (battle damage assessment) and this issue emerged almost as soon as the bunker buster attack on Iran was announced, focused on the extent to which the MOPs had damaged the targets, the deepest of which were concealed deep inside a mountain.  BDA is a constantly evolving science and while satellites have made analysis of surface damage highly refined, it’s more difficult to understand what has happened deep underground.  Indeed, it wasn’t until the USSBS (United States Strategic Bombing Survey) teams toured Germany and Japan in 1945-1946, conducting interviews, economic analysis and site surveys, that a useful (and substantially accurate) understanding emerged of the effectiveness of bombing.  What technological advances have since allowed, for those with the resources, is that so-called “panacea targets” (ie critical infrastructure and such, once dismissed by planners because the required precision was for many reasons rarely attainable) can now accurately be targeted, the USAF able to drop a bomb within a few feet of the aiming point.  As the phrase is used by the military, the Fordow Uranium Enrichment Plant is a classic “panacea target” but whether even a technically successful strike will achieve the desired political outcome remains to be seen.

Mr Trump, in a moment of exasperation, posted on Truth Social of Iran & Israel: “We basically have two countries that have been fighting so long and so hard that they don't know what the fuck they're doing."  Actually, both know exactly WTF they're doing; it's just Mr Trump (and many others) would prefer they didn't do it.

Donald Trump (b 1946; US president 2017-2021 and since 2025) claimed “total obliteration” of the targets while Grand Ayatollah Khamenei admitted only there had been “some damage”; which of the two is closer to the truth may one day be revealed.  Even modelling of the effects has probably been inconclusive because the deeper one goes underground, the greater the number of variables in the natural structure, and the nature of the internal built environment will also influence blast behaviour.  All experts seem to agree much damage will have been done but what can’t yet be determined is what has been suffered by the facilities which sit as deep as 80 m (260 feet) inside the mountain although, as the name implies, “bunker busters” are designed for buried targets and it’s not always required for blast directly to reach the target.  Because the shock-wave can travel through earth & rock, the effect is something like that of an earthquake and if the structure sufficiently is affected, the area may be rendered geologically too unstable again to be used for its original purpose.

Within minutes of the bombing having been announced, legal academics were being interviewed (though not by Fox News) to explain why the attacks were unlawful under international law and in a sign of the times, the White House didn't bother to discuss fine legal points like the distinction between "preventive" & "pre-emptive" strikes, preferring (like Fox News) to focus on the damage done.  However, whatever the murkiness surrounding the BDA, many analysts have concluded that even if before the attacks the Iranian authorities had not approved the creation of a nuclear weapon, this attack will have persuaded them one is essential for “regime survival”, thus the interest in “regime change” in both Tel Aviv and (despite denials) Washington DC.  The consensus seems to be Grand Ayatollah Khamenei had, prior to the strike, not ordered the creation of a nuclear weapon but that all energies were directed towards completing the preliminary steps, thus the enriching of uranium to ten times the level required for use in power generation; the ayatollah liked to keep his options open.  So, the fear of some is the attacks, even if they have (by weeks, months or years) delayed the Islamic Republic’s work on nuclear development, may prove counter-productive in that they convince the ayatollah to concur with the reasoning of every state which since 1945 has adopted an independent nuclear deterrent (IND).  That reasoning is not complex and hasn’t changed since first a prehistoric man picked up a stout stick to wave as a pre-lingual message to potential adversaries, warning them there would be consequences for aggression.  Although a theocracy, those who command power in the Islamic Republic are part of an opaque political institution and in the struggle which has for some time been conducted in anticipation of the death of the aged (and reportedly ailing) Supreme Leader, the matter of “an Iranian IND” is one of the central dynamics.
Many will be following what unfolds in Tehran and the observers will not be only in Tel Aviv and Washington DC because in the region and beyond, few things focus the mind like the thought of ayatollahs with A-Bombs.

Of the word "bust"

The Great Bust: The Depression of the Thirties (1962) by Jack Lang (left), highly qualified content provider Busty Buffy (b 1996, who has never been accused of misleading advertising, centre) and The people's champion, Mr Lang, bust of Jack Lang, painted cast plaster by an unknown artist, circa 1927, National Portrait Gallery, Canberra, Australia (right).  Remembered for a few things, Jack Lang (1876–1975; premier of the Australian state of New South Wales (NSW) 1925-1927 & 1930-1932) remains best known for having in 1932 been the first head of government in the British Empire to have been sacked by the Crown since William IV (1765–1837; King of the UK 1830-1837) in 1834 dismissed Lord Melbourne (1779–1848; prime minister of the UK 1834 & 1835-1841).

Those learning English must think it at least careless that things can both be (1) “razed to the ground” (totally to destroy something (typically a structure), usually by demolition or incineration) and (2) “raised to the sky” (physically lifted upwards).  The etymologies of “raze” and “raise” differ but they’re pronounced the same, so it’s fortunate the spellings vary; in other troublesome examples, unrelated meanings can align in both spelling and pronunciation, as in “bust”.  When used in the ways most directly related to human anatomy: (1) “a sculptural portrayal of a person's head and shoulders” & (2) “the circumference of a woman's chest around her breasts” there is an etymological link but these uses wholly are unconnected with bust’s other senses.

Bust of Lindsay Lohan in white marble by Stable Diffusion.  Sculptures of just the neck and head came also to be called “busts”, the emphasis on the technique rather than the original definition.

Bust in the sense of “a sculpture of upper torso and head” dates from the 1690s and was from the sixteenth century French buste, from the Italian busto (upper body; torso), from the Latin bustum (funeral monument, tomb (although the original sense was “funeral pyre, place where corpses are burned”)) and it may have emerged (as a shortened form) from ambustum, neuter of ambustus (burned around), past participle of amburere (burn around, scorch), the construct being ambi- (around) + urere (to burn).  The alternative etymology traces a link to the Old Latin boro, the early form of the Classical Latin uro (to burn) and it’s thought the development in Italian was influenced by the Etruscan custom of keeping the ashes of the dead in an urn shaped like the person when alive.  Thus the use, common by the 1720s, of bust (a clipping from the French buste) for “a carving of the trunk of the human body from the chest up”.  From this came the meaning “dimension of the bosom; the measurement around a woman's body at the level of her breasts” and that evolved on the basis of a comparison with the sculptures, the base of which was described as the “bust-line”, the term still used in dress-making (and for other comparative purposes as one of the three “vital statistics” by which women are judged (bust, waist, hips), each circumference having an “ideal range”).  It’s not known when “bust” and “bust-line” came into oral use among dress-makers and related professions but it’s documented since the 1880s.  Derived forms (sometimes hyphenated) include busty (tending to bustiness, thus Busty Buffy's choice of stage-name), overbust & underbust (technical terms in women's fashion referencing specific measurements) and bustier (a tight-fitting women's top which covers (most or all of) the bust).

Benito Mussolini (1883-1945; Duce (leader) & Prime-Minister of Italy 1922-1943) standing beside his “portrait bust” (1926).

The bust was carved by Swiss sculptor Ernest Durig (1894–1962) who gained posthumous notoriety when his career as a forger was revealed with the publication of his drawings which he’d represented as being from the hand of the French sculptor Auguste Rodin (1840-1917) under whom he claimed to have studied.  Mussolini appears here in one of the subsequently much caricatured poses which were a part of his personality cult.  More than one of the Duce's counterparts in other nations was known to have made fun of some of the more outré poses and affectations, the outstretched chin, right hand braced against the hip and straddle-legged stance among the popular motifs. 

“Portrait bust” in marble (circa 1895) of Otto von Bismarck (1815-1898; chancellor of the German Empire (the "Second Reich") 1871-1890) by the German sculptor Reinhold Begas (1831-1911).

In sculpture, what had been known as the “portrait statue” came after the 1690s to be known as the “portrait bust” although both terms meant “sculpture of upper torso and head” and these proved a popular choice for military figures because the aspect enabled the inclusion of bling such as epaulettes, medals and other decorations and being depictions of the human figure, busts came to be vested with special significance by the superstitious.  In early 1939, during construction of the new Reich Chancellery in Berlin, workmen dropped one of the busts of Otto von Bismarck by Reinhold Begas, breaking it at the neck.  For decades, the bust had sat in the old Chancellery and the building’s project manager, Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945), knowing Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) believed the Reich Eagle toppling from the post-office building right at the beginning of World War I had been a harbinger of doom for the nation, kept the accident secret, hurriedly issuing a commission to the German sculptor Arno Breker (1900–1991) who carved an exact copy.  To give the fake the necessary patina, it was soaked for a time in strong, black tea, the porous quality of marble enabling the fluid to induce some accelerated aging.  Interestingly, in his (sometimes reliable) memoir (Erinnerungen (Memories or Reminiscences) and published in English as Inside the Third Reich (1969)), even the technocratic Speer admitted of the accident: “I felt this as an evil omen”.

The other senses of bust (as a noun, verb & adjective) are diverse (and sometimes diametric opposites) and include: “to break or fail”; “to be caught doing something unlawful / illicit / disgusting etc”; “to debunk”; “dramatically or unexpectedly to succeed”; “to go broke”; “to break in” (horses, girlfriends etc); “to assault”; the downward portion of an economic cycle (ie “boom & bust”); “the act of effecting an arrest” and “someone (especially in professional sport) who failed to perform to expectation”.  That’s quite a range and it has meant the creation of dozens of idiomatic forms, the best known of which include: “boom & bust”, “busted flush”, “dambuster”, “bunker buster”, “busted arse country”, “drug bust”, “cloud bust”, “belly-busting”, “bust one's ass (or butt)”, “bust a gut”, “bust a move”, “bust a nut”, “bust-down”, “bust loose”, “bust off”, “bust one's balls”, “bust-out”, “sod buster”, “bust the dust”, “myth-busting” and “trend-busting”.  In the sense of “breaking through”, bust was from the Middle English busten, a variant of bursten & bresten (to burst) and may be compared with the Low German basten & barsten (to burst).  Bust in the sense of “break”, “smash”, “fail”, “arrest” etc was a creation of mid-nineteenth century US English and is of uncertain inspiration but most etymologists seem to concur it was likely a modification of “burst” effected with a phonetic alteration, though it’s not impossible it came directly as an imperfect echoic of Germanic speech.  The apparent contradiction of bust meaning both “fail” and “dramatically succeed” happened because the former was an allusion to “being busted” (ie broken) while the latter meaning used the notion of “busting through”.