
Friday, August 8, 2025

Carnival

Carnival (pronounced kahr-nuh-vuhl)

(1) A traveling amusement show, having sideshows, rides etc.

(2) Any merrymaking, revelry, or festival, as a program of sports or entertainment.

(3) In the Christian ecclesiastical calendar, the season immediately preceding Lent, often observed with merrymaking; Shrovetide.

(4) A festive occasion or period marked by merrymaking, processions etc and historically much associated with Roman Catholic countries in the period just before Lent.

(5) A sports meeting.

(6) In literary theory (as the noun carnivalization & verb carnivalize), to subvert (orthodox assumptions or literary styles) through humour and chaos.

(7) In sociology, a context in which transgression or inversion of the social order is given temporary license (an extension of the use in literary theory).

(8) Figuratively, a gaudily chaotic situation.

(9) As a modifier (often as “carnival atmosphere”), a festive atmosphere.

1540–1550: From the Middle French carnaval, from the Italian carnevale, from the Old Italian carnelevare (taking meat away), from older Italian forms such as the Milanese carnelevale or Old Pisan carnelevare (to remove meat (literally “raising flesh”)), the construct built from the Latin caro (flesh (originally “a piece of flesh”)) from the primitive Indo-European root sker- (to cut) + levare (lighten, raise, remove), from the primitive Indo-European root legwh- (not heavy, having little weight).  Etymologists are divided on the original source of the term used by the Church, the alternatives being (1) carnem levare (to put away flesh), (2) carnem levāmen (meat dismissal), (3) carnuālia (meat-based country feast) and (4) carrus nāvālis (boat wagon; float).  What all agree upon is the ecclesiastical use would have come from one of the forms related to “meat” and the folk etymology favors the Medieval Latin carne vale (flesh, farewell!).  Spreading from the use in Christian feast days, by at least the 1590s it was used in the sense of “feasting or revelry in general” while the meaning “a circus or amusement fair” appears to be a 1920s adoption in US English.  The synonyms can include festival, celebration, festivity, fiesta, jubilee, gala, fete, fête, fest, fair, funfair, exhibit, exhibition, revelry, merriment, rejoicing, jamboree, merrymaking, mardi gras, jollity, revel, jollification, exposition and show.  Which is chosen will depend on region, context, history etc and (other than in ecclesiastical use) rules mostly don’t exist, but there seems to be a convention that a “sporting carnival” is a less formal event (ie a non-championship or lower-level competition).  The alternative spelling carnaval is obsolete.  Carnival & carnivalization are nouns, carnivalize, carnivalizing & carnivalized are verbs, and carnivalic, carnivalistic, carnivalesque, carnivallike, precarnival & noncarnival are adjectives; the noun plural is carnivals.

Not just meat: Francis (1936-2025; pope 2013-2025) on fasting for Lent.

Originally, a carnival was a feast observed by Christians before the Lenten fast began and wasn’t a prelude to a sort of proto-veganism.  It was a part of one of religion’s many dietary rules, one which required Christians to abstain from meat during Lent (particularly on Fridays and during certain fast days), carnival the last occasion on which meat was permissible before Easter.  The Christian practice of abstaining from meat evolved as part of a broader theology of penance, self-denial, and imitation of Christ’s suffering, the rationale combining biblical precedent, symbolic associations and early ascetic traditions, the core of the concept Christ’s 40 days of fasting in the wilderness (Matthew 4:1–11, Luke 4:1–13).  Theologically, the argument was that for one’s eternal soul to enter the Kingdom of Heaven, a price to be paid was Imitatio Christi (earthly participation in Christ’s suffering).  The early church much valued suffering (for the congregants if not the clergy and nobility) and the notion remains an essential theme in some Christian traditions, one which can be summed up in the helpful advice: “For everything you do, there’s a price to be paid.”

Donald Trump (b 1946; US president 2017-2021 and since 2025) in 2016 on his private jet, fasting for Lent.

By voluntarily abstaining from certain foods, Christians imitated Christ’s self-denial and prepared spiritually for Easter: sharing in His suffering to grow in holiness.  Meat was seen as a symbol of feasting and indulgence, an inheritance from Antiquity when “flesh of the beasts of the field” was associated with celebration rather than everyday subsistence, the latter sustained typically by seafood, fruits and grains, so voluntarily (albeit at the behest of the Church) choosing temporarily to renounce meat symbolized forgoing luxury and bodily pleasure, cultivating humility and penitence.  As well as the theological, there was also a quasi-medical aspect to what Tertullian (Quintus Septimius Florens Tertullianus, circa 155–circa 220) commended as “forsaking worldly indulgence” in that fasting took one’s thoughts away from earthly delights, allowing a focus on “prayer and spiritual discipline”, strengthening the soul against “sinful temptations”.  Another layer was added by the Patristics (from the Latin pater (father)), a school of thought which explored the writings and teachings of the early Church Fathers.  Although it was never a universal view in Patrology, there were those who saw in the eating of meat a connection to animal sacrifice and blood, forbidden in the Old Testament’s dietary laws and later spiritualized in Christianity, thus the idea of abstinence as a distancing from violence and sensuality.  Finally, there was the special significance of Fridays which, as “Good Friday”, reflected the remembrance of the crucifixion of Christ and his death at Calvary (Golgotha); the early Christians treated every Friday as a mini-fast and later this would be institutionalized as Lent.

Lindsay Lohan arriving at the Electric Daisy Carnival (left) and detail of the accessory worn on her right thigh (right), Memorial Coliseum, Los Angeles, June 2010.  The knee-high boots were not only stylish but also served to conceal the court-mandated SCRAM (Secure Continuous Remote Alcohol Monitor) bracelet.

The allowance of fish during Lent had both pragmatic and theological origins, its place in the Christian diet a brew of symbolism, biblical precedent and cultural context.  As a legal and linguistic point, in the Greco-Roman scheme of things fish was not thought “flesh meat”, which was understood as coming from warm-blooded land animals and birds.  Fish, cold-blooded and aquatic, obviously were different and belonged to a separate category, one which Christianity inherited, and an implication of the distinction was seafood being viewed as “everyday food” rather than an indulgent luxury.  This was a thing also of economics (and thus social class), the eating of fish much associated with the poorer coastal dwellers whereas meat was more often seen on urban tables.  Notably, there was also in this a technological imperative: in the pre-refrigeration age, in hot climates, often it wasn’t possible safely to transport seafood inland.  The Biblical symbolism included Christ feeding the multitudes with a few “loaves and fishes” (Matthew 14:13–21), several of the apostles were fishermen whom Christ called upon to be “fishers of men” (Mark 1:16–18) and the ichthys (fish symbol) was adopted as an early Christian emblem for Christ Himself.  Collectively, this made fish an acceptably modest food for a penitential season.  All that might have been thought justification enough but, typically, Medieval scholars couldn’t resist a bit of gloss and the Italian Dominican friar, philosopher & theologian Saint Thomas Aquinas (1225–1274) decided abstinence aimed to “curb the concupiscence of the flesh” and, because meat generated more “bodily heat” and pleasure than fish, it was forbidden while fish was not.  That wasn’t wholly speculative and reflected the humoral theory from Antiquity, still an orthodoxy during the Middle Ages: fish being seen as lighter, cooler and less sensual.

Notting Hill Carnival, London.

Traditionally, there was also a Lenten prohibition of dairy products and eggs, each proscription with its own historical and symbolic logic and the basis of Shrove Tuesday (Pancake Day) and Easter eggs (though not the definitely un-Christian Easter bunny).  The strictness derived partly from Jewish precedents, notably the vegetarian edict in Daniel 10:2–3 and the idea of a “return to Edenic simplicity” where man would eat only plants (Genesis 1:29), but also from an aversion to links with sexuality and fertility, eggs obviously connected with sexual reproduction and dairy with lactation.  What this meant was early Christian asceticism sought to curb bodily impulses and anything connected with fleshly generation and (even if indirectly) thoughts of sex.

Historically a time of absolution when confessions were made in preparation for Lent, Shrovetide described the three days immediately preceding Lent (Shrove Sunday, Shrove Monday & Shrove Tuesday, preceding Ash Wednesday).  The construct being shrove + -tide, the word was from the late Middle English shroftyde.  Shrove was the simple past of shrive, from the Middle English shryven, shriven & schrifen, from the Old English sċrīfan (to decree, pass judgement, prescribe; (of a priest) to prescribe penance or absolution), from the Proto-West Germanic skrīban, from the late Proto-Germanic skrībaną, a borrowing from the Latin scrībō (write).  The word may be compared with the West Frisian skriuwe (to write), the Low German schrieven (to write), the Dutch schrijven (to write), the German schreiben (to write), the Danish skrive (to write), the Swedish skriva (to write) and the Icelandic skrifa (to write).  The -tide suffix was from the Middle English -tide & -tyde, from the Old English -tīd (in compounds), from tīd (point or portion of time, due time, period, season; feast-day, canonical hour).  Before refrigeration, eggs and dairy naturally accumulated during springtime as hens resumed laying and animals produced more milk.  Being banned during Lent, stocks thus had to be consumed lest they be wasted, so a pragmatic way to ensure economy of use was the pancake (made with butter, milk & eggs), served on the feast of Shrove Tuesday (Pancake Day).  Following Easter, when eggs returned to the acceptable list, “Easter eggs” were a natural festive marker of the fast’s end.

Carnival Adventure and Carnival Encounter off the Queensland coast in Australia’s east.

Although dubbed “floating Petri dishes” because of the high number of food poisoning & norovirus cases, cruise ships remain popular, largely because, on the basis of cost-breakdown, they offer value-for-money packages few land-based operators can match.  The infections are so numerous because of (1) thousands of passengers & crew in a closed, crowded environment, (2) an extensive use of buffets and high-volume food service, (3) a frequent turnover of crew & passengers, (4) port visits to places with inconsistent sanitation, health & food safety standards and (5) sometimes delayed reporting and patient isolation.

However, although the popular conception of Medieval Western Christendom is of a dictatorial, priest-ridden culture, the Church was a political structure and it needed to be cognizant of practicalities and public opinion.  Even dictatorships can maintain their authority only with public consent (or at least acquiescence) and in many places the Church recognized burdensome rules could be counter-productive, onerous dietary restrictions resented especially by the majority engaged for their living in hard, manual labor.  Dispensations (formal exceptions) became common, with bishops routinely relaxing the rules for the ill, those pregnant or nursing or workers performing physically demanding tasks.  As is a common pattern when rules selectively are eased, a more permissive environment was by the late Middle Ages fairly generalized (other than for those who chose to live by monastic standards).

Carnival goers enjoying the Sydney Gay & Lesbian Mardi Gras: This is not what Medieval bishops would have associated with the word “carnival” but few events better capture the spirit of the phrase “carnival atmosphere”.

The growth of dispensations (especially in the form of “indulgences” which were a trigger for the Protestant Reformation) was such it occurred to the bishops they’d created a commodity and commodities can be sold.  This happened throughout Europe but, in France and Germany, the “system” became institutionalized, the faithful even able to pay “butter money” for the privilege of eating the stuff over Lent (a kind of inverted “fat tax”!) with the proceeds devoted to that favourite capital works programme of bishops & cardinals: big buildings.  The sixteenth-century tower on Normandy’s Rouen Cathedral was nicknamed the “Butter Tower” although the funds collected from the “tax” covered only part of the cost; apparently even the French didn’t eat enough butter.  As things turned out, rising prosperity and the population drift towards towns and cities meant consumption of meat and other animal products increased, making restrictions harder to enforce, and the Protestant reformers anyway rejected mandatory fasting rules, damning them as man-made (“Popery!” the most offensive way they could think to express that idea) rather than divine law.  Seeing the writing nailed to the door, one of the results of the Council of Trent (1545–1563) was that while the Church reaffirmed fasting, eggs and dairy mostly were allowed and the ban on meat was restricted to Fridays and certain fast days in the ecclesiastical calendar.

Archbishop Daniel Mannix in his library at Raheen, the Roman Catholic Church's Episcopal Palace in Melbourne, 1917-1981.

By the twentieth century, it was clear the Holy See was fighting a losing battle and in February 1966, Paul VI (1897-1978; pope 1963-1978) promulgated the Apostolic Constitution Paenitemini (best translated as “to be penitent”), making abstinence from meat on Fridays optional outside Lent and retaining only Ash Wednesday and Good Friday as obligatory fast days for Catholics.  It was a retreat very much in the corrosive spirit of the Second Vatican Council (Vatican II, 1962-1965) and an indication the Church was descending to a kind of “mix & match” operation, people able to choose the bits they liked, discarding or ignoring anything tiresome or too onerous.  In truth, plenty of priests had been known on Fridays to sprinkle a few drops of holy water on their steak and declare “In the name of our Lord, you are now fish”.  That was fine for priests but for the faithful, dispensation was often the “luck of the clerical draw”.  At a time in the late 1940s when there was a shortage of good quality fish in south-east Australia, Sir Norman Gilroy (1896–1977; Roman Catholic Archbishop of Sydney 1940-1971, appointed cardinal 1946) granted dispensation but the stern Dr Daniel Mannix (1864–1963; Roman Catholic Archbishop of Melbourne 1917-1963) refused, so when two politicians from New South Wales (Ben Chifley (1885–1951; prime minister of Australia 1945-1949) and Fred Daly (1912–1995)) arrived in the parliamentary dining room for dinner, Chifley’s order was: “steaks for me and Daly, fish for the Mannix men.”

In the broad, a carnival was an occasion, event or season of revels, merrymaking, feasting and entertainments (the Spanish fiestas a classic example) although they could assume a political dimension, some carnivals staged to be symbolic of the disruption and subversion of authority.  The idea was a “turning upside down of the established hierarchical order” and names used included “the Feast of Fools”, “the Abbot of Misrule” and “the Boy Bishop”.  With a nod to this tradition, in literary theory the concept of “carnivalization” was introduced by the Russian philosopher & literary critic Mikhail Bakhtin (1895–1975), the word first used in the chapter From the Prehistory of Novelistic Discourse (written in 1940), published in his book The Dialogic Imagination: chronotope and heteroglossia (1975).  What carnivalization described was the penetration or incorporation of carnival into everyday life and its “shaping” effect on language and literature.

The Socratic dialogues (most associated with the writing of the Greek philosophers Xenophon (circa 430–355 BC) and Plato (circa 427-348 BC)) are regarded as early examples of a kind of carnivalization in that what appeared to be orthodox “logic” was “stood on its head” and shown to be illogical although Menippean satire (named after the third-century-BC Greek Cynic Menippus) is in the extent of its irreverence closer to the modern understanding which finds expression in personal satire, burlesque and parody.  Bakhtin’s theory suggested the element of carnival in literature is subversive in that it seeks to disrupt authority and introduce alternatives: a deliberate affront to the canonical thoughts of Renaissance culture.  In modern literary use the usual term is “carnivalesque”, referring to that which seeks to subvert (“liberate” sometimes the preferred word) assumptions or orthodoxies by the use of humor or some chaotic element.  This can be on a grand scale (ie an entire cultural movement) or as localized as some malcontent disrupting their book club (usually polite affairs where novels are read and ladies sit around talking about their feelings).

Portrait of Leo Tolstoy (1887), oil on canvas by Ilya Repin (1844-1930), Tretyakov Gallery, Moscow, Russia.

Bakhtin expanded on the theme in his book Problems of Dostoevsky's Poetics (1929) by contrasting the novels of Leo Tolstoy (1828-1910) and Fyodor Dostoevsky (1821–1881).  Tolstoy’s fiction he classified as “monologic”, a type in which all is subject to the author's controlling purpose and hand, whereas for Dostoevsky the text is “dialogic” or “polyphonic”, with an array of different characters expressing a variety of independent views (not “controlled” by the author) rather than merely representing the author's viewpoint.  Thus deconstructed, Bakhtin defined these views as “not only objects of the author's word, but subjects of their own directly significant word as well” and thus vested with their own dynamic, being a liberating influence which, as it were, “conceptualizes” reality, lending freedom to the individual character and subverting the type of “monologic” discourse characteristic of many nineteenth century authors (typified by Tolstoy).

Portrait of Fyodor Dostoevsky (1872), oil on canvas by Vasily Perov (1834-1882), Tretyakov Gallery, Moscow, Russia.

Dostoevsky’s story Bobok (1873) is cited as an exemplar of carnival.  It has characters with unusual freedom to speak because, being dead, they’re wholly disencumbered of natural laws, able to say what they wish and speak truth for fun.  However, Bakhtin did acknowledge this still is literature and didn’t claim a text could be an abstraction uncontrolled by the author (although such things certainly could be emulated): Dostoevsky (his hero) remained in control of his material because the author is the directing agent.  So, given subversion, literary and otherwise, clearly has a history dating back doubtlessly as many millennia as required to find an orthodoxy to subvert, why was the concept of carnivalization deemed a necessary addition to literary theory?  It went to the form of things, carnivalization able especially to subvert because it tended to be presented in ways less obviously threatening than might be typical of polemics or actual violence.

Tuesday, July 29, 2025

Rumble

Rumble (pronounced ruhm-buhl)

(1) A form of low-frequency noise.

(2) In video game controllers, a haptic feedback vibration.

(3) In the jargon of cardiologists, a quality of a "heart murmur".

(4) In the slang of physicians (as "stomach rumble"), borborygmus (a rumbling sound made by the movement of gas in the intestines).

(5) In slang, a street fight between or among gangs.

(6) As rumble seat (sometimes called dickie seat), a rear part of a carriage or car containing seating accommodation for servants, or space for baggage; known colloquially as the mother-in-law seat (and now also used by pram manufacturers to describe a clip-on seat suitable for lighter infants).

(7) The action of a tumbling box (used to polish stones).

(8) As rumble strip, in road-building, a pattern of variation in a road's surface designed to alert inattentive drivers to potential danger by causing a tactile vibration and audible rumbling if they veer from their lane.

(9) In slang, to find out about (someone or something); to discover the secret plans of another (mostly UK informal and used mostly in forms such as: "I've rumbled her" or "I've been rumbled").

(10) To make a deep, heavy, somewhat muffled, continuous sound, as thunder.

(11) To move or travel with such a sound.

1325-1375: From the Middle English verbs rumblen, romblen & rummelyn, frequentative forms of romen (make a deep, heavy, continuous sound (also "move with a rolling, thundering sound" & "create disorder and confusion")), equivalent to rome + -le.  It was cognate with the Dutch rommelen (to rumble), the Low German rummeln (to rumble), the German rumpeln (to be noisy), the Danish rumle (to rumble) and the Old Norse rymja (to roar or shout), all of imitative origin.  The noun form emerged in the late fourteenth century; use to describe the seat at the rear of a carriage dates from 1808, replacing the earlier rumbler (1801), finally formalized as the rumble seat in 1828, a design extended to automobiles, the last of which was produced in 1949.  The slang noun meaning "gang fight" dates from 1946 and was an element in the 1950s "moral panic" about such things.  Rumble is a noun & verb, rumbler is a noun, rumbled is a verb, rumbling is a noun, verb & adjective and rumblingly is an adverb; the noun plural is rumbles.

Opening cut from studio trailer for Lindsay Lohan's film Freakier Friday (Walt Disney Pictures, 2025), available on Rumble.  Founded in 2013 as a kind of “anti-YouTube”, as well as being an online video platform Rumble expanded into cloud services and web hosting.  In the vibrant US ecosystem of ideas (and such), Rumble is interesting in that while also carrying non-controversial content, it’s noted as one of the native environments of conservative users from libertarians to the “lunar right”, thus the oft-used descriptor “alt-tech”.  Rumble hosts Donald Trump’s (b 1946; US president 2017-2021 and since 2025) Truth Social media platform which has a user base slanted towards “alt-this & that” although to some it’s inherently evil because much of its underlying code is in Java.

The Velvet Underground and Nico

Link Wray’s (1929-2005) 1958 instrumental recording Rumble is mentioned as a seminal influence by many who were later influential in some of the most notable forks of post-war popular music including punk, heavy-metal, death-metal, glam-rock, art-rock, proto-punk, psychedelic-rock, avant-pop and the various strains of experimental and the gothic.  Wray’s release of Rumble as a single also gained a unique distinction in that it remains the only instrumental piece ever banned from radio in the United States on purely “musical” grounds, the stations (apparently in some parts “prevailed upon” by the authorities) finding its power chords just too menacing for youth to resist.  The fear wasn’t that it would “give them ideas” in the political sense (many things have been banned for that reason) but that the “threatening” sound and title would incite juvenile delinquency and gang violence.  “Rumble” was in the 1950s youth slang for fights between gangs, thus the concern the song might be picked up as a kind of anthem and exacerbate the problems of gang culture by glorifying the phenomenon which had already been the centre of a "moral panic".  There is a science to deconstructing the relationship between musical techniques and the feelings induced in people and the consensus was the use of power chords, distortion and feedback (then radically different from mainstream pop tunes) was “raw, dark and ominous”, even without lyrics; it’s never difficult to sell nihilism to teenagers.  As with many bans, the action heightened the record’s appeal, cementing its status as an anthem of discontented youth and, on sale in most record stores, it sold strongly.

The Velvet Underground & Nico (1967).

Lou Reed (1942-2013) said he spent days listening to Rumble before joining with John Cale (b 1942) in New York in 1964 to form The Velvet Underground.  Their debut album, The Velvet Underground & Nico, included German-born model Nico (1938-1988) and was, like their subsequent releases, a critical and commercial failure but within twenty years the view had changed, their work now regarded as among the most important and influential of the era, critics noting (with only some exaggeration): "Not many bought the Velvet Underground's records but most of those who did formed a band and headed to a garage."  The Velvet Underground’s output built on the proto heavy-metal motifs from Rumble with experimental performances and was noted especially for its controversial lyrical content including drug abuse, prostitution, sado-masochism and sexual deviancy.  However, despite this and the often nihilistic tone, in the decade since Rumble the counter-culture had changed not just pop music but also America: The Velvet Underground was never banned from radio.

Rumble seat in 1937 Packard Twelve Series 1507 2/4-passenger coupé.  The most expensive of Packard's 1937 line-up, the Twelve was powered by a 473 cubic-inch (7.7 litre) 67° V12 rated at 175 horsepower at 3,200 RPM.  It was the best year for the Packard Twelve, sales reaching 1,300 units.  The marque's other distinction in the era was the big Packard limousines were the favorite car of comrade Stalin (1878-1953; Soviet leader 1924-1953), a fair judge of machinery.

The rumble seat was also known as a dicky (also as dickie & dickey) seat in the UK while the colloquial “mother-in-law seat” was at least trans-Atlantic and probably global.  It was an upholstered bench seat mounted at the rear of a coach, carriage or early motorcar and, as the car industry evolved and coachwork became more elaborate, increasingly the seats were designed to fold into the body.  The size varied but generally they were designed to accommodate one or two adults although the photographic evidence suggests they could be used also to seat half-a-dozen or more children (the seat belt era decades away).  Why it was called a dicky seat is unknown (the word dates from 1801 and most speculation is it was in some way related to the English class system) but when fitted on horse-drawn carriages it was always understood to mean "a boot (box or receptacle covered with leather at either end of a coach, the use based on the footwear) with a seat above it for servants".  On European phaetons, a similar fixture was the “spider”, a small single seat or bench for the use of a groom or footman, the name based on the spindly supports which called to mind an arachnid’s legs.  The spider name would later be re-purposed on a similar basis to describe open vehicles and use persists to this day, Italians and others sometimes preferring spyder.  They were sometimes also called jump-seats, the idea being they were used by servants or slaves who were required to “jump off” at their master’s command and the term “jump seat” was later used for the folding seats in long-wheelbase limousines although many coach-builders preferred “occasional seats”.

Rumble seat in 1949 Triumph 2000 Roadster.  The unusual (and doubtless welcome) split-screen was a post-war innovation, the idea recalling the twin-screen phaetons of the inter-war years.  Had they been aware of the things, many passengers in the back seats of convertibles (at highway speeds it was a bad hair day) would have longed for the return of the dual-cowl phaetons.  

The US use of “rumble seat” comes from the horse & buggy age so obviously predates any rumble from an engine’s exhaust system and it’s thought the idea of the rumble was literally the noise and vibration experienced by those compelled to sit above a live axle with 40 inch (1 metre-odd) steel rims on wooden-spoked wheels, sometimes with no suspension system.  When such an arrangement was pulled along rough, rutted roads by several galloping horses, even a short journey could be a jarring experience.  The rumble seat actually didn’t appear on many early cars because the engines lacked power so weight had to be restricted, seating typically limited to one or two; they again became a thing only as machines grew larger and bodywork was fitted.  Those in a rumble seat were exposed to the elements which could be most pleasant but not always and they enjoyed only the slightest protection afforded by the regular passenger compartment’s top & windscreen.  Ford actually offered the option of a folding top with side curtains for the rumble seats on the Model A (1927-1931) but few were purchased, a similar fate suffered by those produced by third party suppliers.  US production of cars with rumble seats ended in 1939 and the last made in England was the Triumph 1800/2000 Roadster (1946-1949) but pram manufacturers have of late adopted the name to describe a seat which can be clipped onto the frame.  Their distinction between a toddler seat and a rumble seat is that the former comes with the stroller and is slightly bigger, rated to hold 50 lbs (23 kg), while the latter can hold up to 35 lbs (16 kg).

1935 MG NA Magnette Allingham 2/4-Seater by Whittingham & Mitchel.  Sometimes described by auction houses as a DHC (drophead coupé), this body style (despite what would come to be called 2+2 seating) really is a true roadster.  The scalloped shape of the front seats' squabs appeared also in the early (3.8 litre version; 1961-1964) Jaguar E-Types (1961-1974) but attractive as they were, few complained when they were replaced by a more prosaic but also more accommodating design.

Although most rumble (or dickie) seats were mounted in an aperture separated from the passenger compartment, in smaller vehicles the additional seat often was integrated but became usable (by people) only when the hinged cover was raised; otherwise, the rear-seat cushion was a “parcel shelf”.  The MG N-Type Magnette (1934-1936) used a 1271 cm3 (78 cubic inch) straight-six and while the combination of that many cylinders and a small displacement sounds curious, the configuration was something of an English tradition and a product of (1) a taxation system based on cylinder bore and (2) the attractive economies of scale and production line rationalization of “adding two cylinders” to existing four-cylinder units to achieve greater, smoother power with the additional benefit of retaining the same tax-rate.  Even after the taxation system was changed, some small-capacity sixes were developed as out-growths of fours.  Despite the additional length of the engine block, many N-type Magnettes were among the few front-engined cars to include a “frunk” (a front trunk (boot)), a small storage compartment which sat between cowl (scuttle) and engine.
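The bore-based tax mentioned above was calculated with the RAC (Royal Automobile Club) horsepower formula, which rated an engine on bore and cylinder count alone, ignoring stroke and actual output.  Below is a minimal sketch of the arithmetic; the N-Type bore & stroke figures are the commonly quoted ones and are used here only for illustration:

import math

def rac_horsepower(bore_inches: float, cylinders: int) -> float:
    # RAC taxable horsepower: (bore in inches)^2 x cylinders / 2.5; stroke is ignored
    return bore_inches ** 2 * cylinders / 2.5

def displacement_cc(bore_mm: float, stroke_mm: float, cylinders: int) -> float:
    # Swept volume in cc: pi/4 x bore^2 x stroke x cylinders (dimensions converted to cm)
    return math.pi / 4 * (bore_mm / 10) ** 2 * (stroke_mm / 10) * cylinders

# Commonly quoted N-Type dimensions, assumed here for illustration: 57 mm bore x 83 mm stroke
print(round(displacement_cc(57.0, 83.0, 6)))     # -> 1271 (cc, matching the figure in the text)
print(round(rac_horsepower(57.0 / 25.4, 6), 1))  # -> 12.1 (taxable hp)

Because stroke never enters the formula, the cheap route to more displacement (and torque) was a longer stroke on a narrow bore, which is why so many British engines of the period were small-bore, long-stroke units.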

Friday, July 18, 2025

Mural

Mural (pronounced myoor-uhl)

(1) A large picture painted or affixed directly on a wall or ceiling.

(2) A greatly enlarged photograph attached directly to a wall.

(3) A wallpaper pattern representing a landscape or the like, often with very widely spaced repeats so as to produce the effect of a mural painting on a wall of average size; sometimes created as a trompe l'oeil (“deceives the eye”).

(4) Of, relating to, or resembling a wall.

(5) Executed on or affixed to a wall.

(6) In early astronomy, pertaining to any of several astronomical instruments that were affixed to a wall aligned on the plane of a meridian; formerly used to measure the altitude of celestial bodies.

1400–1450: From the late Middle English mural, from the Latin mūrālis (of or pertaining to a wall), the construct being mūr(us) (wall) + -ālis (the Latin suffix added to a noun or numeral to form an adjective of relationship; alternative forms were -āris, -ēlis, -īlis & -ūlis).  The Latin mūrālis was from the Old Latin moiros & moerus, from the primitive Indo-European root mei (to fix; to build fences or fortifications) from which Old English picked up mære (boundary, border, landmark) and Old Norse gained mæri (boundary, border-land).  In the historic record, the most familiar Latin form was probably munire (to fortify, protect).  The sense of "a painting on a wall" seems to have emerged as late as 1915 as a clipping of "mural-painting" (a painting executed upon the wall of a building), a term in use since at least 1850 and derived from mural in its adjectival form.

The adjective intermural (between walls) dates from the 1650s, from the Latin intermuralis (situated between walls), the construct being from inter- (between) + muralis (pertaining to a wall) from mūrus (wall).  The adjective intramural (within the walls (of a city, building etc)) dates from 1846, the construct being intra- (within) + muralis (pertaining to a wall) from mūrus (wall); it was equivalent to the Late Latin intramuranus and in English, was used originally in reference to burials of the dead.  It came first to be used in relation to university matters by Columbia in 1871.  Mural is a noun, verb & adjective; muraled is a verb & adjective, muralist & muralism are nouns and muraling is a verb; the noun plural is murals.  The adjectives murallike, muralish & muralesque are non-standard and the adverb murally is unrelated, murally a term from heraldry meaning “with a mural crown” and used mostly in the technical terms “murally crowned” & “murally gorged”.  A mural crown was a crown or headpiece representing city walls or towers and was used as a military decoration in Ancient Rome and later as a symbol in European heraldry; its most common representation was as a shape recalling the alternating merlons (raised structures extending the wall) atop a castle’s turret which provided defensive positions through which archers could fire.  The style remains familiar in some of the turrets which sometimes appear on the more extravagant McMansions and in the chess piece properly called the rook but also referred to as a castle.

Lindsay Lohan murals in the style of street art (graffiti): In hijab (al-amira) with kebab roll by an unknown street artist, Melbourne, Australia (left), the photograph the artist took as a template (centre) and in a green theme in Welcome to Venice mural by UK-born Californian street artist Jules Muck (b 1978) (right).  While a resident of Venice Beach, Ms Lohan lived next door to former special friend, DJ Samantha Ronson (b 1977).

In multi-cultural Australia, the kebab roll has become a fixture in the fast-food scene with variations extending from vegan to pure meat, “kebab” something of a generic term meaning whatever the vendor decides it means.  Cross-culturally the kebab roll also fills a niche as the standard 3 am snack enjoyed by those leaving night clubs, a place and time at which appetites are heightened.  After midnight, many kebab rolls are sold by street vendors from mobile carts and those in the Middle East will not be surprised to learn barbaric Australians sometimes add pineapple to their roll.  The photograph of Ms Lohan in hijab was taken during a “doorstop” (an informal press conference) after her visit in October 2016 to Gaziantep (known to locals as Antep), a city in the Republic of Türkiye’s south-eastern Anatolia Region.  The purpose of the visit was to meet with Syrian refugees being housed in Gaziantep’s Nizip district and the floral hijab was a gift from one of the residents who presumably assisted with the placement because there’s an art to a well-worn al-amira.  Ms Muck’s work was a gesture to welcome Ms Lohan moving from Hollywood to Venice Beach and the use of green is a theme in many of her works.  Unfortunately, Ms Lohan’s time in Venice Beach was brief because she was compelled to return to New York City after being stalked by the Freemasons.

Mural montage: Donald Trump (b 1946; US president 2017-2021 and since 2025) osculating with Mr Putin (Vladimir Vladimirovich Putin; b 1952; president or prime minister of Russia since 1999), Benjamin Netanyahu (b 1949; Israeli prime minister 1996-1999, 2009-2021 and since 2022), Boris Johnson (b 1964; UK prime-minister 2019-2022), Francis (1936-2025; pope 2013-2025) and “Lyin’ Ted” Cruz (b 1970; US senator (Republican-Texas) since 2013).

Probably not long after the charcoal and ochre of the first cave paintings was seen by someone other than the artist, there emerged the calling of “art critic” and while the most common fork of that well-populated profession focuses on the aesthetic, art has also long been political.  The mural of course has much scope to be controversial because murals tend to be (1) big and (2) installed in public spaces, both aspects making the things highly visible.  Unlike a conventionally sized painting which, even if large, a curator can hang in some obscure spot or put into storage, the mural is just where it is and often part of the built environment; there it will be seen.  In art history, few murals have more intriguing tales than Michelangelo’s (Michelangelo di Lodovico Buonarroti Simoni; 1475–1564) ceiling and frescos (1508-1512) in the Vatican’s Sistine Chapel but although there were at the time of the commissioning and completion few theological or political squabbles, there were the Vatican’s usual personal and institutional tensions, cardinals and bishops with their own agendas (some financial) peeking and poking into why Julius II (1443–1513; pope 1503-1513) had handed the juicy contract to someone thought primarily a sculptor rather than a painter.

Sistine Chapel, The Vatican, Rome.

The political stoush came later.  At the time, the nudity had been noted and while some voices were raised in opposition, there was no attempt to censor the work because during the High Renaissance, depictions of nudity (on canvas, in marble etc) were all around including in the Vatican but decades later, during the sittings of the Council of Trent (1545–1563), critiques of “nakedness” in art became more vocal.  That was especially the case after the Counter-Reformation (circa 1550–circa 1670) produced a more severe Church, a development with many repercussions, one of which was the “fig-leaf campaign” in which an artist was commissioned to paint over (especially male) genitalia, the traditional “fig leaf” the preferred device.  Perhaps curiously, despite the early appearance of the motif in the art of Christendom, for centuries the fig leaf wasn’t “obligatory” although they appear often enough that at times they must have been at least “desirable” and in other periods and places clearly “essential”.  The later infamous “Fig Leaf Campaign” was initiated by Pope Paul IV (1476–1559; pope 1555-1559) and continued by his successors although it was most associated with the ruling against “lasciviousness” in religious art made in 1563 by the Council of Trent.  It was something very much in the spirit of the Counter-Reformation and it was Pius IV (1499–1565; pope 1559-1565) who commissioned artist Daniele da Volterra (circa 1509–1566) to paint over the genitalia Michelangelo had depicted in the Sistine Chapel, extending his repertoire from strategically positioned leaves to artfully placed draperies or loincloths; Romans to his dying day nicknamed Volterra “Il Braghettone” (the breeches maker).  As late as the nineteenth century Greco-Roman statues from antiquity were still having their genitals covered with fig leaves (sometimes detachable, a trick the British Museum later adopted to protect Victoria’s (1819–1901; Queen of the UK 1837-1901) delicate sensibilities during her infrequent visits).  Another example of practical criticism was the edict by Pius IX (1792–1878; pope 1846-1878) that extant male genitalia on some of the classical statues adorning the Vatican should be “modified” and that involved stonemasons, sculptors and other artisans receiving commissions to “modify or cover” as required, some fig leaves at the time added.  It is however a myth that popes sometimes would be seen atop a ladder, chisel in hand, hammering away; not only did they hire "the trades" to do their dirty work, what was done was almost always concealment rather than vandalism.

Then a work in progress, this is one of the few known photographs of Diego Rivera's mural in New York City's Rockefeller Center.  According to the Workers Age of 15 June, 1933, the image was "...taken surreptitiously by one of Rivera's aides...".

Still, no pope ever ordered Michelangelo’s creation painted over but not all artists were so fortunate.  On 9 May 1933 (by coincidence a day when the Nazis publicly were burning books), New York’s very rich Rockefeller family ordered Mexican artist Diego Rivera (1886-1957) to cease work on his mural depicting "human intelligence in control of the forces of nature", then being painted in the great hall of the 70-storey Rockefeller Center in New York City.  Taking photographs of the mural was also prohibited.  What incurred the family’s wrath was the artist's addition of a depiction of Bolshevik revolutionary comrade Vladimir Lenin (1870–1924; head of government of Russia or Soviet Union 1917-1924) against a background of crowds of unemployed workers.  Comrade Lenin had not appeared in the conceptual sketch (entitled Man at the Crossroads Looking with Hope and High Vision to the Choosing of a New and Better Future) the artist had provided prior to the commission being granted.  Nelson Rockefeller (1908–1979; US vice president 1974-1977 and who earned immortality by having "died on the job") genuinely was a modern art fan-boy and attempted to negotiate a compromise but it was the nadir of the Great Depression, marked by plummeting industrial production, bank failures and an unemployment rate approaching 25%; other family members, knowing there was in the air talk of revolution (the Rockefeller family had much to lose), didn’t want the unemployed getting ideas.  To them, Lenin was close to being the devil incarnate and "the devil makes work for idle hands".  The mural was covered by a canvas drape until February 1934 when, under cover of darkness, it was broken up and carted off to be dumped, the family dutifully having paid the artist his US$21,000 fee.

Monday, June 30, 2025

Bunker

Bunker (pronounced buhng-ker)

(1) A large bin or receptacle; a fixed chest or box.

(2) In military use, historically a fortification set mostly below the surface of the ground with overhead protection provided by logs and earth or by concrete and fitted with above-ground embrasures through which guns may be fired.

(3) A fortification set mostly below the surface of the ground and used for a variety of purposes.

(4) In golf, an obstacle, classically a sand trap but sometimes a mound of dirt, constituting a hazard.

(5) In nautical use, to provide fuel for a vessel.

(6) In nautical use, to convey bulk cargo (except grain) from a vessel to an adjacent storehouse.

(7) In golf, to hit a ball into a bunker.

(8) To equip with or as if with bunkers.

(9) In military use, to place personnel or materiel in a bunker or bunkers (sometimes as “bunker down”).

1755–1760: From the Scottish bonkar (box, chest (also “seat” in the sense of “bench”)), of obscure origin but etymologists conclude the use related to furniture hints at a relationship with banker (bench).  Alternatively, it may be from a Scandinavian source such as the Old Swedish bunke (boards used to protect the cargo of a ship).  The meaning “receptacle for coal aboard a ship” was in use by at least 1839 (coal-burning steamships coming into general use in the 1820s).  The use to describe the obstacles on golf courses is documented from 1824 (probably from the extended sense “earthen seat” which dates from 1805) but perhaps surprisingly, the familiar sense from military use (dug-out fortification) seems not to have appeared before World War I (1914-1918) although the structures so described had for millennia existed.  “Bunkermate” was army slang for the individual with whom one shares a bunker while the now obsolete “bunkerman” (“bunkermen” the plural) referred to someone (often the man in charge) who worked at an industrial coal storage bunker.  Bunker & bunkerage are nouns, bunkering is a noun & verb, bunkered is a verb and bunkerish, bunkeresque, bunkerless & bunkerlike are adjectives; the noun plural is bunkers.

Just as ships called “coalers” were used to transport coal to and from shore-based “coal stations”, it was “oilers” which took oil to storage tanks or out to sea to refuel ships (a common naval procedure) and these STS (ship-to-ship) transfers were called “bunkering” as the black stuff was pumped, bunker-to-bunker.  That the coal used by steamships was stored on-board in compartments called “coal bunkers” led ultimately to another derived term: “bunker oil”.  When in the late nineteenth century ships began the transition from being fuelled by coal to burning oil, the receptacles of course became “oil bunkers” (among sailors nearly always clipped to “bunker”) and as refining processes evolved, the fuel specifically produced for oceangoing ships came to be called “bunker oil”.

Bunker oil is “dirty stuff”, a highly viscous, heavy fuel oil which is essentially the residue of crude oil refining; it’s that which remains after the more refined and volatile products (gasoline (petrol), kerosene, diesel etc) have been extracted.  Until late in the twentieth century, the orthodox view of economists was its use in big ships was a good thing because it was a product for which industry had little other use and, as essentially a by-product, it was relatively cheap.  It came in three flavours: (1) Bunker A: Light fuel oil (similar to a heavy diesel), (2) Bunker B: An oil of intermediate viscosity used in engines larger than marine diesels but smaller than those used in the big ships and (3) Bunker C: Heavy fuel oil used in container ships and such which use VLD (very large displacement), slow running engines with a huge reciprocating mass.  Because of its composition, Bunker C especially produced much pollution and although much of this happened at sea (unseen by most but with obvious implications), when ships reached harbor to dock, all the smoke and soot became obvious.  Over the years, the worst of the pollution from the burning of bunker oil greatly has been reduced (the work underway even before the Greta Thunberg (b 2003) era), sometimes by the simple expedient of spraying a mist of water through the smoke.

Floor-plans of the upper (Vorbunker) and lower (Führerbunker) levels of the structure now commonly referred to collectively as the Führerbunker.

History’s most infamous bunker remains the Berlin Führerbunker in which Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) spent much of the last few months of his life.  In the architectural sense there were a number of Führerbunkers built, one at each of the semi-permanent Führerhauptquartiere (Führer Headquarters) created for the German military campaigns and several others built where required but it’s the one in Berlin which is remembered as “the Führerbunker”.  Before 1944, when the air raids by the RAF (Royal Air Force) and USAAF (US Army Air Force) intensified, the term Führerbunker seems rarely to have been used other than by the architects and others involved in their construction and it wasn’t a designation like Führerhauptquartiere, which the military and other institutions of state shifted between locations (rather as “Air Force One” is attached not to a specific airframe but to whatever aircraft in which the US president is travelling).  In subsequent historical writing, the term Führerbunker tends often to be applied to the whole, two-level complex in Berlin and although it was only the lower layer which officially was designated as that, for most purposes the distinction is not significant.  In military documents, after January, 1945 the Führerbunker was referred to as Führerhauptquartiere.

Führerbunker tourist information board, Berlin, Germany.

Only an information board at the intersection of den Ministergärten and Gertrud-Kolmar-Straße, erected by the German Government in 2006 prior to that year's FIFA (Fédération Internationale de Football Association (International Federation of Association Football)) World Cup, now marks the place on Berlin's Wilhelmstrasse 77 where once the Führerbunker was located.  The Soviet occupation forces razed the new Reich Chancellery and demolished all the bunker's above-ground structures but the subsequent GDR (Deutsche Demokratische Republik (German Democratic Republic; the old East Germany) 1949-1990) abandoned attempts completely to destroy what lay beneath.  Until after the fall of the Berlin Wall (1961-1989) the site remained unused and neglected, “re-discovered” only during excavations by property developers, the government insisting on the destruction of whatever was uncovered and, sensitive still to the spectre of “Neo-Nazi shrines”, for years the bunker’s location was never divulged, even as unremarkable buildings (an unfortunate aspect of post-unification Berlin) began to appear on the site.  Most of what would have covered the Führerbunker’s footprint is now a supermarket car park.

The first part of the complex to be built was the Vorbunker (upper bunker or forward bunker), an underground facility of reinforced concrete intended only as a temporary air-raid shelter for Hitler and his entourage in the old Reich Chancellery.  Substantially completed during 1936-1937, it was until 1943 listed in documents as the Luftschutzbunker der Reichskanzlei (Reich Chancellery Air-Raid Shelter), the Vorbunker label applied only in 1944 when the lower level (the Führerbunker proper) was appended.  In mid January, 1945, Hitler moved into the Führerbunker and, as the military situation deteriorated, his appearances above ground became less frequent until by late March he rarely saw the sky.  Finally, on 30 April, he committed suicide.

Bunker Busters

Northrop Grumman publicity shot of the B-2 Spirit from below, showing the twin bomb-bay doors through which GBU-57s are released.

Awful as they are, there's an undeniable beauty in the engineering of some weapons and it's unfortunate humankind never collectively has resolved exclusively to devote such ingenuity to stuff other than us blowing up each other.  That’s not a new sentiment, being one philosophers and others have for millennia expressed in various ways although since the advent of nuclear weapons, concerns understandably become heightened.  Like every form of military technology ever deployed, once the “genie is out of the bottle” the problem is there to be managed and at the dawn of the atomic age, delivering a lecture in 1936, the British chemist and physicist Francis Aston (1877–1945) (who created the mass spectrograph, winning the 1922 Nobel Prize in Chemistry for his use of it to discover and identify the isotopes in many non-radioactive elements and for his enunciation of the whole number rule) observed:

There are those about us who say that such research should be stopped by law, alleging that man's destructive powers are already large enough.  So, no doubt, the more elderly and ape-like of our ancestors objected to the innovation of cooked food and pointed out the great dangers attending the use of the newly discovered agency, fire.  Personally, I think there is no doubt that sub-atomic energy is available all around us and that one day man will release and control its almost infinite power.  We cannot prevent him from doing so and can only hope that he will not use it exclusively in blowing up his next door neighbor.

The use in June 2025 by the USAF (US Air Force) of fourteen of its Boeing GBU-57 (Guided Bomb Unit-57) Massive Ordnance Penetrator (MOP) bombs against underground targets in Iran (twelve on the Fordow Uranium Enrichment Plant and two on the Natanz nuclear facility) meant “Bunker Buster” hit the headlines.  Carried by the Northrop B-2 Spirit heavy bomber (built between 1989-2000), the GBU-57 is a 14,000 kg (30,000 lb) bomb with a casing designed to withstand the stress of penetrating through layers of reinforced concrete or thick rock.  “Bunker buster” bombs have been around for a while, the ancestors of today’s devices first built for the German military early in World War II (1939-1945) and the principle remains unchanged to this day: up-scaled armor-piercing shells.  The initial purpose was to produce a weapon with a casing strong enough to withstand the forces imposed when impacting reinforced concrete structures, the idea simple in that what was needed was a delivery system which could “bust through” whatever protective layers surrounded a target, allowing the explosive charge to do damage where needed rather than wastefully being expended on an outer skin.  The German weapons proved effective but inevitably triggered an “arms race” in that as the war progressed, the concrete layers became thicker, walls over 2 metres (6.6 feet) and ceilings of 5 (16) being constructed by 1943.  Technological development continued and the idea extended to rocket propelled bombs optimized both for armor-piercing and aerodynamic efficiency, velocity a significant “mass multiplier” which made the weapons still more effective.
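The “mass multiplier” point is just the standard kinetic energy relation (a general illustration of the physics, not a claim about any particular weapon’s classified performance): the energy delivered on impact grows linearly with mass but with the square of velocity.

\[
E_k = \tfrac{1}{2}mv^{2}, \qquad \frac{E_k(2v)}{E_k(v)} = \frac{\tfrac{1}{2}m(2v)^{2}}{\tfrac{1}{2}mv^{2}} = 4
\]

Doubling a penetrator’s impact velocity thus delivers four times the energy, the same effect as quadrupling its mass at the original speed, which is why aerodynamic efficiency and a supersonic terminal velocity matter so much to these designs.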

USAF test-flight footage of a Northrop B-2 Spirit dropping two GBU-57 "Bunker Buster" bombs.

Concurrent with this, the British developed the first true “bunker busters”, building on the idea of the naval torpedo, one aspect of which was that, exploding a short distance from its target, it could still be highly damaging because it took advantage of one of the properties of water (quite strange stuff according to those who study it): it doesn’t compress.  What that meant was it was often the “shock wave” of the water rather than the blast itself which could breach a hull, the same principle exploited by the famous “bouncing bombs” of the RAF’s “Dambuster” raids (Operation Chastise, 17 May 1943) on German dams.  Because of the way water behaved, it wasn’t necessary to score the “direct hit” which had been the ideal in the early days of aerial warfare.

RAF Bomber Command archive photograph of Avro Lancaster (built between 1941-1946) in flight with Grand Slam mounted (left) and a comparison of the Tallboy & Grand Slam (right), illustrating how the latter was in most respects a scaled-up version of the former.  To carry the big Grand Slams, 32 “B1 Special” Lancasters were in 1945 built with up-rated Rolls-Royce Merlin V12 engines, the removal of the bomb doors (the Grand Slam carried externally, its dimensions exceeding internal capacity), deleted front and mid-upper gun turrets, no radar equipment and a strengthened undercarriage.  Such was the concern with weight (especially for take-off) that just about anything non-essential was removed from the B1 Specials, even three of the four fire axes and the crew door ladder.  In the US, Boeing went through a similar exercise to produce the run of “Silverplate” B-29 Superfortresses able to carry the first A-bombs used in August, 1945.

Best known of the British devices were the so-called “earthquake bombs”, the Tallboy (12,000 lb; 5.4 ton) & Grand Slam (22,000 lb, 10 ton) which, despite the impressive bulk, were classified by the War Office as “medium capacity”.  The terms “Medium Capacity” (MC) & “High Capacity” (HC) referenced not the gross weight or physical dimensions but the ratio of explosive filler to the total weight of the construction (ie how much was explosive compared to the casing and ancillary components).  Because both had thick casings to ensure penetration deep into hardened targets (bunkers and other structures encased in rock or reinforced concrete) before exploding, the internal dimensions accordingly were reduced compared with the ratio typical of contemporary ordnance.  An HC bomb (a typical “general-purpose” bomb) had a thinner casing and a much higher proportion of explosive (sometimes over 70% of total weight).  These were intended for area bombing (known also as “carpet bombing”) and caused wide blast damage whereas the Tallboy & Grand Slam were penetrative with casings optimized for aerodynamic efficiency, their supersonic travel working as a mass-multiplier.  The Tallboy’s 5,200 lb (2.3 ton) explosive load was some 43% of its gross weight while the Grand Slam’s 9,100 lb (4 ton) absorbed 41%; this may be compared with the “big” 4,000 lb (1.8 ton) HC “Blockbuster” which allocated 75% of the gross weight to its 3,000 lb (1.4 ton) charge.  Like many things in engineering (not just in military matters) the ratio represented a trade-off, the MC design prioritizing penetrative power and structural destruction over blast radius.  The novelty of the Tallboy & Grand Slam was that as earthquake bombs, their destructive potential was able to be unleashed not necessarily by achieving a direct hit on a target but by entering the ground nearby, the explosion (1) creating an underground cavity (a camouflet) and (2) transmitting a shock-wave through the target’s foundations, leading to the structure collapsing into the newly created lacuna.
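Those MC/HC percentages are straightforward to check; a minimal sketch of the arithmetic, using only the figures quoted above:

# Charge-to-weight ratios for the bombs discussed above (figures as quoted in the text)
bombs = {
    "Tallboy (MC)":     (5_200, 12_000),   # (explosive lb, gross lb)
    "Grand Slam (MC)":  (9_100, 22_000),
    "Blockbuster (HC)": (3_000, 4_000),
}

for name, (charge_lb, gross_lb) in bombs.items():
    ratio = charge_lb / gross_lb
    print(f"{name}: {ratio:.0%} of gross weight is explosive filler")

# Prints: Tallboy 43%, Grand Slam 41%, Blockbuster 75%, matching the MC/HC distinction.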

The word camouflet has an interesting history in both French and military mining.  Originally it meant “a whiff of smoke in the face” (from a fire or pipe) and in figurative use it was a reference to a snub or slight insult (something unpleasant delivered directly to someone); the origin is murky but it may have been related to the earlier French verb camoufler (to disguise; to mask) which evolved also into “camouflage”.  In the specialized military jargon of siege warfare and mining (sapping), between the seventeenth and nineteenth centuries “camouflet” came to mean “an underground explosion that does not break the surface, but collapses enemy tunnels or fortifications by creating a subterranean void or shockwave”.  The use of this tactic is best remembered from the Western Front in World War I, some of the huge craters now tourist attractions.

Under watchful eyes: Grand Ayatollah Ali Khamenei (b 1939; Supreme Leader, Islamic Republic of Iran since 1989) delivering a speech, sitting in front of the official portrait of the republic’s ever-unsmiling founder, Grand Ayatollah Ruhollah Khomeini (1900-1989; Supreme Leader, Islamic Republic of Iran, 1979-1989).  Ayatollah Khamenei seemed in 1989 an improbable choice as Supreme Leader because others were better credentialed but though cautious and uncharismatic, he has proved a great survivor in a troubled region.

Since aerial bombing began to be used as a strategic weapon, the debate over BDA (battle damage assessment) has been of great interest and the issue emerged almost as soon as the bunker buster attack on Iran was announced, focused on the extent to which the MOPs had damaged the targets, the deepest of which were concealed deep inside a mountain.  BDA is a constantly evolving science and while satellites have made analysis of surface damage highly refined, it’s more difficult to understand what has happened deep underground.  Indeed, it wasn’t until the USSBS (United States Strategic Bombing Survey) teams toured Germany and Japan in 1945-1946, conducting interviews, economic analysis and site surveys, that a useful (and substantially accurate) understanding emerged of the effectiveness of bombing.  What technological advances have since allowed, for those with the resources, is that the so-called “panacea targets” (ie critical infrastructure and the like, once dismissed by planners because the required precision was for many reasons rarely attainable) can now accurately be targeted, the USAF able to drop a bomb within a few feet of the aiming point.  As the phrase is used by the military, the Fordow Uranium Enrichment Plant is a classic “panacea target” but whether even a technically successful strike will achieve the desired political outcome remains to be seen.

Mr Trump, in a moment of exasperation, posted on Truth Social of Iran & Israel: “We basically have two countries that have been fighting so long and so hard that they don't know what the fuck they're doing.”  Actually, both know exactly WTF they're doing; it's just that Mr Trump (and many others) would prefer they didn't do it.

Donald Trump (b 1946; US president 2017-2021 and since 2025) claimed “total obliteration” of the targets while Grand Ayatollah Khamenei admitted only that there had been “some damage”; which is closer to the truth may one day be revealed.  Even modelling of the effects has probably been inconclusive because the deeper one goes underground, the greater the number of variables in the natural structure, and the nature of the internal built environment will also influence blast behaviour.  All experts seem to agree much damage will have been done but what can’t yet be determined is what has been suffered by the facilities which sit as deep as 80 m (260 feet) inside the mountain although, as the name implies, “bunker busters” are designed for buried targets and it’s not always necessary for the blast directly to reach the target.  Because the shock-wave can travel through earth & rock, the effect is something like that of an earthquake and if the structure is sufficiently affected, the area may be rendered geologically too unstable to be used again for its original purpose.
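For a first-order sense of why depth matters, blast engineers conventionally start with Hopkinson-Cranz cube-root scaling; the sketch below assumes a publicly reported (and here unverifiable) figure of roughly 2,400 kg for the MOP's explosive fill, so treat the numbers as purely illustrative:

```python
# Hopkinson-Cranz cube-root scaling, the standard first-order tool in blast
# engineering.  Scaled distance Z = R / W**(1/3); a smaller Z means a
# stronger shock at the target.  Real geology adds variables this ignores.

def scaled_distance(r_m: float, w_kg: float) -> float:
    """Hopkinson-Cranz scaled distance in m/kg^(1/3)."""
    return r_m / w_kg ** (1 / 3)

W_MOP = 2_400.0  # kg of explosive fill, publicly reported figure (assumption)

for depth in (10, 40, 80):  # metres of rock between burst point and target
    print(f"{depth:>3} m: Z = {scaled_distance(depth, W_MOP):.1f} m/kg^(1/3)")
# Z grows linearly with standoff: a facility at 80 m sees double the scaled
# distance (and so a much weakened shock) compared with one at 40 m,
# other things being equal.
```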

Within minutes of the bombing having been announced, legal academics were being interviewed (though not by Fox News) to explain why the attacks were unlawful under international law and, in a sign of the times, the White House didn't bother to discuss fine legal points like the distinction between “preventive” & “pre-emptive” strikes, preferring (like Fox News) to focus on the damage done.  However, whatever the murkiness surrounding the BDA, many analysts have concluded that even if before the attacks the Iranian authorities had not approved the creation of a nuclear weapon, this attack will have persuaded them one is essential for “regime survival”, thus the interest in both Tel Aviv and (despite denials) Washington DC in “regime change”.  The consensus seems to be Grand Ayatollah Khamenei had, prior to the strike, not ordered the creation of a nuclear weapon but that all energies were directed towards completing the preliminary steps, thus the enriching of uranium to ten times the level required for use in power generation; the ayatollah liked to keep his options open.  So the fear of some is that the attacks, even if they have (by weeks, months or years) delayed the Islamic Republic’s work on nuclear development, may prove counter-productive in convincing the ayatollah to concur with the reasoning of every state which since 1945 has adopted an independent nuclear deterrent (IND).  That reasoning is not complex and hasn’t changed since a prehistoric man first picked up a stout stick to wave as a pre-lingual message to potential adversaries, warning them there would be consequences for aggression.  Although the state is a theocracy, those who command power in the Islamic Republic are part of an opaque political institution and in the struggle which has for some time been conducted in anticipation of the death of the aged (and reportedly ailing) Supreme Leader, the matter of “an Iranian IND” is one of the central dynamics.  Many will be following what unfolds in Tehran and the observers will not be only in Tel Aviv and Washington DC because, in the region and beyond, few things focus the mind like the thought of ayatollahs with A-Bombs.

Of the word "bust"

The Great Bust: The Depression of the Thirties (1962) by Jack Lang (left), highly qualified content provider Busty Buffy (b 1996), who has never been accused of misleading advertising (centre), and The people's champion, Mr Lang, a bust of Jack Lang in painted cast plaster by an unknown artist, circa 1927, National Portrait Gallery, Canberra, Australia (right).  Remembered for a few things, Jack Lang (1876–1975; premier of the Australian state of New South Wales (NSW) 1925-1927 & 1930-1932) remains best known for having in 1932 been the first head of government in the British Empire to have been sacked by the Crown since William IV (1765–1837; King of the UK 1830-1837) in 1834 dismissed Lord Melbourne (1779–1848; prime minister of the UK 1834 & 1835-1841).

Those learning English must think it at least careless that things can both be (1) “razed to the ground” (totally to destroy something (typically a structure), usually by demolition or incineration) and (2) “raised to the sky” (physically lifted upwards).  The etymologies of “raze” and “raise” differ but they’re pronounced the same, so it’s fortunate the spellings vary; in other troublesome examples, however, unrelated meanings can align in both spelling and pronunciation, as in “bust”.  When used in the ways most directly related to human anatomy: (1) “a sculptural portrayal of a person's head and shoulders” & (2) “the circumference of a woman's chest around her breasts”, there is an etymological link but these uses are wholly unconnected with bust’s other senses.

Bust of Lindsay Lohan in white marble by Stable Diffusion.  Sculptures of just the neck and head came also to be called “busts”, the emphasis on the technique rather than the original definition.

Bust in the sense of “a sculpture of upper torso and head” dates from the 1690s and was from the sixteenth century French buste, from the Italian busto (upper body; torso), from the Latin bustum (funeral monument, tomb (although the original sense was “funeral pyre, place where corpses are burned”)) and it may have emerged (as a shortened form) from ambustum, neuter of ambustus (burned around), past participle of amburere (burn around, scorch), the construct being ambi- (around) + urere (to burn).  The alternative etymology traces a link to the Old Latin boro, the early form of the Classical Latin uro (to burn) and it’s thought the development in Italian was influenced by the Etruscan custom of keeping the ashes of the dead in an urn shaped like the person when alive.  Thus the use, common by the 1720s, of bust (a clipping from the French buste) meaning “a carving of the trunk of the human body from the chest up”.  From this came the meaning “dimension of the bosom; the measurement around a woman's body at the level of her breasts” and that evolved on the basis of a comparison with the sculptures, the base of which was described as the “bust-line”, the term still used in dress-making (and for other comparative purposes as one of the three “vital statistics” by which women are judged (bust, waist, hips), each circumference having an “ideal range”).  It’s not known when “bust” and “bust-line” came into oral use among dress-makers and related professions but both are documented since the 1880s.  Derived forms (sometimes hyphenated) include busty (tending to bustiness, thus Busty Buffy's choice of stage-name), overbust & underbust (technical terms in women's fashion referencing specific measurements) and bustier (a tight-fitting women's top which covers most or all of the bust).

Benito Mussolini (1883-1945; Duce (leader) & prime minister of Italy 1922-1943) standing beside his “portrait bust” (1926).

The bust was carved by Swiss sculptor Ernest Durig (1894–1962) who gained posthumous notoriety when his career as a forger was revealed with the publication of his drawings which he’d represented as being from the hand of the French sculptor Auguste Rodin (1840-1917) under whom he claimed to have studied.  Mussolini appears here in one of the subsequently much caricatured poses which were a part of his personality cult.  More than one of the Duce's counterparts in other nations was known to have made fun of some of the more outré poses and affectations, the outstretched chin, right hand braced against the hip and straddle-legged stance among the popular motifs. 

“Portrait bust” in marble (circa 1895) of Otto von Bismarck (1815-1898; chancellor of the German Empire (the "Second Reich") 1871-1890) by the German sculptor Reinhold Begas (1831-1911).

In sculpture, what had been known as the “portrait statue” came after the 1690s to be known as the “portrait bust” although both terms meant “sculpture of upper torso and head”; these proved a popular choice for military figures because the aspect enabled the inclusion of bling such as epaulettes, medals and other decorations and, being depictions of the human figure, busts came to be vested with special significance by the superstitious.  In early 1939, during construction of the new Reich Chancellery in Berlin, workmen dropped one of the busts of Otto von Bismarck by Reinhold Begas, breaking it at the neck.  For decades the bust had sat in the old Chancellery and the building’s project manager, Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945), knowing Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) believed the Reich Eagle toppling from the post office building right at the beginning of World War I had been a harbinger of doom for the nation, kept the accident secret, hurriedly commissioning the German sculptor Arno Breker (1900–1991) to carve an exact copy.  To give the fake the necessary patina, it was soaked for a time in strong, black tea, the porous quality of marble enabling the fluid to induce some accelerated aging.  Interestingly, in his (sometimes reliable) memoir (Erinnerungen (Memories or Reminiscences), published in English as Inside the Third Reich (1969)), even the technocratic Speer admitted of the accident: “I felt this as an evil omen”.

The other senses of bust (as a noun, verb & adjective) are diverse (and sometimes diametric opposites) and include: “to break or fail”; “to be caught doing something unlawful / illicit / disgusting etc”; “to debunk”; “dramatically or unexpectedly to succeed”; “to go broke”; “to break in (horses, girlfriends etc)”; “to assault”; the downward portion of an economic cycle (ie “boom & bust”); “the act of effecting an arrest” and “someone (especially in professional sport) who failed to perform to expectation”.  That’s quite a range and it has meant the creation of dozens of idiomatic forms, the best known of which include: “boom & bust”, “busted flush”, “dambuster”, “bunker buster”, “busted arse country”, “drug bust”, “cloud bust”, “belly-busting”, “bust one's ass (or butt)”, “bust a gut”, “bust a move”, “bust a nut”, “bust-down”, “bust loose”, “bust off”, “bust one's balls”, “bust-out”, “sod buster”, “bust the dust”, “myth-busting” and “trend-busting”.  In the sense of “breaking through”, bust was from the Middle English busten, a variant of bursten & bresten (to burst) and may be compared with the Low German basten & barsten (to burst).  Bust in the sense of “break”, “smash”, “fail”, “arrest” etc was a creation of mid-nineteenth century US English and is of uncertain inspiration but most etymologists seem to concur it was likely a modification of “burst” effected with a phonetic alteration, though it’s not impossible it came directly as an imperfect echoic of Germanic speech.  The apparent contradiction of bust meaning both “fail” and “dramatically succeed” happened because the former was an allusion to “being busted” (ie broken) while the latter used the notion of “busting through”.